Previously...
In part 1 I explored the concept of personal data as a currency. To the online advertising industry in particular, personal data has assumed more importance than mere audience; data capture has somehow become an essential component of online advertising.
Yet this isn't a fundamental law of the market. There is still value to advertisers in raw audience; after all, we still have billboards on the M4.
Online, the market is currently working in the interests of those paying for personal data or data-dependent products like advertising: the few handing over cold hard cash.
It's basic economics: the only groups making purchasing decisions are advertisers and those who buy personal data, so a competitive market will ultimately act in the interests of advertisers, driving up quality of product and driving down price.
But if today's trend continues, improving the quality of product will require gathering and sharing ever more sensitive data about us, the users of online services.
If the market is to work instead in the interests of the public, the public must start to make their own purchasing decisions. They must choose to use or avoid services based on how their personal data is used and protected.
In many respects personal data is a currency. It might not be the only currency: we can still choose to pay in pounds or dollars, and in many cases simple participation may be sufficient.
We can build a market which acts in the interests of the end user, but only if users regain control and start to make informed decisions about the currency they hand over.
Law and enforcement, conflict and confusion
Today the public are not in control. They are conflicted and confused.
This is described by some sociologists as cognitive polyphasia: we want privacy but we want the benefits of sharing. We choose to use privacy-invasive services because they bring rewards, despite privacy worries. And less invasive alternatives seem slow to emerge.
And they're confused by the mechanics of data sharing: what is being shared, with whom and for what purposes; how long data is to be retained; and who can see or access it.
So can and should the law play a role in protecting consumers?
Whilst I broadly support the aims of European and US legislators to create stronger rights for citizens to protect their personal data and online privacy, driving a solution solely through legislation at a time of rapid innovation is cause for concern.
Legislators seem focussed not on targeted regulation but on a broad package of reform.
But we can't foresee how networked technologies will evolve or how society might benefit, so regulation must be flexible or risk stifling innovation.
Yet flexible law is in many cases worse than no law. A lack of strict definition in a complex area introduces uncertainty, making legislation inaccessible to most people; we end up relying on lawyers to interpret what we can and can't do.
Uncertainty leaves businesses shouldering the increased risk that their business model may at some point be found illegal by a future court ruling.
And judge-led definition is a challenge to democratic accountability. Lawmakers and their electorate alike don't understand the precise meaning, never mind the implications, of a law passed today until some future court ruling years or decades down the line.
01001001
Enforcement is another concern. Even well-defined, targeted legislation of limited scope is useless if it's unenforceable.
The rules today aren't focussed on the 'person' in personal data but on the 'data', and data is everywhere.
Personal information is on nearly every single phone, nearly every single computer and the majority of storage devices.
All individuals and all businesses hold information about other people, and most of us carry much of this data around with us most of the time.
Data protection is complex, yet skilled police and other enforcement resources are in short supply. Complex prosecutions take time, cost a lot of money and, due to their intricacy, stand a higher risk of failing on a point of law.
Organisations today managing even a moderate amount of personal data struggle to ensure strict compliance. This isn't just my view; senior partners at major law firms echo it.
And unworkable, largely unenforceable laws risk consumer harm. The public will proceed in the false belief that their data is safe, protected by laws which in reality are unenforceable and largely ignored: laws in name only.
01000100
At this point readers could be forgiven for sensing defeat.
The market today is not working in the interests of end users, there are limits to what the law can possibly achieve in a complex and rapidly evolving area, and a conflicted public is charging ahead with apparently little regard for the dangers many foresee down the line.
Perhaps you've come to one of two conclusions: privacy is dead and we just need to get on with enjoying the benefits of digital innovation; or the situation is out of control and strict laws to stop the personal data land grab are needed now, despite the risks to innovation.
Personally I feel the current situation can't and won't go unchecked; personal data is already being exploited.
Instead of minimising personal data capture, most companies proceed under the widespread belief that data is valuable, either now or in the future. This fuels a personal data land grab: capture everything, indefinitely, just in case it might have some future value.
By accident or design, companies who want to gather, store and profit from personal data benefit from a conflicted and confused public, as it allows their land grab to continue unchecked.
But the solution doesn't lie mainly in law or with national governments, because we don't yet know where the lines should be drawn.
01001001
Rather than frame this as an all-or-nothing question (will privacy survive?), we must instead ask what online privacy means to society and give technology time to mature, regulating if necessary once the dangers and benefits emerge.
Indeed, given time, technology may evolve to solve some or all of today's data protection problems, in much the same way as the invention of the speedometer in 1888 allowed speed limits above walking pace to be imposed on road vehicles.
The situation isn't sustainable, as it risks undermining trust in technology. But the solution lies in the hands of the technology industry, and the motivation is to bring sustainability to an industry at serious risk of a damaging backlash.
At some point the public will accept no more. We cherish individuality and autonomy, and if we see a serious threat to either, rebellion is inevitable.
People will start to reject technology they don't trust, and the erosion of trust can have consequences similar to over-regulation: it could stifle innovation.
The challenge today is to minimise a backlash, and integral to this is preventing a bubble. Instead of charging ahead with a personal data land grab, we must start to act with caution to find the boundaries of acceptability.
01000100
And we all must take responsibility for our own privacy. The buck stops with us. The next time a service asks for your email address or your web browser prompts you to "sign in", ask yourself: why? What's in it for me? Why do you want me to do this?
I'm not advocating privacy zealotry. I'm not saying stop using a service you like because it gathers data about you. I'm asking you to consider the times you hand data over and receive very little in return, or, if you're worried about privacy, to consider less invasive alternatives.
Whilst I assert that there are practical limits to what the law can achieve, it can play a role, in much the same way as shops are bound to provide minimum standards of consumer protection to help the market function.
In part 3 I look at specific measures which will improve clarity and transparency for end users, helping them make informed decisions and so create a market which works in their interests.
And in part 4 I will look at some of the other barriers to competition in the online service market, such as the crowd mentality in social networks: users want to be where their friends are, another factor fuelling our cognitive polyphasia.