Tuesday 27 November 2012

Children and privacy rights within the family: are we normalising a surveillance culture?

A question which seems to come up with increasing frequency is whether children have privacy rights from their parents or carers (including schools, etc).

A really good starting point on this is a 2011 paper by Shmueli and Blecher-Prigat published in the Columbia Human Rights Law Review.

Shmueli and Blecher-Prigat highlight the lack of privacy rights enshrined in US law (and, as far as I can ascertain, within the EU) for children in a family setting.

That is, children have little or no legal right to privacy from their parents.

But they also discuss why this is. Parents have a duty of care:
"To be sure, the primary role and responsibility of parents is to protect their children"
The question of childhood privacy is not new; but online dangers - many real, many perhaps over-hyped - most definitely are.

Their paper discusses a shift in attitudes towards child privacy brought on by the perceived threat from dangers online:
"parents have always been able to invade their children’s privacy by going through their schoolbags, reading their personal diaries and the like"
Being able to "trust" a child in your care, where trust is warranted, was traditionally seen as a sign of good parenting.  I, and I assume most people, were raised with the notion that it was definitely not right to read someone else's diary, even your child's.
"Whereas “good parents” may have traditionally been encouraged to trust their children, today they are encouraged to safeguard their children, where this incontrovertibly involves invading their privacy in one way or another. Monitoring has become associated with good parenting, and the surveillance of children has been framed in a language of safety, protection, and care."
There are many examples of technology being used to monitor children: to check social media use, track their location, trawl their internet history, and so on.

I suspect a large group of parents believe this shift is fair and necessary, whilst another significant group remains worried by what I would describe as a serious invasion of virtual personal space, at a time when children need private spaces to learn and to develop their thinking and behaviour.

Shmueli and Blecher-Prigat don't dismiss the safeguarding element out of hand, recognising a child's need for guidance from their parents who "act in their children’s best interests" through “natural bonds of affection”.

But they talk about a need for balance; and, notably, a need to recognise that a child's safety is not always at stake:
"Thus, when children’s physical or emotional safety is at stake, whatever interest in privacy they may have is outweighed by society’s interest in their protection. However, not all situations involve a child’s safety, and even when the goal is to prevent harm to the child, the harm of infringing upon the child’s privacy should also be taken into account."

This balance will evolve as a child matures.  I scrawled the diagram below as a rough attempt to highlight how a parent's duty of care in the early days when a baby is entirely dependent on its parents evolves into a duty to respect a growing child's autonomous rights as they approach adulthood:


From my own experience as a parent it's notable how early a child displays independence and free will, making demands to be left to perform tasks such as feeding or dressing themselves.

Yet, as noted in the paper:
"There is a widespread consensus that children show less concern than adults about privacy. However, very few empirical studies have demonstrated this."
Is there a conflict between physical demonstrations of free will from an early age and a child's natural demands for privacy from its parents? Is the above conclusion drawn from insufficient or flawed data? Or do we simply not understand enough about privacy itself to draw conclusions about a child's needs?
"It seems more accurate to argue that privacy is important to children, though their conceptions of privacy and the private differ from those of adults."
One question I'm left with hinges on whether some forms of parental monitoring cause distress or other psychological harm to the child.

E.g. a child could understandably become distressed if they discover a parent has read their private diary, hurt by the breach of trust.

Whereas a child has legal protection from most forms of physical harm, bar "reasonable chastisement", even from a parent, they have little if any legal protection from the emotional harm caused by unwarranted and persistent invasions of privacy, since both EU and US legal thinking groups privacy rights into "family units" of a sort.

In the way that the abused can become abusers themselves, will such persistent breaches of trust create a generation normalised to pervasive surveillance? A generation who won't think twice about imposing a similar regime on others, including their own children, and so on?

And what effect will this have? Will our children be afraid to experiment, through fear of being watched, as they otherwise would in "normal", unwatched environments?

Would such inhibited behaviour limit itself to online contexts, or will it shape all aspects of child development, e.g. creative experimentation?

I for one absolutely did not want my parents to read any of my creative writing essays set by my school.  In later life I understand this as wanting to be free to explore ideas and behaviour that my parents didn't particularly subscribe to.  That surely is a healthy part of a child's development.

There is a possibility that none of the above will happen.  That children will adapt to find a level of privacy they are comfortable with that their parents simply cannot invade, e.g. through the use of technology such as strong encryption and other secure storage.

In fact a child only needs to be one step ahead of their parents' technological capability to evade surveillance, which raises another, different question: how is a parent expected to fulfil their duty of care when a child knows far more about technology than its parents?

@JamesFirth

Wednesday 18 April 2012

Update 18-April

Whilst I'm still struggling to find a way to keep Open Digital staffed I'm making some progress on the personal front.

I need a job since there's now very little prospect of a salary from Open Digital in the near future and I've invested all my spare cash getting it going.

Lots of people have been in touch with potential job offers - please keep them coming. If Open Digital is to survive at all, it's a matter of finding a tolerant employer happy for me to continue my public advocacy in my spare time.

I've also now started writing a weekly blog at Computerworld UK.

After 2 years SRoC was well read - and, so I discovered, by some influential people. I hope moving my writing there will help get issues raised with a different audience, and might help the rescue package I'm trying to cobble together for Open Digital.  Hopefully more on this later.

That's about all for now.  You can read my first post over at Computerworld:

Why the end of Consumer Focus is a blow to UK tech policy

When a government in the midst of a cuts agenda releases a report entitled Empowering and Protecting Consumers, you just know it will not be good news for under-empowered and unprotected consumers in the cut-throat utilities sector. 
I imagine the report might have started life as a memo: ‘make a case for getting rid of these meddling windbags’.
What many people, even those in the tech sector, perhaps don’t appreciate is the role Consumer Focus plays in batting for sense in UK tech policy. 
... 
Consumer Focus is not Big Government. It’s a safeguard against unavoidable monopolies in the utilities sector, including Internet Service Providers. It fights largely from behind the scenes, pushing back against many of the self-interested demands of tech lobbyists.

>> Read the full post >>



@JamesFirth

Tuesday 10 April 2012

Thanks and goodbye - the end of a relatively short road

Thanks for all who've helped and supported us over the last year with Open Digital and the last 2+ years for me as an independent policy advocate.

Sadly we can't fund year 2 so this project is on hold. Indefinitely.

Whilst it's never over till the hard disks are wiped and the organisation is liquidated, the financial reality is that I can't put any more time into this or SRoC until I can make it pay.

I have mouths to feed and my public CV is here if anyone wants to hire me:
http://www.opendigital.org/consultants/OpenDigital_JamesFirthCV.html

Sorry it's been so brief. Whilst I'm sure there are many out there funding work towards ethical data policy I have not managed to get any of it to flow in our direction.

We all have to eat; and, sadly, as I have written many times over, policy follows the money.

Bye - for now at least,

@JamesFirth

Monday 26 March 2012

Help fund us: call for sponsors for our work on standardised personal data licenses

We've had tremendous feedback from our proposal for a set of standardised personal data licenses to provide user clarity on privacy, as well as some constructive criticism that I genuinely welcome as part of the wider debate (I'll blog all of this later - UPDATE, see here).

We want to go ahead and develop the concept, but it will cost more than Julian or I can self-fund. We need to commission two studies to be sure of the user psychology and commercial demand before we even start with the legal side in drafting the terms for each of the licenses and finalising the iconography.

We hope to raise money from a mixture of crowdfunding, commercial sponsorship and selling shares in Open Digital Policy Organisation Limited.



We can't do anything without significant financial support

To put it bluntly, Open Digital is at a crossroads.

If we can't fund this project, or find another ethical way of securing funding which doesn't tie us to vested corporate interests, we can't continue as an independent voice promoting the open internet as a platform to benefit both ethical data businesses and the public alike.

It really is that simple. This is the model I tried to build.  I grew up with technology, I spent my 15-year career building connected technology, and the last 3 years advocating the benefits of open connected platforms.

But the risks can't be overlooked and technologists can't carry on blind to the growing public worries just as legislators can't carry on regulating the internet as if it's just an extension of TV or radio broadcasting.

We must work to further our understanding of how technology can be used and abused in order to make better well-informed policy decisions, and we are at the centre of that debate in the UK.

Since forming Open Digital just 10 months ago we have been involved at so many levels. Our first report (pdf) got a mention on page 2 of the FT, I have met with government ministers, been interviewed on prime time telly and spoken at meetings of senior telecom execs, legal and human rights conferences, social media policy workshops and in parliament.

We have responded to government consultations and are involved with every aspect of digital policy, from cyber security to child protection, copyright and privacy.

Yet we will have to park all our work within 7 weeks if we can't find a way forward on funding.  We won't exist in this form by the end of May unless we can crack the funding challenge. I have ploughed all my time and most of my own money into my own full-time research in this area for 3 years. (So if you can't help fund us, do get in touch if you have a job going!)

We can't continue without significant financial support. It's that simple.  I'm particularly calling for shareholders and/or commercial sponsors who agree with our charter to support our work.



Project plan for the standardised personal data scheme

First we will commission 2 studies to guide the rest of the project:
  • Whether a project like this can change human behaviour and, if so, which elements are important to achieve maximum impact
  • Business attitudes to privacy, and business requirements from standardised licenses
We need to produce licenses that businesses will adopt, that users will understand, and that will be effective in driving down unnecessary data capture.

We will then use the outcomes of the two studies to draft the licenses and finalise the iconography. This will include quite a lot of expensive legal advice - however, we would be willing to partner with a suitable chambers or institution should any step forward.



Share sale

Should our scheme be successful, there is opportunity for sustainable commercial returns through growing our consultancy business helping companies who want to adopt our standard.

Potential investors should however note that, as an organisation dedicated to ethical business, we want to avoid accusations of profiteering from a protected monopoly.

We therefore pledge to hold the intellectual property from the standardised privacy policies project in trust, so that any suitably qualified global standards body can take ownership should this become too big for any company, however ethically minded that company sets out to be.

That said, there is plenty of scope to make fair returns.

The largest risk is of course non-adoption, but we hope like-minded investors will see the importance of this and of our wider work at Open Digital, and will see the worthy contribution to the wider debate as helping balance the risk-versus-reward equation for investment.


Report sponsors


There is opportunity for 2-4 commercial sponsors to be involved with each of the two feasibility studies.


Any other ideas?


Seriously, just tell us: contact@opendigital.org

Tuesday 20 March 2012

Trust bubbles: how security, trust and economic prosperity are interlinked

Here's a fascinating area of research I heard about from Virgil Gligor at the CSIT cyber security summit in Belfast last week.  Societies with a high degree of trust have better economic growth; there is a link between trust and economic prosperity.

A 2007 study by Dincer and Uslaner uses US economic data to demonstrate a positive link between 'generalized trust' - trust in strangers - and economic growth: a 10 percentage point increase in trust yields roughly a 0.5 percentage point increase in GDP growth.
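Taken at face value, the headline figure implies a simple linear relationship. The sketch below is purely illustrative - the study itself estimates the effect econometrically, with controls, not as a straight line:

```python
def implied_growth_effect(trust_increase_points: float) -> float:
    """Linear reading of the figure quoted above: a 10 percentage point
    rise in generalised trust corresponds to ~0.5 points of extra growth."""
    return 0.5 * (trust_increase_points / 10.0)

# e.g. a 4-point rise in measured trust would imply ~0.2 points of extra growth
```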

This has some important consequences, especially given the current push to improve cyber security.

I can't stress enough how important it is to understand that 'generalized trust' is about trust in strangers, not trust bonds within community groups, social networks etc.

Typically, online security research tends to focus on the latter. How can Alice establish that Bob is a friend, and can she trust each communication she receives is actually from Bob and not some interloper?

There's a temptation to develop technology to facilitate trust in two ways. Firstly to provide certainty in point-to-point connections: Alice knows she's talking to Bob, and vice versa.

And secondly to build systems which measure and display the trustworthiness of strangers to other strangers. For example the buyer feedback mechanism on eBay and other customer review systems.

But neither of these help with generalised trust, and generalised trust is important to economic growth.

Whilst a customer review mechanism may seem to help foster trust between strangers, it is still only a specialised application to help established sellers prove their trustworthiness.  It's not a system that builds on generalised trust but a system which predicts specific trust.

A customer review mechanism can also actively exclude innovative new sellers simply because they don't yet have a formal reputation.

With zero generalised trust, buyers treat any deal with a stranger as guaranteed to fail.  Customer review mechanisms can help by allowing established sellers to advertise their trustworthiness.

But in such a society only established players can ever compete. New entrants are effectively barred by virtue of having no trustworthiness and no way to establish it, and the market risks becoming lazy or cartel-like.

There will be reduced drivers for improvement between existing sellers.

Now, introduce even a moderate degree of generalised trust. Not all deals with strangers are guaranteed to fail, and some buyers will be prepared to take a risk to get a bargain.

There is now a mechanism to establish trustworthiness from scratch and innovative new sellers can enter the market and disrupt lazy incumbents.
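The dynamic above can be sketched as a toy simulation. All parameters here are hypothetical: one established seller with an existing reputation, one new entrant with none, and buyers who take a chance on the unproven seller with probability equal to the level of generalised trust:

```python
import random

def simulate_market(generalised_trust, rounds=1000, seed=0):
    """Toy model of the market described above. Each round, a buyer either
    takes a chance on the unreviewed new entrant (with probability
    `generalised_trust`) or buys from whichever seller has the best
    reputation. Every sale earns the seller one review.
    Returns the new entrant's final review count."""
    rng = random.Random(seed)
    reviews = {"established": 50, "new_entrant": 0}
    for _ in range(rounds):
        if reviews["new_entrant"] < reviews["established"] and rng.random() < generalised_trust:
            choice = "new_entrant"                    # risk-taking buyer tries the stranger
        else:
            choice = max(reviews, key=reviews.get)    # risk-averse buyer picks best reputation
        reviews[choice] += 1
    return reviews["new_entrant"]
```

With `generalised_trust` set to zero the new entrant never makes a sale and so can never build a reputation; with even a modest level of trust it accumulates reviews and gains a foothold in the market.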

The link between generalised trust and economic prosperity poses an interesting conundrum for cyber securocrats and digital policy-makers alike.

It seems obvious. If I knew the email was from my bank - if I could trust the communications path - I would know it wasn't a phishing attack.

But establish an intricate system to broker trust in a digital community and we risk building the online equivalent of a closed community group fearful of strangers.

A group reliant on formalised trust mechanisms may become afraid to innovate or take risks outside of the trust bubble.

Or indeed afraid to take risks which buck the formal guidelines inside the bubble, creating trust outcasts stuck in the bubble who will never again be trusted.

Rehabilitation is impossible inside the trust bubble.  Social mobility is also likely to be hampered.

There is also the possibility of systemic flaws being exploited within the trust bubble.  No system is perfect, and there's a risk those who learn to game the trust mechanism will prosper over other more-trusted competitors.

I suggest that a community which becomes highly reliant on formal trust mechanisms will be less prosperous than other more 'chaotic' communities.

There is an alternative. Technology has made certain cyber crimes and other exploits possible, but technology can also evolve to restore trust and reduce risk without relying on entirely new schemes to broker trust.

E.g. we all already have a built-in trust sensor honed over generations of evolution.  With faster broadband and camera-enabled devices it will soon be possible to have face-to-face conversations with digital retailers, e.g. for higher-value transactions with previously unknown sellers.

Once spoken and facial cues are reintroduced back into the system it is possible to rely once again on human instinct to play a role in deciding who to trust.

Risks of exploit will fall back to real-life levels and the internet won't be seen in itself as a threat, just as a communications channel.

On one hand this won't help the bottom line of retailers wanting to automate to save money, but economic prosperity on the whole will benefit through the link between generalised trust and economic growth.

With face-to-face contact once again the norm, scams like phishing won't scale and we will be able to have a degree of trust in online strangers.

Turing's confidence trickster? A question hangs over whether or not computers could ever fool humans, not just into thinking they're interacting with another human (i.e. pass the Turing test) but also to elicit a high degree of generalised trust in human subjects.

If so, the balance would swing back in favour of the phishers as online con-tricks would once again scale. One person could use technology to attempt to trick tens of millions of 'marks' at a time - until someone invents software for detecting non-humans!




Monday 19 March 2012

Belfast 2012: privacy initiatives as an enabler for cyber security

I was lucky to get an invite to the CSIT 2012 cyber security summit.

Credit to the CSIT team in Belfast for creating a unique atmosphere which got the balance absolutely right - not too academic for industry to feel excluded, government participation without the grandstanding or justification sometimes seen, and plenty of opportunity for detailed open discussion amongst global delegates from industry, academia and government.

I attended for three reasons. Firstly, I spent the first 8 years of my software career designing secure communications (a patent from back when I worked at Motorola).

Secondly, I want to advocate against locking-down aspects of the internet in the name of security because I passionately believe open societies are stronger societies, and this applies equally to cyberspace.

And finally I wanted to discuss our work on privacy as an enabler to cyber security.

When each of us acts with a high degree of autonomy, taking responsibility for our own personal data and protecting our own systems out of our inherent desire for privacy, I believe our networks as a whole will be more secure than if we defer to governments as our primary cyber defender.

This was a recurring theme throughout discussions.

There is what was referred to as an asymmetry in human resources faced by those tasked with securing our networks: relatively small numbers of professionals with varying skill levels facing off against legions of online activists and cyber criminals with an impressive skill set and access to most of the same tools (or 'weapons') as governments.

It went unsaid, but one can only assume some nation states also have their own legions.

In a democracy we should be able to rely on citizens to augment the work of government in securing our networks, but for this to happen we need both mutual trust between citizens and government security agencies, and for citizens to feel motivated and able to help.

Specifically on privacy there are additional reasons why privacy can be an enabler to cyber security.

All sections of society are vulnerable to data loss through their ordinary everyday use of the internet.  Lack of data privacy can become a national security risk when personal data of those in sensitive positions becomes accessible to those with hostile intent, who may use private information to extort or to blackmail.

Whilst traditionally privacy has been seen in some quarters as at odds with security and policing requirements - e.g. the use of network monitoring tools to spot threats and investigate crime - there's also an argument against this.

In some cases a focus on intrusive policing and intelligence-gathering techniques can come at the expense of developing more sustainable community-based policing models for cyberspace.

Information technology is still developing apace; therefore capability-based policing - a reliance on powers available only to police and security forces, such as network monitoring, data seizure etc - will for a while at least remain a costly arms race.

Soon after each capability is installed in our networks either technology evolves, requiring further upgrades, or the bad guys up their game leaving the capability obsolete, or both.

A worse scenario exists: the capability might be exploited by the bad guys.

Take the Data Retention Directive: this EU law (which, as far as I can establish, the UK lobbied for) compels ISPs to store information which might be useful to law enforcement for a period of 6 months to 2 years.

All this potentially exploitable data sits around in huge silos at ISPs - a capability that is clearly vulnerable to exploit, especially given the security of such data is in the hands of private companies.

Employees have in the past looked up private data from company databases and sold this information to private detectives working for divorce lawyers, journalists and criminals.

It's not a straight choice between privacy and security. It's a balance between privacy as an enabler to creating a more secure information culture and privacy-invasive policing as a tool to detect and prevent cyber crime.

UPDATE: also, one must not overlook the importance of public trust and the benefits of consensual policing. Invasive monitoring can cause mistrust between the public and security services.

Public trust and confidence in the work of the security forces is a bigger asset, even, than the ability to monitor all electronic communications. A consensual approach makes society stronger and inherently stable.

Tuesday 13 March 2012

Identonomics part 3: clarity in privacy and the need for a set of standardised personal data licenses

Exploring the economics of online identity

Skip to: [part 1] [part 2] [part 3] [part 4]

In part 1 of this series I propose that the open market in personal data will work ultimately in the interests of advertisers and those buying personal information - not the general public whose data is being used - unless members of the public exert pressure by shunning intrusive data services and selecting private alternatives.

In part 2 I look at some of the reasons regulation in this area may fail to protect consumers whilst risking a negative impact on innovation.

If personal data is the new currency, as suggested by European Commissioner Reding, the consumer market is a classic confusopoly: a competitive market that is unable to function in the interests of consumers due to confused pricing.  ‘Buyers’ are unable to choose the ‘cheapest’ alternative and, as a direct consequence, ‘profiteering’ goes unchecked.

With enforcement hampered by the ubiquity and massively distributed nature of data, not to mention the trans-jurisdictional element, many companies are holding on to more and more information about us.

So I'm saying the market if left unchecked will fail to protect consumers, and that may lead to a collapse in trust which will impact innovation. As will regulation. We're doomed, right?

Restoring control to the user

Not quite.  One of the identified barriers to consumer choice is confusion, and our proposal for a standardised set of personal data licenses will help to break the confusopoly and temper the personal data land-grab.

Some data corporations today may not see it this way, but it really is in everyone’s interests to restore control to the user, and I'm pleasantly surprised by a number of global tech companies who do understand this.

Wednesday 7 March 2012

Arbitrary and Orwellian - Amazon Payment Services blacklists Open Digital with no reason given

As we continue to explore ways to fund our policy research I decided to create an Amazon Payments Account.  Among the options we're considering is publishing a book on the privacy issues discussed on this blog and my own blog, sroc.eu.

I was shocked to find this morning, having provided the necessary bank account details, credit card security and telephone/email verification, that we had been arbitrarily refused an account. The reason given:
"It has come to our attention that your desired use of Amazon Payments may be in violation of our Acceptable Use Policy or User Agreement"
What shocks me is that I provided very scant information as to what we planned to do with our Amazon Payments account.

This snap decision highlights the powerful position that a few dominant technology businesses hold over small enterprises, advocacy and research groups alike.

I have asked Amazon for clarification.

Update 13-March:


Amazon have responded to my request for clarification. It seems the refusal was due to their terms and conditions, which prohibit the sale of non-physical goods.  Since our website wasn't selling any physical goods, we were refused an account.

This explanation makes a lot of sense, though I wonder whether it might not be in everyone's interests for Amazon to state clearly in their letter of refusal the reasons for it.

Without an explanation the decision looks, as per the title of this blog post, arbitrary and Orwellian.

James Firth

Tuesday 6 March 2012

Just over 2 weeks to ORGCon 2012

Our friends at the Open Rights Group are holding their 2nd conference on 24th March 2012.

You'll get a whole day of discussion, lectures and seminars on a range of topics related to internet policy and digital rights.

Click here to book tickets!

A few tickets are still available at only £12.50 for Open Rights Group members, £26 for non-members.  If you want to meet up with me on the day, follow @Open_Digital or @JamesFirth on twitter.

Friday 2 March 2012

Crowdsource/request for feedback, review: Hargreaves copyright reform consultation "preserving the public domain"

OK a blog isn't the best tool for this but here goes.

Open Digital will be submitting a response to the UK Government consultation on copyright reform.  I have a lot of views on this subject so have spent the last few months listening to what other groups are fighting for so I can narrow my own submission to the areas not so well covered by other groups.

For various reasons I've chosen to focus on preserving the public domain (works for which the copyright has expired) and to ask for the law to be clarified so that I cannot claim a new copyright when I take a picture of a public domain artwork or digitise a public domain sound recording.

This will bring the law in line with US law as regards photographs of 2-D art works, and I see a strong parallel between a photo of a 2-D artwork and a format-shift of a sound recording.

I would dearly welcome comments, corrections, suggestions and any other feedback on the following. Please use the comments section below, email review@opendigital.org or ping us on twitter @Open_Digital

Thanks,
James

DRAFT SUBMISSION TO HARGREAVES

Update 16:19 clarified New York (ht @Copyrightgirl) and added paragraph about Wikimedia (private comment via email).

Open Digital calls on the Government to clarify the copyright situation as to faithful reproductions of existing works and in particular protect the public domain by removing any copyright protection for faithful photographic (including digitally scanned) reproductions of 2-dimensional art works and faithful copies of musical works that are in the public domain.

In parallel to the debate around format shifting, online businesses and individuals who make use of digital self-publishing tools are finding online uses for works already in the public domain.

But they face a threat from copyright claims made by the individuals or organisations responsible for photographing or scanning the original works.

UK case law in this area is not clear, so whether or not the Government and Parliament agrees with my suggestion set out below, there is opportunity to clarify the law in this area to provide clear guidance to the public on their rights.

In other jurisdictions, such as the state of New York in the United States, case law is clear (Bridgeman v Corel Corp, 1999) and does not grant copyright protection to photographs of 2-dimensional public domain art works; I argue UK law should mirror this.

The Wikimedia Foundation has unilaterally declared a global policy (ref) mirroring the legal situation in New York, potentially opening itself up to legal challenges in other jurisdictions.

I argue a strong analogy exists between the digitising (format shifting) or re-recording of a sound recording and the faithful photographic reproduction of an artwork.

Allowing a copyright to exist in a faithful reproduction of a public domain work such as a painting, drawing or sound recording can have the effect of substantially delaying the effective release of certain published works into the public domain.  This appears contrary to public domain provisions in copyright law.

Whilst there is one seemingly powerful argument in favour of providing protection for reproductions - that copyright acts as an incentive for those wishing to invest in digitising our cultural heritage - this must be balanced against the potential for organisations to abuse their monopoly of ownership, resulting in fewer works appearing online for public consumption, with works instead locked behind paywalls or protected in some other way.

Whilst there is no research I'm aware of in this area, anecdotal evidence shows volunteer documenting and archiving efforts by groups such as the Wikimedia Foundation and Project Gutenberg receive incredible public support.  Individuals seem motivated to work for free to reproduce the public domain in digital format, which calls into question whether a financial incentive in the form of a market intervention (copyright) is necessary.

Beyond my main argument - that the public domain should be accessible to the public without restriction - granting copyright protection to reproductions undermines a fundamental principle of copyright.

Copyright is intended to support the creative industries by providing protection for creators and those who invest in creation. Creation must include some element of originality, and faithful reproduction by definition minimises originality.

Copyright is not intended to reward skill; it is intended to reward creativity. Even if it were, modern photographic reproduction techniques involve little if any skill, and in some cases the reprographic process can be completely automated.

Allowing a copyright to exist in a scan of a drawing provides a reward for merely pressing a button.

Therefore if incentives are needed to encourage digital archiving activities, copyright is not the right method to provide these incentives.

Allowing copyright to exist in a faithful reproduction also causes public confusion.  It can be difficult even for a skilled person to differentiate between a copy of a copy and a separate copy of the original. 

The same is true for automated digital identification and copyright enforcement techniques: near-identical copies are likely to cause false-positive matches in online automated content identification and protection systems.

Such systems are used by online publishers to detect suspected copyright infringement and are unlikely to make a distinction between a separate copy of the original and a copy of a copy. 

This will cause particular problems for users of services such as YouTube, a service which provides a semi-automated mechanism to alert content owners that a video with a similar soundtrack has been uploaded.

The semi-automated nature of content protection procedures, coupled with the high volume of notifications handled today by large rights holders, creates a real risk that an "authorised" public domain copy could be improperly blocked by the copyright owner of another, almost indistinguishable copy.
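Such false positives are easy to see in miniature. Below is a toy sketch (hypothetical code, not any real content identification system) of a perceptual fingerprint: two independent, faithful scans of the same public domain painting produce identical fingerprints, so a matching system cannot tell a copy of a copy from a separate copy of the original.

```python
# Toy perceptual fingerprint - an illustration only, not any real
# content ID system.

def average_hash(pixels):
    """Return a bit string: 1 where a pixel is above the mean brightness."""
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

def hamming(a, b):
    """Number of differing bits between two equal-length fingerprints."""
    return sum(x != y for x, y in zip(a, b))

# Two independent, faithful "scans" of the same public domain painting;
# they differ only in minor sensor noise (a few brightness levels).
scan_by_museum = [10, 200, 30, 180, 90, 220, 15, 170]
scan_by_volunteer = [12, 198, 31, 182, 88, 221, 13, 172]

fp1 = average_hash(scan_by_museum)
fp2 = average_hash(scan_by_volunteer)

# The fingerprints match exactly, so an automated system cannot tell
# whether the volunteer copied the museum's scan or the original painting.
print(hamming(fp1, fp2))  # 0
```

The point of the sketch: the better (more faithful) each reproduction, the more certain the fingerprint collision, so any system relying on matching alone will conflate the two copies.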

In summary, I believe allowing a reproduction of a public domain work to attract its own separate copyright results in nothing short of a farce: multiple almost indistinguishable copies coexist, with multiple owners, each with a differing protection term.

One legislative option which would minimise the impact on existing businesses reliant on layered copyrights in mechanical reproductions, as well as other unforeseen and unintended consequences, is to design legislation so that it affects only public domain works.

For example, the copyright protection term of any faithful reproduction could be limited to the unexpired term in the original, thereby allowing mechanical recording rights to coexist until the original performance enters the public domain.

My primary interest is to ensure public domain works are widely available to access and re-use, as is the intention of limited-term copyright law. 


ENDS

Tuesday 28 February 2012

Identonomics part 2: Law and enforcement, conflict and confusion

Exploring the economics of online identity


Skip to: [part 1] [part 2] [part 3] [part 4]


Previously...

In part 1 I explored the concept of personal data as a currency. Personal data has assumed more importance to the online advertising industry than mere audience; data capture has somehow become an essential component of online advertising.

Yet this isn't a fundamental law of the market. There is still value to advertisers in raw audience; after all we still have billboards on the M4.

Online, the market is currently working in the interests of those paying for personal data or data dependent products like advertising; the few handing over cold hard cash.

It's basic economics: the only group making purchasing decisions are advertisers and those who buy personal data, therefore a competitive market will act ultimately in the interests of advertisers, driving up quality of product and driving down price.

But if today's trend continues, quality of product will require gathering and sharing ever more sensitive data about us, the users of online services.

If the market is to work instead in the interests of the public, the public must start to make their own purchasing decisions. They must choose to use or avoid services based on how their personal data is used and protected.

In many respects personal data is a currency.  It might not be the only currency, we can still choose to pay in pounds or dollars, and in many cases simple participation may be sufficient.

We can build a market which acts in the interests of the end user, but only if users regain control and start to make informed decisions about the currency they hand over.


Law and enforcement, conflict and confusion

Today the public are not in control. They are conflicted and confused.

This is described by some sociologists as cognitive polyphasia: we want privacy but we want the benefits of sharing. We choose to use privacy-invasive services because they bring rewards, despite privacy worries.  And less invasive alternatives seem slow to emerge.

And they're confused by the mechanics of data sharing: what is being shared, with whom, for what purposes; the length of time data is to be retained and who can see or access the data.

So can and should the law play a role in protecting consumers?

Thursday 23 February 2012

Statement on our PAC president Eric Joyce MP

Eric has shown tremendous insight and interest in the internet and digital policy-making and I was therefore thrilled when Eric last summer agreed to be the president of our Policy Advisory Council.

I have come to know Eric well, having worked with him and Pictfor - the Parliamentary ICT Forum - over the last two years on a variety of legislative areas.

I was shocked to hear news of the alleged assault this morning. Eric has in the past invited me and other Open Digital PAC members to numerous events in the Commons, including drinks in the Strangers' Bar at the centre of last night's incident.

On our numerous meetings I've seen nothing but an honest, hard working and passionate MP who genuinely cares for the issues that interest me, Open Digital and internet users in general.

Importantly, Eric has taken great interest in opening up the workings of Parliament, giving a voice and insight to many interested parties.  He has hosted and attended many internet-themed events inside and outside Parliament, such as a Social Media Governance panel I sat on with Eric during Social Media Week.

Again, the news this morning came out of the blue and the allegations levelled against him are totally out of character for the man I know Eric to be.

It's clear Eric now faces a serious accusation. Above reiterating that Open Digital values Eric's input and commitment to digital policy issues there's little else I want to say or do until the police investigation concludes.

James Firth

CEO, Open Digital Policy Organisation

Tuesday 7 February 2012

Search neutrality at Pictfor, 6th February: did Google abuse its market position? Should we even care?

Search neutrality broadly describes the subject of skewing search engine results to favour one website or category of websites over another.  The skewing can be done by hand, i.e. maintaining lists of sites to favour or penalise, or by tweaking the algorithms that derive search results.

Discussions tend to fall into three brackets:
  • Should search engines "clean up" results to remove infringing or unlawful content? 
  • How can accountants and business owners quantify and mitigate search engine dependency risks for businesses whose income is highly dependent on the secret ranking algorithms of search providers, algorithms subject to change without notice? 
  • Antitrust and whether search engines abuse their role as gatekeepers in order to promote their own goods and services above rivals, and what if anything can be done about this?
Last night's event at Pictfor in the Grand Committee Room at Parliament focussed on the latter - the potential for market abuse - and heard from Shivaun Raff, co-founder and CEO of Foundem, a British vertical search engine (categorised search - a type of price comparison site for any product), security consultant and blogger Alec Muffett and internet veteran entrepreneur Mark Margaretten.

Monday 6 February 2012

Policing and social media, the Surrey Police App, Digital Surrey 25th January

Below are the videos from January's Digital Surrey, which Open Digital sponsored with the help and support of Surrey Enterprise and the University of Surrey.

Surrey Police have created a mobile phone app to help with neighbourhood policing.  I won't take anything away from Chief Superintendent Gavin Stephens (@CCuptStephens) and developer Angus Fox (@nuxnix); you can hear them describe the project themselves and answer questions in the videos below.

Thanks very much also to the team at thebluedoor for help with organising and London Corporate Media for kindly donating the videos.


Surrey Police view
 

Developer view
 

Open Digital sponsor's Talk

Thursday 2 February 2012

Identonomics - the economics of online identity, part 1: personal data is the currency

Skip to: [part 2] [part 3]

Facebook's IPO reveals the value of online audience today. Each registered user is worth on average $4.38 per year in revenue, or $1.18 in profit.  The business is hugely profitable, making $1bn profit from $3.7bn revenue.

483m people use Facebook every day, and we know from previously released data that Facebook gets around 100 billion hits a day.  Together these stats give a very rough idea of the very low revenue Facebook receives per page impression and per daily unique user: around 2 cents of revenue per active user per day, or about a dollar from every 10,000 hits.
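As a sanity check, those back-of-envelope figures can be reproduced directly (a quick sketch using only the numbers quoted in this post):

```python
# All inputs taken from the figures quoted above.
revenue = 3.7e9        # annual revenue, USD
daily_users = 483e6    # daily active users
hits_per_day = 100e9   # reported hits per day

# Roughly 2 cents of revenue per daily active user per day...
revenue_per_user_per_day = revenue / 365 / daily_users
print(round(revenue_per_user_per_day, 3))  # 0.021

# ...and about a dollar from every 10,000 hits.
revenue_per_10k_hits = revenue / (hits_per_day * 365) * 10_000
print(round(revenue_per_10k_hits, 2))  # 1.01
```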

It's worth noting that not one cent of this income comes from Facebook's core users. Its services are free at the point of access.

But its users are paying in one way or another. EU Commissioner Viviane Reding said in a speech last Wednesday (25th January, video here):
“Personal data is the currency of today’s digital market”
On one hand, Reding is wrong; for ad-funded services free at the point of access, the value is in the audience and participation is the currency.

But on the other hand she's spot on, as our "spending decisions" when choosing how to use free online services must be based on how much we are prepared to reveal about ourselves.

Personal data must start to be seen as a currency - if there is to be any hope of market forces conspiring in the public interest rather than the interest of advertisers.

Tuesday 31 January 2012

Help fund a policy organisation

I never thought it would be easy and I knew I didn't have all the answers.

The challenge is to find ways to fund the kind of research and advocacy we're doing here at Open Digital without becoming enslaved by any single person or organisation.

If you can help in any way, by:
  • Spreading the word
  • Employing the services of our consulting arm, Open Digital Consulting (bound by charter to donate at least 50% of its profits to the policy organisation)
  • Becoming a donor, shareholder or both
Please get in touch: support@opendigital.org or see more information on our Standardised Personal Data License project.

The funding conundrum

As I said at last Wednesday's Digital Surrey, as an organisation we must avoid some of the pitfalls of other groups working on digital policy.

Read more about some of the pitfalls of policy funding here.

Data ethics - why should your business care?

What's in it for your business, when much of our work seems to be focussed on public interest?

Because I firmly believe public concern must be embraced by any digital business aiming for sustainability.

The long term interest of any sustainable business is closely aligned with the interests of its consumers. It's as simple as that.

If you want to make money from the processing of, and trade in, data about individuals you will benefit from the trust and support of those individuals.

Acting in a highly ethical manner is the key to gaining trust, and even ethical businesses need help and support convincing the world of their ethical credentials.  If your business aims to do more than make a quick buck it needs to take proactive steps towards understanding the issues.

Our organisation's own plan for long term sustainability

My plan is to make Open Digital a sustainable self-funding organisation, but it will take time to get to this stage, and therefore we need cash help getting there.

I believe the organisational structure will then promote policy research in the public interest through semi-independent oversight of the policy organisation by our Policy Advisory Council (PAC).

Our structure also minimises the impact of cash dependency.  Our founding charter guarantees the policy organisation will receive 50% of the profits from the consulting business, and the PAC gets to decide how this is spent.

Open Digital - structured to promote community interest
But my plan requires Open Digital Consulting to be profitable, and that won't happen overnight.  Hence my decision not to launch as a non-profit, but to offer shareholders willing to stand by the whole organisation a chance to share in the long term profits.

The first 9 months

For a fledgling policy organisation with only very weak connections to the political scene we've had an amazing start.  I attended ministerial meetings with Communications Minister Ed Vaizey to discuss web blocking, plus a 5-minute interview on Sky News' Jeff Randall Live on the same subject.

Our first report (pdf) was referenced in print on page 2 of the Financial Times and has been cited in numerous other reports.  I've been quoted on privacy, security and copyright issues in most major computer magazines.

I'm also told that our paper arguing against the government's plans for a Public Data Corporation (pdf) helped persuade decision makers inside the Cabinet Office to shelve the plans, opting instead for a more open approach to public data.

Aims and forthcoming projects


(More information on our Standardised Personal Data License project)

Our long term goal is to improve trust in digital products and services for the benefit of all; we believe this can be done whilst maintaining the principle of a free and open internet through fair market competition.

One of the barriers to trust we have identified is clarity and transparency over what data about us is being gathered and how that data is being used and traded.

Only through clarity and understanding can consumers make informed choices about what level of personal information to share with any given service.

If personal data is the new digital currency, a catchphrase many are using, we currently have a confusopoly in the market place.

In a confusopoly the "price" of using the service may be disclosed, but it is too confusing for consumers to understand.  Consumers therefore make bad choices, and less-than-ethical businesses are able to profiteer.

A project we hope to launch over summer aims to iconify privacy, bringing clarity to users to help cut through the confusopoly.  It's been tried before, but that is not going to stop us:


Note, the icons pictured are just examples. We want to rank privacy on a simple scale of 1-6, maybe adding additional information to distinguish between passive tracking and active data gathering.

The final icon design will be decided through a community project, and that itself introduces a challenge of ownership in the end result. We need intellectual property in the icons to prevent misuse, but the community needs to see that Open Digital will never profit unduly from a community project.

We therefore propose to hold the intellectual property in the icons in trust.

So...

If you run a business or are a reasonably wealthy individual with an interest in privacy, trust or digital policy, please consider becoming a donor-shareholder.

Significant shareholders get a seat at the table of our truly unique organisation, and a chance to share in half the profits if we achieve our aims.

Alternatively, if your business wants insight or training on any digital policy area, please consider using the services of Open Digital Consulting.  Part of our fee will go towards supporting our policy work.

And above all, if you like what we're doing, please spread the word and get involved.  Tell us about your concerns. Email contact@opendigital.org or engage with us on Twitter: @open_digital.

The funding conundrum

Funding a policy organisation in a way which provides reasonable independence and allows fair reward for those researching policy areas is the most significant challenge Open Digital faces.

Some think tanks stand accused of becoming enslaved to their backers, afraid to stray into policy areas their backers find uncomfortable; and, occasionally, promoting the vested interests of their backers.

Sponsored reports inevitably lead to research and advocacy following the money. And because incumbents have the most cash to spend, the policy areas incumbents want examined get examined. But who stands up for the innovators, emerging businesses and the public?

Academic organisations generally act with independence, but are often slow off the mark for analysis and research into rapidly evolving technology.

Civil society groups also play an independent role; but, reliant on public donations, it's necessary for such groups to create a certain level of noise in order to maintain support, and sometimes the message gets lost in the noise.

And for an organisation attempting to influence public policy, it is simply wrong for us to charge for access to reports and white papers.  Research which affects public policy must be open to public scrutiny; it's as simple as that.

Read about our funding plans here.

Thursday 26 January 2012

Polarised debate over EU data protection must be a wake up call for big data corps, they can't have it both ways

There are deep and complex issues to be discussed regarding EU plans for an overhaul of data protection regulation.  Whilst I don't want to rush to judgement on the detail, one thing is clear from the BBC's technology correspondent Rory Cellan-Jones' blog - it's really put the wind up Google:
A senior executive at the search firm told me that two industries which depended on data, advertising and the web, were just about the only things a sclerotic European economy had going for it, and now both were in danger of being strangled by bureaucracy. 
"The data protection directive sees data as a bad thing," said the executive
Whilst many see the combative approach of EU commissioner Viviane Reding and her equally bullish partner-in-crime-cum-potential-adversary "Steelie" Neelie Kroes as a good thing - a positive part of a fight for control over our data, on the basis that the law is the only language the big data corporations understand - I have some sympathy for Google's apparent frustration.

I don't believe confrontation leads to good law; and by good I mean law which is effective in its aim, proportionate in scope and not overly burdensome to enforce - where enforcement burden covers the level of policing required, the economic cost of red tape and the infringement on personal freedom and autonomy.

Part of my criticism of digital intellectual property enforcement legislation (SOPA, PIPA, ACTA and DEA) is the way these laws were conceived and drafted: in a confrontational environment.  All attempt, in varying degrees, to regulate an industry - the internet industry - without extensive consultation with the industry they're trying to regulate.

In fact, in many cases the internet industry was deliberately excluded from the process; in others, public consultations were carried out only for serious objections from credible voices within the industry to be ignored.

The internet and its relationships with society and commerce are incredibly complex. In no other complex industry would governments attempt to regulate without extensive consultation with the industry itself (e.g. banking, medicine).  Concerns about regulatory capture noted, it is equally unacceptable to completely exclude from the discussion those with the best insight into and understanding of the problem the regulation attempts to solve.

UPDATE 12:53: note I said "exclude from the discussion." Granted companies can lobby and respond to consultations, but I see opportunity for meaningful and constructive dialogue between all stakeholders. Only through such dialogue can we hope to achieve a better understanding of the problem, and only then can laws be drafted to solve the problem. The process as it stands is confrontational.

Data protection and digital privacy are serious legitimate issues of public concern.  But unless legislation is workable one of three things will happen.

Either regulation will be ignored and eventually abandoned due to the enforcement burden; or data companies will exit the EU yet continue to collect, process and sell data on EU citizens - the elastic nature of online jurisdiction means they can still turn a profit without an EU presence, robbing EU states of any economic benefit; or EU businesses and citizens will be denied the benefit of technological advances and low-cost services because certain data practices have effectively been outlawed.

Yes we need to do something as a society to address legitimate concerns, but I personally feel that something has to acknowledge both the scale of the problem and the limited impact legislation can have.

Does the EU data protection regime "see data as a bad thing?"

I believe we need to radically rethink our approach to digital privacy and data protection.  The harm spectrum is broad and our current understanding narrow, with a lot of grey in the middle.

We need to separate the two concepts of personal data and privacy. They are fundamentally different. We need to work with the internet industry as a whole to draw the boundaries on privacy to prevent intrusive monitoring of what are clearly private actions.

But there is a possibility we need to scale back our data protection demands in some areas, where they relate to data captured in the online equivalent of a public space.

Yet my sympathy for Google and other technology giants is tempered by their unwillingness to date to engage with privacy advocates and fund a broad cross-section of community-led policy research.

Yes there is self interest, because Open Digital needs financial support to reach our self-funding goal (see slide 9 from my presentation at Digital Surrey last night), but I hear this from many advocates involved in the digital privacy debate: only this morning I saw, from Privacy International advisory board member Alexander Hanff:
I'm about as happy with a state-driven approach to privacy as I am with a corporate free-for-all. At the heart of the debate is an issue that affects everyone, globally, whether or not they have access to the internet - because the ramifications for global trade and societies around the world if we get this wrong are enormous.

Any sustainable business must see that a key part of sustainability is building trust with its customers, and that relationship is the focus of our research into digital privacy (see slide 7 from my presentation at Digital Surrey last night).

Big corporations can't have it both ways, they must either support independent initiatives to understand the problem with the aim of coming to a broad consensus on what data protection legislation is needed, or risk bad laws stemming from a frustration with the way a fledgling personal data industry has behaved so far.

Wednesday 25 January 2012

O2 sends your phone number to every website you visit, should you be bothered?

In a word, yes; I very much think this matters.

Why? Because at the very least it allows any website operator to capture your telephone number and potentially use it to send you spam texts or marketing calls.

UPDATE 25-Jan 13:29: Twitter user @alanbell_libsol reports the problem as fixed and I can confirm it's fixed for me too.

The problem

We have confirmed that UK mobile phone network O2 sends your mobile phone number to each website you visit.

The O2 problem was reportedly spotted by @lewispeckover and has been a known issue in the security industry for 2 years, see Collin Mulliner's 2010 CanSecWest talk.

We know the issue also affects at least one operator using the O2 network, GiffGaff.

We have also confirmed the phone number is sent even if you connect your computer or tablet to the internet via your phone (i.e. tethering).  This strongly indicates the phone number is being injected at some point within O2's network.
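For the technically minded: the number was reportedly injected as an extra HTTP request header by an in-network proxy. The header name below (x-up-calling-line-id) is the one widely reported at the time; treat the code as an illustrative sketch of how easily a site operator could log it, not a description of any particular site's implementation.

```python
def extract_msisdn(environ):
    """Return the phone number injected by a carrier proxy, if present.

    In WSGI, a request header named 'x-up-calling-line-id' arrives as
    environ['HTTP_X_UP_CALLING_LINE_ID'].
    """
    return environ.get("HTTP_X_UP_CALLING_LINE_ID")

# Simulated request as it might have arrived via the affected network
# (the number shown is made up):
request = {
    "HTTP_USER_AGENT": "Mozilla/5.0",
    "HTTP_X_UP_CALLING_LINE_ID": "447712345678",
}
print(extract_msisdn(request))  # 447712345678
```

No special tooling is needed on the site operator's side: a single dictionary lookup in any request handler is enough to harvest the number of every visiting subscriber.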

Personal data?

What can someone do with your mobile phone number? On its own, not a lot.  They can call you out of the blue, or send you junk texts.

But think how many of the websites you visit know quite a lot about you: your real name, your address, your hobbies, likes and interests. I choose not to disclose my phone numbers to social networking sites because I don't want to be phoned by strangers selling me things I might be interested in.

Tuesday 24 January 2012

Norwegian Data Inspectorate rules use of Google Apps by companies breaches Norwegian law, cites US Patriot Act

Datatilsynet, the Norwegian Data Inspectorate, has effectively outlawed many corporate uses of Google Apps within Norway on privacy grounds.

Reports are only just emerging (in Norwegian) that a "Notice of Decision" dated 16th January (pdf, Norwegian) states that Norwegian companies making use of Google "cloud" services (known locally as nettskyløsning - essentially Google Apps) under its standard terms "violate the law".

It is unclear at this stage whether the opinion will be challenged in the courts.

The Norwegian authorities cite the US Patriot Act, which gives "U.S. authorities the ability to monitor terrorist suspects without charge or trial" amongst the reasons why a US-led data protection initiative known as US-EU Safe Harbor was insufficient in itself to guarantee compliance with strict Norwegian data protection laws.

Readers are reminded that Norway is not a full member of the EU but, as a member of the European Economic Area, complies with all relevant EC directives.

The Norwegian ruling comes 18 months after the Danish Data Protection Agency reportedly ruled that sensitive personal information could not be stored on Google's cloud platform (Danish) when Odense Municipality planned to use the service to manage student schedules.

It also comes at a difficult juncture for Google, after revelations that cloud data was improperly accessed by Google employees in Kenya in order to "boost its own business".

Complaint

The intervention of the Data Inspectorate stems from a complaint "by an individual" against the municipality of Narvik, a relatively small city whose administrative body was the first government agency in Norway to move to Google Apps, according to Digi.no, an Oslo-based tech blog.

The main issues of the complaint seem to relate to:
  • Where in the world data will be stored, including backup copying, and what protection is available in these countries
  • Who at Google has access to the stored data
  • Whether it is possible for Narvik Municipality to conduct data "safety" audits for the data stored in the Google cloud - essentially what co-operation from Google was required for a satisfactory audit to be conducted
Inspectorate opinion

The ruling relates to the use of Google's email service by the municipality of Narvik, but the principles outlined extend to other Google cloud services.

To summarise, the Inspectorate found that Google does not offer terms and conditions that meet Norwegian law and has no mechanism in place to offer local or customised Ts & Cs; in addition, the Inspectorate did not have access to sufficient technical information to show adequate data protection measures were in place to allow personal data to be exported to countries outside the European Economic Area.

Therefore the use of Google Apps by Norwegian companies (plus, presumably, any foreign-owned company with a Norwegian presence) to process personal data would put those companies in contravention of three sections (13, 15 and 29) of Norway's Personal Data Act 2000.

Monday 23 January 2012

Don't be sucked into supporting a flawed proposition in an artificially polarised copyright debate

The language in the increasingly hostile battle over online copyright infringement resembles that used in the war on terror or the war on drugs:
"wherever you are around the world, we're going to go after you"
Alarm bells should be ringing. Drugs and terrorism ruin lives (in a biological sense). There's no comparison to the commercial problem of designing a market intervention suitable for the digital age to protect those who invest in the creative arts.

Also, arguably, neither war has worked. Ioan Grillo, in his book El Narco, documents the shifting power struggle in the battle to control the supply of illicit drugs and the social and economic impact of various enforcement strategies, including the chilling effect on communities demolished as the powerful slog it out.

Absurd as it may seem, I do see parallels with copyright infringement. If the world's most powerful nation struggles to control supply of a clearly dangerous physical product, how the hell do they expect to police worldwide availability of an intangible product with no known health issues?

There's a lot wrong with the approach to copyright enforcement advocated by many major rights-holders.

But now, with what can only be described as a massive escalation in hostilities, there's a real danger moderate rational-thinking people are being sucked into supporting one of two logically and morally flawed positions; pushed into taking sides in an incredibly complex debate, or backing unhelpful and retaliatory actions of questionable legality (on both sides), simply because one side makes more sense than the other.

Madness, since we're not dealing with a binary debate.

Monday 16 January 2012

Open Digital will join global protest against copyright anti-piracy overreach, Weds 18th Jan

Today our Policy Advisory Council voted unanimously in favour of Open Digital joining a global protest against two US bills to tackle copyright infringement that we feel go too far.

The motion was simply "should we join the protest?" All six members voted in favour.

Here's what I told PAC members when I proposed the blackout:

Wednesday 11 January 2012

What happened when a police force commissioned a neighbourhood policing app? Find out on 25th January

Open Digital are proud to sponsor this month's Digital Surrey with a fascinating talk from Surrey Police's Chief Superintendent for Neighbourhoods Gavin Stephens.

Book your free place here 

Gavin will be introducing the Surrey Police App, currently available for iPhone (and soon to be available for Android), and will be joined by Angus Fox, Director of social collaboration software firm Multizone, which developed the app.

We will hear about the commissioning and deployment of the Surrey Police App, which connects residents to their neighbourhood police teams, as well as a range of topics from government issues to the social impact on neighbourhoods and public concerns such as privacy.

The University of Surrey will be hosting the event with sponsorship from the Open Digital Policy Organisation.

Unfortunately space is limited and previous demand has been considerable. Please let us or the organisers know if you have to drop out for any reason - this also helps Digital Surrey keep such events free to attend.


Monday 9 January 2012

Lego's augmented reality kiosk - great use for AR without the lag or fiddle factor

This is worth sharing; it's only the third time I've been excited by an application for augmented reality (AR).

Like the other two applications (goggles for emergency services and the Xbox Kinect), the Lego AR kiosk satisfies three main criteria: it serves a purpose, is low on fiddle factor and the AR application operates with minimal lag.

Many other applications fall at one or more of these hurdles. I find most phone-based AR, like Layar, slow to initialise, laggy and fiddly to use.

AR applications for desktop tend to lack purpose and have some fiddle involved in setting up a webcam. 

The Lego AR kiosk is a great idea and we could see similar kiosks appearing in shops around the world.

 
(ht @PaulBaldovin)

Friday 6 January 2012

The 2009 failed prosecution by the Met's Obscene Publications Unit for a written fantasy/horror story

Via Twitter I followed the trial of Michael Peacock, acquitted this afternoon in Southwark Crown Court on 6 counts of distributing obscene DVDs under the Obscene Publications Act 1959.

The prosecution was widely condemned by the liberati of Twitter, and the acquittal therefore welcomed as an important victory against state censorship on a moral level.

But if you sympathise with Peacock, it's worth sparing a thought for Darryn Walker and what is probably one of the most important failed UK prosecutions in the history of the internet.

In 2007 Mr Walker published a fantasy/horror story about the torture, mutilation and rape of members of pop group Girls Aloud on a Usenet newsgroup.  (The story remained online for at least a year under the alt.sex.stories hierarchy but has, from a cursory check today, since disappeared.)

Shocking and disturbing, sure - but should such fiction be illegal? And when it's hidden in some internet backwater where's the harm?

And, since the story involved selling body parts of the dismembered songstresses on eBay, one can only assume the work contained a level of comedy and/or satire.

Soon after publication, the most unlikely of moral guardians, The Daily Star, set in motion an 18-month legal ordeal that saw Darryn lose his civil service job:
"We can reveal that Interpol has been notified to help track down the man behind the bizarre work"
Daily Star, 26th July 2007
The Daily Star claimed Girls Aloud members were being "stalked by a vile internet psycho".  Sensationalised coverage in the mainstream gutter press brought the story to a whole new audience.