Analysis

It's a seemingly endless topic of conversation: what do we do about the fact that Google and Facebook have built vast databases on us as individuals?
The discussion ranges from rallying cries ("If it's free, you are the product!") to privacy fears (how much of this information does the government have?) to anecdotal evidence of the dangers of oversharing (being fired from jobs, being publicly shamed, and so on).
Unfortunately, despite many years of furious debate, the end result is invariably either a shrugging of shoulders or nannying strictures about what people shouldn't say or do online if they don't want to be embarrassed later.
Fortunately, we now have at least one paper that tackles the issue head-on and asks: so what do we actually do about this?
The Privatization of Human Rights: Illusions of Consent, Automation and Neutrality [PDF] by internet governance expert Emily Taylor digs into why the accumulation of all this data about us is a problem and what pragmatic steps might resolve it.
First, if you are one of those people who think the whole issue is overblown, the paper provides some stark evidence of the near-monopolistic hold that two companies have over all of our online lives. Google and Facebook are the two most popular websites across the entire world, and their global dominance is striking.
Then there is the fact that these websites draw up their own terms of service – and they draw them widely. They grant themselves full access to user data; full access to "private" chat and emails; full access to your location at all times. And it's not your information either. They can delete, modify, share and sell your information at any point without requiring your consent and without even needing to tell you.
The companies we use more than any other are also, in many respects, extra-legal. Unless you live in California, if you have an issue with Google or Facebook you are required to raise it in a foreign jurisdiction.
The standard rejoinder is: well, if you don't like it, you don't have to use the service. This, Taylor argues, is the "illusion of consent."
The fact is that the "standard terms" of these online giants are standard only to themselves: they do not incorporate well-established principles and concepts of necessity and proportionality which are used in every other field to judge how much intrusion into people's privacy is acceptable. Taylor also notes "there is little evidence of 'reasonableness,' a flexible safeguard that guides interpretation of consumer contract terms across the European Union according to unfair contract terms legislation."
Again, were such an approach taken by public authorities or institutions, it would be only a matter of time before there were audits and public hearings. Those well-developed systems of transparency and accountability are still missing from our online lives.
As for the third "illusion" that Taylor identifies, the "illusion of neutrality," this is perhaps best demonstrated by the fact that Facebook has occasionally chosen to reveal experiments it has run on its own users, including altering the "emotional content" shown in their news feeds.
Unsurprisingly, when users found out they were being used as unwitting guinea pigs, they were not exactly thrilled. Or, to put it another way: they were furious.
While these were particularly stark examples, the truth is that Google and Facebook control what people see according to their own rules and whims. Google's rankings have the power to make or break millions of businesses, and Facebook's efforts to partner with specific media companies threaten to give prominence to whichever outlets get along best with the social media company.