Opinion Personal data is the oil of the internet. The great engines of Facebook and Google pump it relentlessly, burning it at will to power their marketing monetisation magic. The pollution it creates – broken privacy, shattered politics, the corrupting force of hidden agendas – is out of control.
You'd think that the source of this data – that's us – could have some say in how it's used. Some practical way to control it, monitor it, and decide who gets it and what they do with it. There are regulations, but do you feel protected by them? Cookie options not cutting it for you? Didn't think so. We are the means of production, but we don't control them. Time for a revolution, comrades, but if the corporations won't help and the regulators can't, where do we look? How about the BBC?
The BBC has some odd quirks. It has a commitment to make things better for its users – us again – and parts of it take that very seriously. One such part is BBC R&D, a world-class if much diminished collection of engineers and innovators who work on the tech behind the screens and speakers. These days, that mostly means helping create standards for digital broadcasting, including all the online and data things that go with it.
BBC R&D discovered it too didn't much like the way personal data was in the hands of the wrong people. That got in the way of creating better public value from the internet, and the BBC worries about these things.
Public service broadcasting in the 21st century means public service internet. So, in 2017 it started a project called Databox with the University of Nottingham, using ideas kicked off by some cat called Sir Tim Berners-Lee, who's apparently got some track record here. Two years later, work started on prototypes, and last week BBC R&D put out a report on what the first testers thought of it all.
The idea is simple. You keep your personal data stored on an edge device you control. This can be a phone app or an actual appliance. It implements the three strands of what's called Human Data Interaction, HDI, the philosophy at the heart of it all. These three ideas are: legibility, agency and negotiability.
Legibility says you must be able to see and understand what your data is and what's happening to it, including when it's eaten by pesky algorithms. Agency says you get to control what data is kept and what happens to it – not just opt-in and opt-out, but how it's collected and stored, whether you can modify it, and what inferences you allow to be drawn from it. Negotiability says you can see and control the social implications of your data use, and get value from it. Yes, you get to say what you want in return for your data.
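To make the three strands concrete, here is a minimal sketch of them as a personal data store. Every name in it is illustrative – this is not Databox's actual API, just one way the principles could fall out in code:

```python
# Hypothetical sketch of the three HDI principles as a personal data store.
# Legibility: every access is logged where the owner can see it.
# Agency: the owner decides, per service and per field, what is released.
# Negotiability: each grant records what the owner wants in return.
from dataclasses import dataclass, field


@dataclass
class PersonalDataStore:
    records: dict = field(default_factory=dict)    # the data itself
    audit_log: list = field(default_factory=list)  # legibility
    consents: dict = field(default_factory=dict)   # agency
    terms: dict = field(default_factory=dict)      # negotiability

    def grant(self, service, fields, in_return_for):
        """Owner decides which fields a service may see, and on what terms."""
        self.consents[service] = set(fields)
        self.terms[service] = in_return_for

    def request(self, service, fields):
        """A service asks for data; only consented fields leave the box."""
        allowed = self.consents.get(service, set())
        released = {f: self.records[f]
                    for f in fields if f in allowed and f in self.records}
        # Legibility: the owner can always see what was asked and what went out
        self.audit_log.append((service, sorted(fields), sorted(released)))
        return released


box = PersonalDataStore(records={"watch_history": ["Blue Planet"],
                                 "location": "Bristol"})
box.grant("bbc", ["watch_history"], in_return_for="better recommendations")
out = box.request("bbc", ["watch_history", "location"])
print(out)  # only the consented field is released; location stays home
```

The point of the shape, not the code: the default is that nothing leaves, every request is visible after the fact, and consent is a two-sided deal rather than a checkbox.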
It doesn't matter what sort of superbrain data ninja you are: these are not powers easily gained – if at all – over the vast majority of personal data you shed every time you do stuff online. Imagine what it would be like if you could. Berners-Lee calls it turning the world the right way up, and handing control back not just to the ninjas but to everyone.
The first service tested is a recommendation engine that talks to Spotify, Netflix and the BBC. It imports usage data from all three, then creates a unified media usage list and a local profile that exports information to the services to let them suggest things you might like to watch or hear. That doesn't sound any different to what happens now: the crucial part is that you control what you store and send. The services get a better way to recommend content, but they never see the full picture they'd get if they shared that data among themselves. You decide what they need to do their job; they don't skim extra off the top.
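That flow is simple enough to sketch. Everything below – the service names aside – is imagined for illustration, not BBC R&D's actual code: usage data comes in from each service, the unified history and profile live only on the device, and each service gets back just the slice the owner has agreed to:

```python
# Illustrative sketch of the edge-device recommendation flow.

# Imagined per-service usage imports
spotify = [{"title": "Track A", "genre": "jazz"}]
netflix = [{"title": "Show B", "genre": "drama"}]
bbc     = [{"title": "Doc C", "genre": "nature"}]

# Unified media usage list - built locally, never leaves the device in full
history = spotify + netflix + bbc

# Local profile derived from the full history
profile = {
    "top_genres": sorted({item["genre"] for item in history}),
    "titles": [item["title"] for item in history],  # owner keeps this private
}

# Owner's export policy: each service sees only what it needs to recommend
export_policy = {"bbc": ["top_genres"], "netflix": ["top_genres"]}

def export_for(service):
    """Send a service the consented slice of the profile, nothing more."""
    return {k: profile[k] for k in export_policy.get(service, [])}

print(export_for("bbc"))      # {'top_genres': ['drama', 'jazz', 'nature']}
print(export_for("spotify"))  # {} - no consent recorded, nothing exported
```

The asymmetry is the whole trick: the rich cross-service picture exists, but only on your box; each service gets a deliberately lossy projection of it.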
The researchers say the test audience that used the system – young people who don't spend much time with the BBC, in particular – was positive. Audience members liked the control and visibility it gave them; they understood the need to manage personal data but hadn't known how to do it, and this unlocked that door. The purpose of the service – to make better recommendations – worked out too.
So yes, content management is better. And? The thing is, the HDI principles the BBC is testing here apply to all data – like the web itself, they're standards which don't care what sort of data they're dealing with. Financial, behavioural, interpersonal, you name it – if someone's making money by abusing it, this can stop them.
If it all works, then so what? Why would Facebook hand over the control panel of its oil wells to the oil itself? The researchers point out that there are plenty of business models and applications that work just fine in the new system. Most importantly, if new, usable standards are developed and adopted anywhere, they're available for users to demand – and for regulators to stipulate.
Tech regulation stinks when the market gets ahead of the regulators, who don't have the resources or knowledge to keep ahead of ravaging behemoths. If the right tools are available, they'll use them. And this can become the right tool.
We do need a revolution that puts power in the hands of the people, but we probably don't want to shoot the Tsar and his family. A more equitable sharing of power and value, more transparency and accountability, and the ability to say "no" will be disruptive, but in the right way. It's about time we made Google read our terms and conditions – and the world's finest public service broadcaster is on our side. It'd be rude not to. ®