To stop web giants abusing privacy, they must be prevented from respawning. Ever
History tells us tech companies just get bigger, even after being broken up or battered
Column Thriving amidst the pervasive chaos of 2020, the world’s largest technology companies - the FAANGs*, as we’ve come to know them - have managed to grow larger, richer and more powerful.
That’s wonderful for shareholders, but quite a problem when it comes to the relations between these new superpowers and the nations they operate within.
Multinational entities always exploit their capacity to play nations off against one another in search of tax breaks or favourable regulatory environments. This zero-sum strategy means each FAANG win equals a loss for a national government - and governments tend to hold grudges. Tally up enough black marks, and even a trillion-dollar business might find itself in a fight for its life.
It certainly seems as though the knives are out for two of the FAANGs - Facebook and Google. In the US, the Department of Justice and several state attorneys general will soon file an antitrust lawsuit; the EU wants to rein in the data gathering and coercive business practices of both; even plucky Australia has thrown its hat into the ring, empowering its competition regulator to claw back some of the hundreds of billions of dollars a year in revenues hoovered up by the pair. The FAANGs may be colossi, arrogantly striding the Earth, but the actions of scores of Lilliputian governments may yet bring Gulliver low.
But breaking up is hard to do. IBM and Microsoft survived breakup attempts, while AT&T actually grew far larger, post-breakup, than it ever could while operating as a government-sanctioned monopoly. So regulators face a dilemma: let these organisations run rampant, or turn them into modern-day Hydras - spawning a new head with every antitrust amputation.
For any containment to be successful, regulators will first need to deprive these organisations of their respawn superpowers, generated by an unholy combination of "user profiling" and "engagement".
Both firms know their users better than those users know themselves: digital intellects vast, cool and unsympathetic observe trillions of interactions, then apply those observations to build predictive models that direct and shape "engagement". Continuous surveillance reveals our weaknesses, and those weaknesses are fed back to us to exploit our credulity, our prejudices, and our expectations.
Despite repeated warnings, the public seems to have been happy to maintain its digital addictions - although some got a better view backstage last month, when Netflix released The Social Dilemma. Over the course of a few days, tens of millions got a look inside the belly of the beast, and understood - some for the first time - that they aren't the user, they're the product.
From fifty-plus years of anti-smoking efforts, we know that getting people to stop using something they know is bad for them won't be easy. But we could at least level the playing field with a different kind of amputation: regulations barring the use of profiling data to drive engagement.
Created to boost the "stickiness" of content, these machine-learning-driven systems create powerful feedback loops between users and content providers. They're the engine room that keeps billions scrolling, liking and posting. Cutting that loop breaks the spell that holds users in thrall. The firms will not like it - reduced to slow, expensive, organic engagement - but users will gain a newfound agency: an ability to look away from the blinking lights of today's shiniest outrage.
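The loop described above can be sketched as a toy model. In this illustrative Python snippet (everything here - the item names, click probabilities and update rule - is invented for the sake of the example, not drawn from any real platform), a recommender learns from clicks which content a user is most susceptible to, then keeps serving it:

```python
import random

def recommend(scores):
    """Serve the item with the highest learned engagement score."""
    return max(scores, key=scores.get)

def simulate(user_bias, rounds=1000, lr=0.1, seed=42):
    """Toy engagement loop. `user_bias` maps item -> click probability,
    standing in for the 'weaknesses' profiling reveals. Each click nudges
    that item's score up, so it gets shown again: engagement begets
    engagement."""
    random.seed(seed)
    scores = {item: 0.5 for item in user_bias}  # neutral prior for every item
    for _ in range(rounds):
        item = recommend(scores)
        clicked = random.random() < user_bias[item]
        # The feedback loop: the score drifts toward the observed click rate
        scores[item] += lr * ((1.0 if clicked else 0.0) - scores[item])
    return scores
```

Run with a user slightly more drawn to outrage than to cat videos, the system rapidly locks onto the outrage and serves little else. The regulation the column proposes amounts to severing the update step: recommendations could no longer learn from profiled behaviour.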
Early in the 20th century, a series of food and drug laws in the United States regulated both the purity of and access to a range of substances that had proven to be addictive, toxic, or both. Now that we've learned how to replicate those sorts of psychic effects with digital equivalents, we need to look to regulate these digital systems - and not just the companies that peddle them to the public.
They're potent, potentially dangerous, and always come at a cost. Their use needs to be carefully regulated - just as we regulate addictive drugs. And just as we wouldn’t prescribe addictive drugs to billions of people, we have to ensure that these technologies are never deployed at scale again.
The crisis we face today most closely resembles that of China during the Opium Wars. Furiously trying to defend itself and its people from colonial powers that used illegally imported addictive drugs to gain a commercial foothold within the nation, China struck out - and lost. Today's national governments, weakened by the very powers they seek to contain, face a similar threat. To contain these modern monsters, we'll need to learn from history - and act quickly. ®
* Facebook, Amazon, Apple, Netflix and Google