Tech titans like Facebook – itself described as a "digital gangster" – continually fail to address the risks their platforms pose to democracy, so the British government should regulate, MPs have said.
The House of Commons Digital, Culture, Media and Sport Committee has been conducting an inquiry into "fake news" – which it acknowledged is now an inappropriate moniker – for the best part of two years.
The inquiry has involved 23 oral evidence sessions, received more than 170 pieces of written evidence and heard from 73 witnesses – none of whom, to the great frustration of the committee, was Facebook boss Mark Zuckerberg.
The committee today published its final report (PDF), which includes 110 pages of damning criticism of Facebook's attitude to user data privacy and security, the UK's electoral laws and the current state of regulation.
In the report, the MPs call for reforms to electoral laws which they say are "hopelessly out of date for the internet age", as well as an independent regulator with statutory powers over social media giants.
The document repeated many of the recommendations made in the July interim report, but the main theme is that the government needs to put a stop to tech giants' self-regulation, and rebalance the power between the platforms and the public.
A central tenet of this is a compulsory Code of Ethics that requires companies to deal with harmful and illegal content that has been referred to them by users, and content "that should have been easy for tech companies themselves to identify".
The MPs also call for greater transparency into online political campaigning – such as a searchable, public repository saying who paid for, sponsored and is targeted by each ad – and investment into digital and data literacy for the public.
'Facebook intentionally broke data privacy laws'
Beyond such recommendations are detailed discussions of the Cambridge Analytica saga, covering the many companies brought under that umbrella – including the firm of Leave campaign mega-funder Arron Banks – which have been fined by the UK's data protection watchdog, and, of course, Facebook.
Because, while the inquiry started as a probe into the spread of misinformation on the internet, the group was in the right place at the right time to consider the trendy topics of political micro-manipulation and data harvesting.
Soon after the scandal broke, Facebook became a prime target for the committee, and Zuckerberg's repeated refusal to give evidence to the group only stoked the MPs' anger – a frustration referenced repeatedly in the report.
At one point, efforts to bring him in seemed to reach vendetta levels: an international committee of parliaments formed only to be rebuffed by the man at the top of the social network, and archaic Parliamentary powers were used in unprecedented fashion to seize – and later release – damaging emails about the Zuckerborg.
Summing up his committee's view of Facebook, DCMS Committee chairman Damian Collins said the firm had "deliberately sought to frustrate our work, by giving incomplete, disingenuous and at times misleading answers to our questions".
He said firms like Facebook "exercise massive market power" and make money by "bullying the smaller technology companies and developers who rely on this platform to reach their customers" – statements based on the emails the committee released.
The MPs also complained that Facebook was "unwilling to be accountable to regulators around the world" – and that the government needed to consider the impact of such monopolies on democracy.
However, the report went on to warn that if companies become monopolies, "they can be broken up, in whatever sector" – and noted that Facebook's handling of personal data is a "prime and legitimate" area for regulators' scrutiny.
And, although the committee is not a legal or regulatory body, it opined that Facebook had "intentionally and knowingly violated both data privacy and anti-competition laws".
It also claimed that Facebook had violated the 2011 consent decree established with the US's Federal Trade Commission (FTC) to curtail developers' previously unfettered access to users' information; Facebook was required to obtain consent from users in order to share their data.
"The Cambridge Analytica scandal was facilitated by Facebook's policies. If it had fully complied with the FTC settlement, it would not have happened," the report said.
The committee called on the Competition and Markets Authority to assess the email cache it released, "to decide whether Facebook is unfairly using its dominant market position in social media to decide which businesses should succeed or fail".
Similarly, the Information Commissioner's Office should carry out a "detailed investigation" into the way Facebook offers up access to users' data to other companies, and the committee said it hoped its evidence would be beneficial to an investigation under way by the Irish Data Protection Commissioner.
The group also reiterated previous calls for the National Crime Agency to investigate connections between Cambridge Analytica sister firm SCL Elections and the newly set-up Emerdata, and for a broader probe into the work of strategic comms firms.
"The transformation of Cambridge Analytica into Emerdata illustrates how easy it is for discredited companies to reinvent themselves and potentially use the same data and the same tactics to undermine governments, including in the UK," the report said. "The industry needs cleaning up."
Regulator should get access to algorithms, wield hefty fines
Delving into the committee's proposals, it wants to tighten liabilities and make tech firms responsible for harmful content posted by users.
The Code of Ethics for tech companies – outlined in its interim report – takes aim at social media firms' ability to dodge liability by claiming to be platforms, not publishers.
The code should set out what constitutes harmful content, and this should include harmful and illegal content that has been referred to the companies for removal by their users and – upping the ante – content "that should have been easy for tech companies themselves to identify".
It should be overseen by an independent regulator with powers to launch legal action against those that break the code – and to demand access to information relevant to its inquiries, including companies' security mechanisms and algorithms, to ensure they are operating responsibly.
"If tech companies (including technical engineers involved in creating the software for the companies) are found to have failed to meet their obligations under such a Code, and not acted against the distribution of harmful and illegal content, the independent regulator should have the ability to launch legal proceedings against them, with the prospect of large fines being administered as the penalty for non-compliance with the Code."
The funding for such regulation and the extra work for the regulator should come from a levy on tech firms, the MPs said. Such proposals fell on deaf ears last time, but the MPs now note that the proposal for a digital sales tax "shows that the government is open to the idea of a levy on tech companies".
A central part of the work was on political campaigning online, which has received no end of scrutiny amid allegations and evidence of interference from outside entities in various elections.
The MPs want to tackle this with "absolute transparency" over political ads online, including "clear, persistent banners" on all paid-for ads and videos that say who the source and advertisers are. There should also be a category introduced for digital spending on campaigns, and explicit rules surrounding designated campaigners' roles and responsibilities, the report said.
Beyond this, the report stated, the government needs to assess how UK law defines digital campaigning, and acknowledge the role that unpaid campaigns – such as Facebook groups – play in influencing elections and referenda.
The committee also said it wants to see new powers for the Electoral Commission, including powers to compel organisations it doesn't currently regulate – namely, social media firms – to provide information, and bigger fines, based on turnover, rather than the existing £20,000 cap.
The government should also "put pressure" on social media companies to publicise instances of disinformation and to share information about "foreign interference on their sites" – such as who paid for political adverts, who has seen them and who has clicked on them.
That pressure is – unsurprisingly – proposed as "the threat of financial liability if such information is not forthcoming".
'Big tech must not expand exponentially'
The committee is not so naive as to think it can change the internet: it acknowledged that platforms will continue to coarsen discourse, distort views and polarise debate – and that nations have always faced propaganda masquerading as news.
However, its argument is that the companies that operate in this field can, and should, be better regulated. Crucially, the committee is seeking greater transparency – for instance on the sources of information – and accountability.
Facebook makes money by selling access to users' data through its ad tools, and increases that value by offering reciprocal data-sharing deals to app developers and by seeking to extend its grip on the public – practices MPs want subjected to more policing.
"The big tech companies must not be allowed to expand exponentially, without constraint or proper regulatory oversight," they said.
"But only governments and the law are powerful enough to contain them. The legislative tools already exist. They must now be applied to digital activity, using tools such as privacy laws, data protection legislation, antitrust and competition law." ®
* In case you didn't manage to penetrate the tortuous attempt of our headline at meeting its rhyme scheme and scansion, this is the classic '90s tune we were trying to evoke.