Oxford profs tell Twitter, Facebook to take action against political bots

It's just the future of democracy at stake, no biggie


The use of algorithms and bots to spread political propaganda is "one of the most powerful tools against democracy", top academics have warned.

A team led by professors at the Oxford Internet Institute analysed tens of millions of posts on seven social media platforms in nine countries, including the US, Russia and Germany, during elections and political crises.

They were looking at computational propaganda, which is defined as the way algorithms, automation and human curation are used to purposefully distribute misinformation on social media networks.

The work concludes that the problem is real and widespread, with both governments and activists being responsible.

However, the report's authors add that, although social media firms may not be producing the content, they need to take action against it.

"Computational propaganda is now one of the most powerful tools against democracy," lead authors Samuel Woolley and Phil Howard write. "Social media firms may not be creating this nasty content, but they are the platform for it. They need to significantly redesign themselves if democracy is going to survive social media."

The analyses point out that the way that such propaganda is used differs depending on the system of government.

In authoritarian countries, it said, social media platforms are a primary means of social control, while in democracies the platforms will be used by different actors to try to influence public opinion.

"Regimes use political bots, built to look and act like real citizens, in an effort to silence opponents and push official state messaging," the report said.

Meanwhile, "political campaigns, and their supporters, deploy political bots – and computational propaganda more broadly – during elections in attempts to sway the vote or defame critics" and run coordinated disinformation campaigns and troll opponents.

The techniques appear to be working – the network analysis of US social platforms showed that bots "reached positions of measurable influence during the 2016 US election".

By infiltrating the core of the network and the "upper echelons of influence", computational propaganda – and bots – had a "significant influence on digital communication" during the election.
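The report does not publish its code, but the kind of network analysis it describes can be sketched in a few lines. The example below is purely illustrative: it builds a toy retweet graph and ranks accounts by k-core level and degree centrality – standard measures of how deeply embedded a node is – to show what "positions of measurable influence" can mean in practice. The account names, the data and the choice of metrics are assumptions for illustration, not the researchers' actual method.

```python
# Illustrative sketch only: ranking accounts in a toy retweet network.
# Assumes a list of (retweeter, original_author) pairs; the names and
# the use of k-core decomposition are assumptions, not the OII method.
import networkx as nx

retweets = [
    ("bot_account_1", "candidate_a"),
    ("bot_account_1", "bot_account_2"),
    ("bot_account_2", "candidate_a"),
    ("human_user_1", "bot_account_1"),
    ("human_user_2", "bot_account_2"),
]

# Build an undirected interaction graph from the retweet pairs.
g = nx.Graph()
g.add_edges_from(retweets)

# core_number gives each node its k-core level: higher values mean the
# account sits in a more densely interconnected part of the network,
# a common proxy for influence within the conversation.
core_levels = nx.core_number(g)
centrality = nx.degree_centrality(g)

for node in sorted(g.nodes, key=lambda n: (-core_levels[n], -centrality[n])):
    print(f"{node}: core={core_levels[node]}, degree centrality={centrality[node]:.2f}")
```

On a toy graph like this, a handful of coordinated bot accounts can end up in the highest core alongside the candidate account they amplify – which is the sort of signal the researchers describe when they talk about bots infiltrating the network's core.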

According to the researchers, many social media platforms are "fully controlled by or dominated by governments and disinformation campaigns" – the Russia report found that 45 per cent of Twitter activity in the country is managed by highly automated accounts.

But Ukraine has "perhaps the most globally advanced case of computational propaganda", the academics said, with numerous campaigns having been waged on Facebook, Twitter and the Russian social network VKontakte since the early 2000s.

There are also cases of authoritarian governments trying to influence the political agenda in other countries: Chinese-directed campaigns have targeted Taiwan, while Russian-directed ones have targeted Poland and Ukraine.

In contrast to these countries, Germany is taking an overtly cautious approach to dealing with computational propaganda.

"All of the major German parties have positioned themselves in the debate surrounding bots and committed to refrain from using them in campaigning," writes Lisa-Maria Neudert, author of the German report.

The country has taken regulatory measures within existing legal frameworks, while a new law that would hold social networks liable for computational propaganda on their platforms has been proposed.

However, she said that the debate "lacks conceptual clarity", with prevailing misconceptions and confusion about the terminologies used during discussions.

On top of analysing individual cases, the research looked at the broader challenges facing further investigation of computational propaganda.

These included "sleeper bots" – dormant bot networks whose activity falls below the formal threshold for being classed as active bots – and the fact that the people aiming to influence the agenda are getting wise to such analyses.

"We have found that political actors are adapting their automation in response to our research," the report reads. "This suggests that the campaigners behind fake accounts and the people doing their 'patriotic programming' are aware of the negative coverage that this gets in the news media."

There has been much debate about the influence of fake news and political bots in recent campaigns, both at home and overseas, as well as the related issue of data protection.

Last month, the Information Commissioner's Office announced it was launching a probe into the way political parties used voters' personal information to run targeted campaigns. ®
