With the United States set to undertake its first Presidential election since the Russian-tinged 2016 race, state governments and social networks are upping their game.
This from the team at Cisco Talos, which cautioned in a new report that while governments and sites are better prepared for disinformation campaigns, the way hostile nation states go about their business has also evolved.
Specifically, said the Talos team, influence campaigns have been outsourced from thinly veiled government operations to multi-layered efforts that often route through a middle tier of marketing firms.
These outfits, it is said, operate for the most part as legitimate digital marketing agencies, but will also lend their expertise to governments that want them to sway public opinion in an election. One such example was AggregateIQ during Brexit.
Since 2016, those sorts of companies have grown more numerous, and as a result are empowering more governments looking to interfere with other nations' elections. This is significant because it levels the playing field.
Russia may be well ahead of China and others, said Cisco Talos threat researcher Nick Biasini, but mercenary marketing firms can be the great equalizer.
"There are starting to be companies that specialize in it," Biasini told The Register.
"The ability of one country over another country is going to be muted by them being able to hire whoever they want."
This is not good news for the US in an election where both China and Russia are expected to use dirty tricks in hopes of swaying American public opinion their way.
There is, however, some good news. Talos also noted that the likes of Twitter and Facebook, two platforms that were vital to the 2016 disinformation efforts, have stepped up their game. Both platforms offer systems to report and label fake or misleading news stories and posts, which should make pulling off the sort of widespread disinformation campaigns seen in the last election extremely difficult.
Rather, the team said it believes attackers may instead rely more on psychological manipulation, linking more stories through fake news sites and focusing on specific groups and communities.
"Although it can be relatively easy to launch a basic disinformation campaign, the ability to do the efficacy and analysis to drive this at a large scale is really difficult to do," Biasini explained.
In the end, it will come down to whether the newly evolved protections in place are enough to withstand the more focused tactics of hostile countries and their private-sector mercenary manipulators.
"At the highest level we are far more aware, people are more aware of what disinformation is, and the states are actively involved," said Biasini.
"I definitely think that we are in better shape, but we won't really know what is going to happen until it happens." ®