An analysis of 1,400 Facebook accounts, more than 143,000 posts Liked, and more than a million pages could go some way to unmasking the techniques of “Like Farms” used to manipulate reputations on the content-with-ads network.
The boffins, from Australia's (doomed) NICTA, University College London, and the University of Iowa, say their aim is to “distinguish between legitimate and misbehaving reputation-building activities, and ultimately detect malicious accounts”.
While there are already tools trying to do this – the authors cite CopyCatch and SynchroTrap, which Facebook already uses – their paper on arXiv says they're not particularly effective.
One “Like Farm”, they said, ran 621 accounts, and in a six-month period only one of those accounts was spotted and terminated.
The study used 13 empty honeypot pages called “Virtual Electricity” – a nice gloss, The Register notes, since “energy from the quantum vacuum” is a modern perpetual motion machine that attracts suckers the world over.
The researchers promoted some pages via Facebook ads with worldwide, USA, France, India and Egypt targeting; the rest were pimped by BoostLikes.com, SocialFormula.com, AuthenticLikes.com, and MammothSocials.com.
Some of the key characteristics to arise from the analysis include: “accounts from the same Like Farms like very popular pages or relatively niche pages.
“We found that Like Farm accounts post text with significantly lower lexical richness and entropy, and are much more interactive than regular users.”
In fact, they note, Like Farm posts exhibit bot-like behaviour, reposting lots of material with little originality, and when a Like Farm account does say something, its posts are likely to “have fewer words and poor vocabulary, [and] are targeted to a limited number of topics”.
Even so, because this is their reason for existing, Like Farms “generate more comments and Likes” than real users, and their operatives are trained to Like one another's posts, tag their friends, and concentrate their Likes around specific pages.
The bad spelling and grammar, the authors note, “suggests the opportunity to incorporate, in malicious/fake account detection tools, not only activity thresholds, but also other features such as lexical analysis, which is part of our future work”.
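To give a flavour of what “lexical richness and entropy” features might look like in practice, here's a minimal sketch. It uses type-token ratio as a stand-in for lexical richness and Shannon entropy over the word-frequency distribution; these are illustrative approximations, not the paper's exact definitions, and the `lexical_features` helper and sample posts are hypothetical.

```python
import math
from collections import Counter

def lexical_features(text: str) -> tuple[float, float]:
    """Return (lexical richness, Shannon entropy in bits) for a post.

    Richness here is the type-token ratio (distinct words / total
    words); entropy is computed over the word-frequency distribution.
    """
    words = text.lower().split()
    if not words:
        return 0.0, 0.0
    counts = Counter(words)
    richness = len(counts) / len(words)
    entropy = -sum((c / len(words)) * math.log2(c / len(words))
                   for c in counts.values())
    return richness, entropy

# A repetitive, farm-like post scores lower on both measures than
# varied natural text of similar length.
farm = "win win win free likes free likes like like like win free"
real = "the study tracked honeypot pages promoted through paid campaigns"
print(lexical_features(farm))
print(lexical_features(real))
```

A detector could threshold or combine scores like these with activity features (Like bursts, comment rates) rather than rely on either signal alone.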
Spying on Spammers: Tracking Like Farm Accounts on Facebook was put together by NICTA's Muhammad Ikram, Arik Friedman, Guillaume Jourjon and Mohamed Ali Kaafar; Lucky Onwuzurike and Emiliano De Cristofaro of University College London; and M. Zubair Shafiq from the University of Iowa. ®