X hiring 100 content cops in bid to tame Wild West of online safety
Maybe those Twitter cuts ran too deep, huh?
Not long after it emerged that X, formerly Twitter, had cut one in three Trust and Safety employees following Elon Musk's takeover in October 2022, the social media platform says it's ready to hire 100 full-time content moderators at a new office in Austin, Texas.
After a long period of speculation, hard figures on moderation cuts were finally pulled out of X earlier this month by Australia's eSafety Commissioner, which requested specific information about what X was doing to meet the Australian government's Basic Online Safety Expectations under the nation's Online Safety Act.
Now, with a couple of days to go before a Senate Judiciary Committee hearing on child safety online, at which CEO Linda Yaccarino is due to appear, X has announced plans to build a new "Trust and Safety center of excellence," as reported by Bloomberg.
According to Joe Benarroch, head of business operations at X, the office will house 100 moderators focused on combating child sexual abuse material (CSAM) as well as enforcing the platform's other rules on hate speech and violence.
"X does not have a line of business focused on children, but it's important that we make these investments to keep stopping offenders from using our platform for any distribution or engagement with CSE [child sexual exploitation] content," Benarroch said. X users must be 13 years old to open an account, and those under 17 cannot be targeted by advertisers. The company claims less than 1 percent of its daily users are aged 13-17.
Yaccarino is to testify before the Senate Judiciary Committee alongside the CEOs of Meta, TikTok, Snap, and Discord, all of whom, X included, stand accused of failing to adequately protect children on their platforms.
"The team is currently being built," Benarroch told Reuters of the Austin office, adding that the goal is to fill the positions by the end of the year.
- Elon Musk made 1 in 3 Trust and Safety staff ex-X employees, it emerges
- X's 2024 plans include peer-to-peer payments in app push
- X reverses course on headlines in article links, kinda
- EU launches investigation into X under Digital Services Act
In an X Safety blog post published on Friday, the company claimed to have "suspended 12.4 million accounts for violating our CSE policies. This is up from 2.3 million accounts in 2022."
This smacks of trying to get a school project done in a couple of days when you've had all summer to do it: handing Yaccarino a list of talking points so the senators can see X is doing something.
But the platform's track record on moderation since Musk took over has been dismal. In December 2022, just months after the billionaire entered HQ carrying a sink for the sake of a dad joke, X dismantled Twitter's Trust and Safety Council, an advisory body formed in 2016 to help the platform tackle hate speech, child exploitation, and suicide-related content.
Even the most forgiving of observers might conclude that moderating inappropriate and illegal content was not a particular priority at that stage in X's evolution. The platform continues to appear more lawless than ever, thanks in no small part to Musk's "free speech" doctrine.
Now that US officials are asking social media companies to take more responsibility for child exploitation on their respective platforms, X has to rebuild content moderation infrastructure that Twitter already had in place.
Whether an aim to get this "Trust and Safety center of excellence" operational by the year's end will wash with senators is another question altogether. ®