Elon Musk made 1 in 3 Trust and Safety staff ex-X employees, it emerges

Oz online safety czar receives evidence of cull, even as platform reinstated thousands of banned accounts

Twitter, the social media service now calling itself X, executed a 30 percent reduction in its Trust and Safety staff globally after Elon Musk's acquisition in October 2022.

That 30 percent cut includes shedding 80 percent of its safety engineers and half of the moderators, according to a report from Australia's eSafety Commissioner.

Evidence of the cull emerged after the independent body, which scrutinizes and regulates what Australians get up to online, issued a legal notice to X Corp in June last year. Under the nation's Online Safety Act, it requested specific information about what Twitter/X was doing to meet the Australian government's Basic Online Safety Expectations in relation to online hate and to enforce its own hateful conduct policy.

In its response, X confirmed that within the cuts to Trust and Safety staff, the number of safety engineers was slashed by 80 percent and the number of moderators it directly employs was halved, while global public policy staff were reduced by almost 80 percent, according to a document published by the Commissioner today.

While reports of cuts to staff in online moderation and safety have circulated since Musk acquired Twitter for $44 billion in October 2022, this is the first time the company has given specific figures on where the ax fell, the Australian commissioner said.

As well as cutting safety and moderation staff, X Corp also let 6,100 banned accounts back on the platform in Australia. Media reports have suggested 62,000 suspended accounts were reinstated worldwide. These accounts have not been placed under any additional scrutiny, X Corp confirmed.

eSafety Commissioner Julie Inman Grant said: "It's almost inevitable that any social media platform will become more toxic and less safe for users if you combine significant reductions to safety and local public policy personnel with thousands of account reinstatements of previously banned users. You're really creating a bit of a perfect storm."

In Australia, a recent eSafety study found that First Nations youth are three times more likely to experience hate speech online than their non-indigenous counterparts. X Corp stated it had not formally engaged with any First Nations organizations after it began laying off safety staff and receiving the legal notice.

Globally, other groups have become alarmed at the lack of moderation and safety on the X platform.

In September, a group of 100 Jewish leaders published an open letter criticizing X and its owner for enabling a "new stage of antisemitic discourse."

In November, the platform sued pressure group Media Matters after it accused the site of allowing antisemitic posts next to advertising. X alleged that the group "manipulated" data in an attempt to "destroy" the platform formerly known as Twitter.

In December, X earned the dubious honor of being the first online platform to have formal Digital Services Act (DSA) proceedings launched against it, with the European Commission accusing it of disseminating illegal content, among other suspected breaches of the recently enacted rules.

X may be in violation of five DSA articles "linked to risk management, content moderation, dark patterns, advertising transparency and data access for researchers," EC EVP Margrethe Vestager said in a statement. ®
