Social media platform X, formerly Twitter, has been slammed by Australia’s eSafety Commissioner for not protecting the online safety of its users.
eSafety Commissioner Julie Inman Grant said X has created a "perfect storm" for toxicity online, after a transparency report found that X Corp. has cut its global trust and safety staff by a third.
This included an 80% cut in the number of safety engineers since the company was acquired by billionaire Elon Musk in October 2022.
The company has also reduced the number of moderators it directly employs by more than half, while the number of global public policy staff has been cut by almost 80%.
X has also reinstated more than 6,100 previously banned accounts in Australia since the October 2022 acquisition, 194 of which had been suspended by the platform for hateful conduct violations.
"It’s almost inevitable that any social media platform will become more toxic and less safe for users if you combine significant reductions to safety and local public policy personnel with thousands of account reinstatements of previously banned users," Inman Grant said.
“A number of these reinstated users were previously banned for online hate. If you let the worst offenders back on while at the same time significantly reducing trust and safety personnel whose job it is to protect users from harm, there are clear concerns about the implications for the safety of users.
“We also see from X Corp.’s responses to our questions that the reduction in safety staff coincided with slower response times when users reported online hate to the platform.
"Response times to hateful tweets have slowed by 20% since the acquisition and response times to hateful direct messages have slowed by 75%, with users not receiving a response for up to 28 hours.
“We know that online abuse is frequently targeted at victims via services’ direct message features, with clear intent to cause harm.
“Loss of local staff in Australia also limits the potential for engaging local communities disproportionately impacted by online hate. A recent eSafety study found that First Nations youth are three times more likely to experience hate speech online than their non-Indigenous counterparts.”
X Corp. also stated that it had no full-time staff specifically dedicated to hateful conduct issues globally, and no specific team for this policy during the period covered by the notice, although it said broader teams worked on these issues.
When eSafety asked what tools were used to detect volumetric attacks or "pile-ons" in breach of Twitter's targeted harassment policy, X Corp. stated that no tools were specifically designed to detect this type of abuse on the service.
"I liken these attacks to someone trying to swat individual bees when they are engulfed by a killer swarm. It can feel quite overwhelming and be very damaging for the target,” Inman Grant said.