Elon Musk's X reportedly has only 2,294 content moderators, thousands fewer than the 16,974 at Google's YouTube, 7,319 at Google Play, and 6,125 at TikTok, according to figures cited by the European Union.

Reuters reports that these numbers are based on reports the social media platforms submitted last September in accordance with the EU's Digital Services Act (DSA), which requires the companies to do more to tackle illegal and harmful content.


According to an anonymous source, regulators expect that X will feel compelled to increase its number of content moderators to keep up with its competitors: "There is an important aspect of the DSA, and that is peer pressure."

The report notes that concerns have been raised about X after Musk recently let go of numerous people in charge of monitoring and moderating content, despite the rise of misinformation on the platform.

Read Also: Researchers No Longer Study X, Fearing Elon Musk's Legal Actions, High API Fees 

X Filled With Misinformation

Tech Times previously reported that due to an "unprecedented" amount of fake news, outdated footage, video game clips, and phony links that seemed authentic, X users ranging from journalists to fact-checkers struggled to verify information about the ongoing Israel-Hamas war. 

At the start of the Israel-Hamas war, a massive quantity of misinformation spread as violent footage of military operations and kidnappings went viral on social networking sites like X. 

On the other hand, X announced back in August that it would be expanding its content moderation team ahead of the 2024 elections.

The New York Post reported that X, which laid off 80% of its former personnel when Elon Musk took it private, has already hired a handful of new staffers and is hiring dozens more to boost policing of false posts on the site.

Insiders say the goal is to expand X's so-called "Trust and Safety" team, which monitors misinformation, a significant concern for the 2024 election given the increasing sophistication of artificial intelligence and its capacity to produce deepfakes.

Bringing back content moderators is an important aspect of the strategy, with full-time positions available, including a head of civic integrity. According to insiders, X will also seek electoral assistance from contractors.

X's Former Employees' Fears, Now a Reality

Elon Musk's sweeping layoffs of X's content moderators began in November 2022. The Associated Press reported at the time that the process proved less than ideal, with former employees warning of the consequences of reducing X's content moderation team, consequences that have now been fully realized.

Melissa Ingle, a contractor at Twitter for more than a year, was one of many contractors who reported they were let go that Saturday. She expressed fear that the number of personnel departing might lead to an increase in harassment on Twitter.

"I love the platform, and I had a great time working at the company and trying to improve it." "And I'm just really afraid of what will fall through the cracks," she remarked on Sunday. 

Sarah Roberts, an associate professor at the University of California, Los Angeles, and a content-moderation specialist who worked as a staff researcher at Twitter in 2022, reportedly said she believes cutting contracted content moderation employees will have a "tangible impact on the experience of the platform."

Related Article: X Filled With Disinformation and Fake News About Israel-Hamas War 

Written by Aldohn Domingo

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.