Meta's special-track content review platform for VIP individuals and businesses is facing harsh criticism from the company's own Oversight Board, which has expressed concern about its lack of neutrality and its potential to cause more harm than good.

Meta's Oversight Board Is Criticizing Facebook for Its Approach to Offensive Content

According to a report issued by the independent Oversight Board, the program, dubbed "cross-check" or XCheck, offers certain users greater protection from the platform's rules. It delays enforcement actions and allows potentially offensive content to remain on the site for hours, and Meta has presented no evidence that the special-track system is any more effective than its standard content moderation process.

In 2020, Meta, then known as Facebook, established the Oversight Board to review contentious moderation decisions, such as the ban of former President Donald Trump. Following an exposé by The Wall Street Journal, the board examined the special VIP review program and criticized it for giving preferential treatment to certain users.

Unlike Twitter, Meta Has Its Own VIP Program Offering Special Protection

According to the story by CNBC, the board pointed out that the program lacked language and regional expertise. Meta conceded that it has a "hard-coded" process that makes automatic exemptions for a preselected group of content policy violations.

Meanwhile, Meta rival Twitter is grappling with its own issues in the wake of Elon Musk's acquisition of the social media platform. However, one significant difference is the lack of any similar VIP program offering special protection from content moderation rules.

The Difference Between Regular Users and VIPs

Overall, the Oversight Board report served as an indictment of Meta's special-track content review platform, which appears to prioritize the company's interests over protecting safe and fair speech. Given the disparate treatment of regular users compared to VIPs, it remains to be seen how Meta will address the issues raised in the board's report.

Meta has since enacted a major shift in how it reviews content for the general public. After media reporting on the matter began, the company improved its existing review process, adding an automated system that triages content before routing it to the appropriate channels for deeper inspection.

Meta's Oversight Board Issued 27 Recommendations Aimed at the Company's Operations

If further review is needed, Meta's vetted employees or contractors inspect the content and bring it to the attention of the sole decision-maker: the Early Response Team. This team is responsible for issuing warnings, suspending accounts, or even banning users when necessary.

In the wake of this significant change, Meta's Oversight Board issued a set of 27 recommendations aimed at the company's operations. The first was the bifurcation of the content review process, effectively splitting it into two streams: one to protect human rights and one to prioritize business interests.

Other recommendations included firewalling government relations from content moderation, laying out clear criteria for inclusion on cross-check lists, and expanding appeal processes.
