A series of reports on Facebook's rules and policies for content moderation has just been published, offering an in-depth look at how the company handles content shared on its site and how it determines what users can post.

Overall, the leaked documents show that Facebook has to deal with an enormous influx of content, and that the rules by which it moderates can at times be difficult for the moderators themselves to apply consistently.

Facebook's Rules For Content Moderation

The document dump comes courtesy of The Guardian, which published the series of reports on Sunday, May 21. The whole series, called the Facebook Files, makes for a compelling read for anyone curious about how the company handles threats of violence, non-sexual child abuse, graphic violence, sexual content, terrorism, and cruelty to animals.

The Guardian says it has reviewed over 100 documents pertaining to moderation, including manuals, spreadsheets, and flowcharts. These documents help Facebook's moderators deal with reported content.

At their core, Facebook's internal moderation manuals have a simple goal: to uphold free speech while also avoiding real-world harm. The company leverages automated systems to remove some content, but human moderators handle the rest.

Some moderators find it difficult to determine whether a particular piece of content violates Facebook's rules, especially since some policies are clear-cut while others are more nuanced, requiring more effort to filter through an enormous array of reports and recognize fine distinctions.

Facebook's Manual For Moderating Graphic Violence

Take Facebook's internal manual for graphic violence, for instance. Moderators are tasked with removing content "upon report only," meaning graphic content, be it of a sexual or violent nature or both, could be seen by a great number of people before it's eventually removed, and, crucially, only if someone actually reports it.

Sadism is one area that's particularly complex. Photos and videos that treat sadism as a subject can pass, but ones that express it can't. In other words, a video that explains what sadism is and a user posting "I love seeing how much pain he's in" are treated as two entirely different things.

Facebook's manual for credible threats of violence, on the other hand, also contains nuances that moderators have to pick up on. The document notes that, as in real-life communication, people online often express threats of violence in "facetious and unserious ways." It also highlights what sort of statements are passable, such as "I'll kill you, John!," and what's not, such as "I'll kill you, John! I have the perfect knife to do it."

The guidelines task moderators with distinguishing tongue-in-cheek posts from genuine threats; for instance, posts that specify a method or timing take priority over general, blanket-statement threats.

To drive the point home, the nuances of moderation also bleed over into the manual on animal abuse. Like the sadistic posts mentioned above, photos and videos documenting animal abuse are allowed, if only to raise awareness, though those deemed too disturbing are marked as such. Yet one slide is a bit perplexing: it advises moderators to ignore content depicting animal abuse when it's a photo but to mark it as disturbing when it's a video. The photo in question, it's worth noting, shows a man violently pulling a dog by the leash.

Facebook And Its Moderators

The Guardian notes that moderators review millions of reported posts. That number is enormous, and it's no surprise that the reviewers "often feel overwhelmed by the number of posts they have to review." Needless to say, they sometimes get it wrong, especially in areas where nuance is involved, such as permissible sexual content.

That said, Facebook recently announced that it's adding 3,000 more people to its community operations team to review reports. How much of a dent the extra staff will make in the influx of reports is uncertain, but it's a step forward, considering the string of violent and highly controversial content, such as a man livestreaming his murder spree, that was shared on Facebook and took a long time to be taken down.
