Content moderation is the biggest issue that Facebook is facing at the moment. From the sharing of fake news to hate groups using the platform to recruit new members, Facebook is having a tough time applying its rules evenly.
To combat hate speech on its platform, Facebook hired an army of 7,500 content moderators. The problem is that those moderators aren't applying the rules consistently.
An investigation by ProPublica found that Facebook's attempts to remove hateful content haven't worked as intended. This may be due to Facebook's constantly changing guidelines, which must be applied uniformly across many countries and languages.
ProPublica gathered user-submitted examples of content that Facebook failed to moderate and ultimately sent 49 samples to Facebook. Facebook acknowledged that it made mistakes on 22 of those posts. Six were not taken down because users flagged them incorrectly or the author deleted the post. For 19 of the posts, Facebook defended its decisions, and for the last two it said it didn't have enough information to decide.
Back in June, ProPublica found that Facebook's rules don't protect everyone equally. Under its guidelines, "white men" were protected but "black children" were not, because age wasn't a protected category. Facebook changed this after the investigation.
Facebook apologized for its mistakes in moderating these posts.
"We're sorry for the mistakes we have made," said Facebook Vice President Justin Osofsky in a statement. "We must do better."
Osofsky pledged that Facebook would double its safety and security team to 20,000 in 2018. The company made a similar move in May, after a man posted a live video of himself murdering another man.
He mentioned that Facebook removes about 66,000 posts of designated hate speech every week, adding that not everything that seems offensive qualifies as hate speech.
"Our policies allow content that may be controversial and at times even distasteful, but it does not cross the line into hate speech," he said. "This may include criticism of public figures, religions, professions, and political ideologies."
Facebook's problem with moderating content is a pressing issue for the company. Relying on human judgment to decide what is or isn't offensive makes it hard to enforce the rules uniformly. Doubling the number of moderators will help, but the harder problem is deciding what counts as offensive in the first place.