Facebook has just released figures showing how the social media company is acting against people who produce inappropriate content or create fake accounts.

The Community Standards Enforcement Report

On Tuesday, May 15, Guy Rosen, Facebook's Vice President of Product Management, published a blog post in the company's newsroom announcing the release of the Community Standards Enforcement Report. Rosen said he hoped the public would read the report to see how the company is working to curb a wide range of harmful activity.

Facebook revealed that the Community Standards Enforcement Report covers suspicious activity that took place from October 2017 to March 2018. The social media company targeted accounts that produced inappropriate content across several categories, including graphic violence, terrorist propaganda, and hate speech.

The Facebook Figures

Rosen continued to reveal that Facebook removed 583 million fake accounts between January and March of this year, noting that most were disabled within minutes of registration. Facebook also took down more than 837 million pieces of spam during 2018's first quarter, along with 21 million pieces of adult nudity and 3.5 million pieces of violent content.

AI's Role

Facebook stated that artificial intelligence has played an essential role in helping the social media company flag content. The social media network's CEO, Mark Zuckerberg, published a post on his Facebook page crediting AI's role in taking down the material, but he added that it was still hard to detect variations of hate speech.

"AI still needs to get better before we can use it to effectively remove more linguistically nuanced issues like hate speech in different languages, but we're working on it," said Zuckerberg to CNet.

Facebook's Global Adventures

Facebook representatives are expected to travel across the globe to participate in summits in different cities. Several members of the Facebook leadership team are in Paris to discuss enforcing community standards and removing bad content.

Representatives will also visit Oxford, England on Wednesday, May 16 and Berlin on May 17. Facebook is expected to host summits in India, Singapore, and the U.S.

Removing 'Suspicious' Apps

Zuckerberg's statement and Rosen's blog post come one day after Ime Archibong, Facebook's Vice President of Product Partnerships, revealed that Facebook would remove apps that leaked user data. Through the company's two-part investigation, it found around 200 apps that potentially leaked confidential data. Archibong added that the company would notify users if their data were compromised.

The Live Rewind

On May 8, Facebook announced that it is testing a new feature that might help users who missed crucial moments of a Facebook Live broadcast. The Live Rewind button would not only allow users to rewind the video but also help organizations share live video simultaneously on multiple pages through Facebook's crossposting feature.

Tech Times reached out to Facebook for a comment on this story.
