Who should we blame when misinformation pops up on a social media platform like Facebook? The person reading the fake news, the one who originally posted it, or Facebook for allowing the post? Amid the pandemic and the upcoming 2020 US elections, here's how Facebook is trying to steer itself toward being a reliable social media platform.

Facebook's "dos and dont's" in removing a fake post

We all have seen a fake Facebook post. The usual red flags are an unreliable source or an intriguing yet not quite believable topic.

As time goes by, however, these red flags no longer work. Most misinformation posts manage to make their articles or videos sound reliable.

When this happens, how can you make sure that what you're reading or watching is still legitimate?

On Monday, Sept. 21, Facebook released a public PDF file outlining its steps to curb misinformation on social media without violating anyone's right to freedom of speech.

The file, titled "Facebook's submission to the Senate Select Committee on Foreign Interference through Social Media," presents how the platform identifies which groups or posts to take down amid the pandemic and the upcoming 2020 US elections.

According to the document, highlighted by ZDNet, Facebook has changed a lot of things to work its way toward being a reliable platform.

COVID-19 myths

The one solution that has probably worked best against COVID-19 myths is Facebook's fact-checking tool.

If you look at the platform nowadays, it consistently offers fact-checking prompts. These let users identify whether what they have read, or are about to read, is legitimate information or not.

The Coronavirus Information Center directs Facebook users to a reliable WHO or government site so they can fact-check the content of a shared post.

WhatsApp sharing limit

Facebook's sibling app WhatsApp has also created a way to stop misinformation.

In April, WhatsApp added a series of user protection layers that identify messages forwarded multiple times a day and limit how widely they can be shared.

Messenger also follows this scheme. 

One thing they should have done

Though we're grateful that Facebook has been doing a lot for the platform, there's one thing it still isn't doing.

That is removing groups outright as it sees fit, according to ZDNet.

You see, Facebook only makes anti-vaccination or anti-quarantine groups, which share most of the fake news, disappear from suggested posts and the public feed.

The groups reportedly were not being removed effectively. 

Could they have done something better?



Written by Jamie Pancho 
