Did you just like or comment on a Facebook post about the COVID-19 pandemic? Facebook will let you know "soon" if you've spread false coronavirus information.
The social media giant said it will begin alerting users who have liked, shared, or commented on harmful COVID-19 misinformation on its platform in the next few weeks.
Users who have interacted with such posts will receive messages directing them to COVID-19 myths debunked by the World Health Organization.
Facebook to redirect users to official websites for fact-checking
Facebook and other platforms have already taken steps to stem the wave of dangerous misinformation that has spread alongside the coronavirus.
Facebook has banned bogus ads promising coronavirus treatments or cures. No such cures exist, and there is no vaccine yet, although there is an international race to develop one.
In a blog post, Facebook said it wants to connect users who interacted with fake posts about the virus with authoritative sources, in case they see or hear these claims again.
However, Facebook's vice president for integrity, Guy Rosen, said the social media giant will not tell users which posts they interacted with or what was incorrect about them.
Founder and CEO Mark Zuckerberg said the company had removed hundreds of thousands of pieces of content related to the virus that fact-checkers had declared false. These included myths and hoaxes claiming that drinking bleach could prevent COVID-19 or that social distancing was ineffective.
"If a piece of content contains harmful misinformation that could lead to imminent physical harm, then we'll take it down," Zuckerberg wrote in a post.
Facebook, according to the BBC, said it is continuing to expand its multilingual network of fact-checkers, issuing grants and partnering with trusted organizations covering more than 50 languages.
Facebook launches new fact-checking feature
Facebook has also introduced a new feature that surfaces fact-checked articles from the company's fact-checking partners, currently available to people in the United States.
The social media platform also shared new details in its blog post about its fight against coronavirus misinformation. The company said it displayed warnings on 40 million COVID-19-related posts in March, based on 4,000 articles reviewed by its fact-checking partners.
"When people saw those warning labels, 95% of the time, they did not go on to view the original content," Rosen wrote.
Social media companies have struggled to fight misinformation about the coronavirus ever since the outbreak began. Hoaxes and myths claiming to cure the coronavirus and blaming religious minorities for spreading the disease have thrived on platforms like Facebook, Twitter, and YouTube over the last few months.
Earlier this month, WhatsApp, the popular instant messaging app owned by Facebook, announced stricter limits on forwarded messages. Forwarding, according to BuzzFeed, is a popular way for coronavirus misinformation to spread among the app's 2 billion users.