Facebook has declared that the claim that fake news published on its platform influenced the U.S. election is crazy. That position, however, has come under the spotlight after reports revealed how Facebook may have effectively allowed fake news to spread during the campaign season.
A Tale Of Two Fixes
Reports indicate Facebook already had two solutions to fake news last year. One was an algorithmic machine learning detector, which was eventually implemented. The other was a News Feed update that could have shut down fake news for good, but it never saw the light of day. The reason for that decision is where things get curious.
Citing inside sources, Gizmodo reported that the second update was shelved because the company feared a conservative backlash. It appears that the fake news update would have disproportionately impacted right-wing news sites, and Facebook would not hear of it.
That report has been corroborated by other anonymous sources who have recently spoken to news outlets about the episode. Tech Times has already reported on internal strife at Facebook during the heat of the presidential campaign, which concerned the debate over whether to allow Donald Trump's incendiary Facebook posts as well as the loosening of the company's censorship policy.
Specific to the fake news affair, a number of Facebook employees told The New York Times on Nov. 12 that these developments had paralyzed Facebook's willingness to make any serious changes to its products. Gizmodo's sources underscored this with a surprising claim that there was an "internal culture of fear." It is not yet clear whether that culture persists.
What critics call conservative bias in Facebook's handling of hoaxes appears to have been spurred by news reports last May detailing how the social network's Trending News team allegedly suppressed conservative content.
Facebook CEO Mark Zuckerberg met with conservative leaders after the incident to reassure them of Facebook's commitment to fairness. The company also fired its Trending News team. Information about the team's replacement was never made public, but it has since been blamed for the publication and promotion of several right-wing fake news stories.
Facebook Fake News Response
Facing scrutiny after the elections, Facebook recently responded that it is now taking more steps to address fake news. This was punctuated by a statement Zuckerberg recently posted on his Facebook account. What is clear from these pronouncements so far, however, is the focus on one mechanism that passes responsibility to users: a self-reporting tool that allows them to flag fake news.
"We can't read everything and check everything," Adam Mosseri, head of Facebook's news feed, said. "So what we've done is we've allowed people to mark things as false. We rely heavily on the community to report content."
If true, however, the report about the shelved News Feed update that could have zeroed in on fake news and stopped it in its tracks soundly debunks Mosseri's claim.
Facebook has refuted Gizmodo's report. In a recent TechCrunch report, the company acknowledged that it did indeed develop two clickbait update options this year, but it claims the machine learning clickbait detector was chosen over a hoax detector based on user reports because it was more effective. The company, however, will not release further details about the volume of fake news and hoaxes.
"A bias towards truth isn't an impossible goal," Bobby Goodlatte, a former Facebook product designer, weighed in. "But it's now clear that democracy suffers if our news environment incentivizes b-------."