During the 2020 presidential election, conservative users on Facebook were more ideologically isolated and encountered more misinformation than liberal users. However, Facebook's influence on the political content users saw varied widely across different groups.

(Photo: OLIVIER DOULIERY/AFP via Getty Images)
In this photo illustration, a Facebook feed page is displayed on a smartphone on March 25, 2020, in Arlington, Virginia.

Experiencing False News and Disinformation on Facebook

In a recent study, researchers made an intriguing discovery about the behavior of conservatives and liberals on Facebook during the 2020 US presidential election.

According to NBC News, the study, part of an extensive research project investigating Facebook's influence on democracy, found that conservatives were more likely to engage with false news stories than liberals.

Analyzing data from 208 million US users, researchers observed that "conservative audiences preferred untrustworthy" news sources. Moreover, nearly all (97%) of the political news webpages flagged as false by Meta's third-party fact-checkers were seen by more conservatives than liberals.

Sandra González-Bailón, a professor at the University of Pennsylvania's Annenberg School for Communication, said that Facebook's pages and groups played a more significant role in fostering ideological segregation and polarization than users' friends did.

She emphasized that unraveling these feedback loops with observational data is highly challenging and calls for further research.


Conducting the Study

Bloomberg reported that this effort stands out from previous academic research on Facebook because of its massive scale and its data source: Meta itself.

Unlike earlier studies, which relied on various data collection methods that often involved scraping the platform, this project offers higher data quality and reliability.

To gain insights into the impact of social media, Meta Platforms Inc. provided internal data to 17 independent researchers from prestigious institutions like New York University, the University of Texas at Austin, and several other academic centers. 

This collaboration aims to shed light on the role of social media platforms in American democracy and understand their effects on public engagement and information dissemination during critical political events. 

Meta did not provide payment to the independent researchers involved in the project. The social media company committed not to reject research questions except for privacy or logistical concerns.

Meta relinquished any authority to restrict or censor the researchers' final findings, and an independent group of researchers oversaw the process to ensure transparency throughout the collaboration.

Talia Stroud, a University of Texas professor and a lead researcher on the project, noted that these findings should prompt everyone, including policymakers, to approach potential solutions with caution. The issue's complexity requires careful consideration, and there are no straightforward answers.

Stroud added that it is now evident that the algorithm plays a significant role in shaping users' experiences on the platform. However, the researchers acknowledged that altering the algorithm for a short period is unlikely to bring about notable changes in people's political attitudes. 


Written by Inno Flores
