Facebook manipulates News Feeds of 700K users in psychology experiment. Creepy.

By Lori Sandoval, Tech Times | June 29, 11:42 PM


Facebook conducted a manipulation experiment involving hundreds of thousands of users—without their knowledge. But an expert says there is no legal or ethical issue with what Facebook did.
(Photo : Dimitris Kalogeropoylos)

Over 700,000 Facebook users became the subjects of a seven-day experiment run by the social network in January 2012, part of its effort to understand the effects of negative and positive posts in users' News Feeds, according to a recent study that appeared in the journal Proceedings of the National Academy of Sciences.

The experiment ran from January 11 to 18, with participants picked at random according to their User IDs. Participants were users who viewed Facebook in English. More than three million status updates were analyzed, containing more than 122 million words, of which four million were regarded as positive and 1.8 million as negative.
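The study classified posts by counting emotion words in each status update (the researchers used established sentiment word lists; the tiny lists below are hypothetical stand-ins, and the one-label-per-post rule is a simplification for illustration). A minimal sketch of that kind of tally:

```python
# Illustrative sketch of word-count sentiment tagging.
# POSITIVE/NEGATIVE are hypothetical stand-ins for the much larger
# word lists used in research of this kind.
POSITIVE = {"happy", "great", "love", "awesome"}
NEGATIVE = {"sad", "angry", "terrible", "hate"}

def classify(update: str) -> str:
    """Label a status update by which emotion words it contains."""
    words = [w.strip(".,!?") for w in update.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos and not neg:
        return "positive"
    if neg and not pos:
        return "negative"
    return "neutral"

# Tally labels across a small batch of example updates.
updates = [
    "Feeling happy today, life is great!",
    "So sad and angry about the news.",
    "Posting a photo of my lunch.",
]
counts = {}
for u in updates:
    label = classify(u)
    counts[label] = counts.get(label, 0) + 1
print(counts)  # {'positive': 1, 'negative': 1, 'neutral': 1}
```

Scaled to millions of updates, a tally like this yields figures of the kind the study reports (four million positive words, 1.8 million negative).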

The study, "Experimental evidence of massive-scale emotional contagion through social networks," indicates that Facebook posts—whether good or bad—have an impact on users.

“Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness,” the study's authors wrote.

What's being questioned, however, is that Facebook manipulated the extent to which users were exposed to emotional expressions in their News Feeds, influencing the kinds of posts users saw from their friends. The manipulation was computer-generated, yet critics were alarmed by the idea that the social network can do this at all.

"I was concerned, until I queried the authors and they said their local institutional review board had approved it—and apparently on the grounds that Facebook apparently manipulates people's News Feeds all the time... I understand why people have concerns. I think their beef is with Facebook, really, not the research,” Susan Fiske, psychology professor at Princeton University, said to Business Insider in a phone interview.

Facebook also monitored users' responses to see whether their friends' moods affected them as well, such as through a change in their own posting behavior. It also observed a withdrawal effect among users.

“People who were exposed to fewer emotional posts (of either valence) in their News Feed were less expressive overall on the following days, addressing the question about how emotional expression affects social engagement online,” the study said.

This result contradicts theories suggesting that viewing friends' positive statuses may, in some way, affect users negatively. Instead, a decrease in positive expressions led people to produce fewer positive statuses and more negative ones, while a decrease in negative expressions produced fewer negative posts and more positive ones.

Still, others said being subjected to such an undisclosed experiment raises issues of ethics and, again, of privacy, but Fiske said that isn't so.

"A lot of the regulation of research ethics hinges on government supported research, and of course Facebook's research is not government supported, so they're not obligated by any laws or regulations to abide by the standards," Fiske said.

Moreover, Facebook can conduct such studies because users agree to its terms and conditions as soon as they register on the social networking site.

© 2014 Tech Times, All rights reserved. Do not reproduce without permission.
