If you've ever looked at ads and suggested pages that Facebook recommends for you, you may have thought, "Is that what Facebook really thinks of me?" or "Man, Facebook knows me so well." Either way, you know that the social network is watching what you put in your profile, what you say in your statuses and what posts you "Like" to get some sense of the kind of user you are.

Most of us probably choose to ignore the fact that this is happening. Facebook has become such a part of our daily routines that we don't stop to think, "If I like my friend's new profile pic, how will that affect what I see on Facebook?" However, when Facebook's News Feed experiment made headlines this past June, people took notice. The revelation that Facebook had deliberately altered the News Feeds of nearly 700,000 users, removing positive or negative posts to see how it would affect their behavior, left many people feeling a bit icky.

The general response to that secret experiment prompted Facebook to issue an apology of sorts. Chief Technology Officer Mike Schroepfer addressed the matter in a blog post on Oct. 2.

"Although this subject matter was important to research, we were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism," Schroepfer wrote. "It is clear now that there are things we should have done differently."

Well, that was nice of Facebook to finally comment on the incident three months later, wasn't it? However, just because Facebook recognized that it was in the wrong for the way it carried out that experiment doesn't mean it's going to stop doing research on you. Oh no.

The social network is simply going to implement a new framework for how it conducts and publishes research. The changes include clearer guidelines for researchers; an "enhanced review process" for work involving specific groups, populations or "deeply personal" content; training on research practices for new engineers; and a new website dedicated solely to Facebook's published academic research.

Schroepfer wrote that it's important for Facebook to interact with the academic community on matters of technological innovation because online services like Facebook can help us better understand how the world works.

But what Schroepfer failed to mention is how this research helps Facebook's advertisers earn more money, and, in turn, helps Facebook earn more money. As Wired pointed out, Facebook will continue to conduct these experiments because its pool of test subjects is so large and diverse that it gives the company a competitive advantage and makes it a researcher's dream. Plus, Facebook reported record revenues and profits for its most recent quarter after news of the research broke, and the upcoming quarter's financials are expected to be strong again. All of that is simply too attractive for Facebook to walk away from by ending its experiments.

However, it's research like this, along with some of Facebook's other policies, that has driven some users to seek refuge in the new, supposedly freer social network Ello. Clearly, the tens of thousands of people signing up for invites to this mysterious platform don't yet pose a big enough threat for Facebook to completely change how it does business. We'll have to wait and see whether Ello really does become the next big thing, and how Facebook responds if it does.

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.