True Story: Facebook Tried To Control Your Emotions By Manipulating Your News Feed

As if you don’t have enough reasons to distrust Facebook, here’s one more: they intentionally manipulated the positive and negative content in users’ News Feeds to see if they could influence those users’ emotions. Wow.

There is a lot of coverage of this story on the web, but here’s the very short version of the official paper: In 2012, Facebook analyzed the posts in the News Feeds of about 700,000 users for one week, and slightly increased or decreased how often certain stories appeared, based on whether those posts were “positive” or “negative”. Then they measured the posts made by the people whose News Feeds they had manipulated, to see whether the slightly more positive or negative posts in their feeds made them post more positive or negative content themselves.
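To make the mechanics concrete, here is a toy sketch of the kind of sentiment-based feed filtering the study describes. Everything here is invented for illustration — the word lists, the filtering rule, and the function names are mine, not Facebook’s; the real study reportedly classified posts with LIWC-style word counting, and the actual News Feed system is not public.

```python
# Toy illustration of sentiment-based feed filtering.
# Word lists and the suppression rule are made up for this sketch;
# they do NOT reflect Facebook's actual (non-public) system.
import random

POSITIVE = {"happy", "great", "love", "wonderful"}
NEGATIVE = {"sad", "angry", "terrible", "awful"}

def sentiment(post):
    """Crude word-list classifier: positive, negative, or neutral."""
    words = set(post.lower().split())
    if words & POSITIVE:
        return "positive"
    if words & NEGATIVE:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress, rate=0.1, rng=random):
    """Drop each post of the suppressed sentiment with probability `rate`."""
    return [p for p in posts
            if sentiment(p) != suppress or rng.random() >= rate]

feed = ["I love this sunny day", "Traffic was terrible", "Meeting at 3pm"]
print(filter_feed(feed, suppress="negative", rate=1.0))
# With rate=1.0, every negative post is removed:
# ['I love this sunny day', 'Meeting at 3pm']
```

The point of the sketch is how small the lever is: a probabilistic drop rate of a few percent on one sentiment class is enough to tilt the overall tone of a feed, which is roughly the scale of manipulation the study used.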

Can the positivity or negativity of posts on Facebook affect whether your posts become positive or negative? Apparently, according to the study: Yes.

Creepy. Not just the results, but the fact that Facebook would even do a study like this. Right?

Many sites have pointed out potential ethical issues in this experiment, questioned whether informed consent should have been required, and asked if a site like Facebook should be doing experiments like this at all. Here is my take on what matters in this latest Facebook PR mess:

It’s Probably Not That Bad. Kind Of.

As Adam Kramer (one of the study authors) points out in a public post, the actual user impact of this was probably pretty insignificant. It’s highly unlikely that anyone suffered emotional trauma or was truly harmed. It was a relatively small sample, over a short period, and the level of manipulation was very slight. In my personal opinion, some people are inflating the study’s actual emotional impact on users. The study itself seems fairly harmless, really.

However, the important points to remember and consider are:

Facebook Dictates What You See

Facebook decides what you do and don’t see from your friends, family, and Pages you are interested in. True, you give them that power by clicking a checkbox when you sign up. But is it really working for you? Do you like it that way? Are you comfortable with Facebook deciding what you do and don’t see?

Most users want a simple, unfiltered, chronological News Feed so they can browse through and see everything. But Facebook doesn’t want you to have that. A big part of the reason is so they can charge money to businesses that want to make sure their posts appear. But another reason is so they can run experiments like this one to figure out how people behave. Facebook is the laboratory, and you are the lab rat. When you think of it that way, it’s a bit less fun, isn’t it?

A Slippery Slope To “Censorship”?

Facebook says it wants to show you the “highest quality” content, and the posts you are most interested in. But it doesn’t let you decide what is important. It doesn’t even let you have a vote. It attempts to infer your interests by spying on everything you do – down to the level of watching where your mouse moves, when you stop scrolling, and which links you click on. Even if you never Like or Comment on anything, Facebook is watching you closely to figure out how to sell advertising to you.

Recently Facebook announced that it determined “meme pictures” were low quality content, and they would appear less often. What about people who love memes? Do they get a vote?

Fast forward a few years, and what will Facebook determine is “low quality” content? Posts containing a religious message? Posts containing anti-war sentiments? Posts criticizing the government? Posts about controversial topics like abortion? Posts about an obscure hobby you enjoy? Posts about topics you are interested in but aren’t mainstream?

Facebook has demonstrated a subtle but real ability to manipulate how people feel and behave by changing what they are exposed to. It’s not a huge leap to imagine it pushing a political, social, religious, or cultural agenda and shaping public thought by manipulating what the billion-plus people using the site see every day.

It’s like a huge, worldwide, interactive TV broadcast, and the only one in charge of what content you see is Facebook itself. Are you comfortable with this level of influence? Are we giving them too much power in our lives? I put “censorship” in quotes on purpose because this isn’t the government restricting free speech, and everyone is free to use Facebook or not. But they are limiting what you see, and you may not even realize it.

Conclusion: Understand The Product You Are Using

In the end, Facebook is a very useful and entertaining service. But because it has such an overwhelming influence on people and is so dominant on the web, it’s important that users understand what they are using and how it could work against them.

Understand that you are not seeing everything you could see, and that Facebook is deciding what it thinks you should see. An unknown algorithm is ranking content and controlling what you are exposed to on a daily basis. If you don’t like that, you should speak up. Facebook makes money by advertising to users. If you get sick of the site and leave, they will lose money. It’s in their best financial interest to keep you. In theory.

Understand that you are not the customer, you are the product being sold. Your eyeballs are being sold to the highest-bidding advertisers, and Facebook’s primary focus is to learn as much about you as it can so it can serve you the ads you are most likely to click on. Perhaps its goal is not just to learn things about you, but to influence what you think, how you feel, and what products you like, so it can turn you over to companies who will gladly take your money. If it can slightly manipulate emotions, who is to say it won’t slightly nudge public sentiment in favor of a brand that is a big advertiser?

Finally, understand that social media can affect how you feel about the world and yourself, even when no one is deliberately manipulating the News Feed. Be careful with how you use it, and the power that you give it in your life.

What Do You Think?

Did Facebook go too far? Does this creep you out, or could you not care less? Comment here or on this post on Facebook and let your opinion be heard.

Matt Kruse, developer of Social Fixer, a browser extension that makes Facebook better.