Matt Smith's End of the Spectrum

Melbourne-based writer and journalist. Purveyor of finely crafted radio plays. A Muppet of a man.

A failure to communicate


Researchers at Facebook have released a study declaring they've discovered how to make users feel happier or sadder. The results were published in a scientific paper stating that Facebook researchers manipulated nearly 700,000 user news feeds, selecting content based on mood to see if that had an effect on what the users then posted.

“When positive expressions [in the newsfeed] were reduced, people produced fewer positive posts and more negative posts,” Facebook data scientist Adam Kramer explained from his secret volcano lair while stroking a white cat. “When negative expressions were reduced, the opposite pattern occurred.”

It’s new territory for Facebook, which in the past has limited itself to figuring out how to sell your data and how to sell you things. In the face of a high level of public anger, Kramer took to his Facebook page with a public post:

“The goal of all of our research at Facebook is to learn how to provide a better service… our goal was never to upset anyone. In hindsight, the research benefits of the paper may not have justified all of this anxiety.”

Hindsight is a wonderfully understated term in this case, and the statement shows that Facebook may now be, ironically, worried about its users’ emotional reaction to the study.

Facebook has a disclaimer which you can find in the small print. It allows the company to, and I quote, use your data “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.” Note that small, unobtrusive word ‘research’, which you’d ironically need to do plenty of to find its lone mention amongst the legalese.

This passing mention of research wasn’t in place until four months after the study took place. Facebook released a statement glossing over this fudging of the fine print, arguing that the wording was never needed in the first place, and that manipulating its users to be happy or sad is completely acceptable: “Companies that want to improve their services use the information their customers provide, whether their privacy policy uses the word ‘research’ or not.”

It shows little regard for users as anything beyond playthings. Should we be able to decide our own feelings without Facebook deliberately pointing us in a direction?

The lack of informed consent, no matter what box you ticked or how legally watertight the actions are, speaks of an arrogance that can only come from a service that knows it’s integral to the lives of a lot of its users.

If I didn’t have Facebook, for example, I wouldn’t be able to share in what my old schoolmate’s partner is eating for breakfast. I’d have no idea how to go about inviting a bunch of people to an event which they’ll ‘maybe’ decide to turn up to. How would I share photos of my cat? My cat deserves to be seen, and without Facebook this wouldn’t happen.

While I can’t blame this solely on Facebook (other social media platforms and SMS texting can also take the credit) it is by far the worst culprit, and garners the most suspicion as to what it’s doing with your private data. They’ve proven they can’t be trusted with anything you give them, and yet we give it to them anyway – all in the name of being ‘social’ and ‘sharing’. We’ve just lost sight of who we’re sharing this with.

Everything you type into Facebook, everything you post, everything you like, belongs to the company and is for sale. Whatever your eyes fall on is also for sale to advertisers, so anything you like or are a fan of can be pushed to your friends with ‘your endorsement’ attached. We’re all unpaid marketers for Facebook, wearing a virtual sandwich board that they can put to use. It’s happened so gradually that we’ve barely even noticed.

Ultimately we should be asking ourselves how far we can be pushed. Should we be picking up the phone more often, or seeing friends and relatives in person? If the answer is no, are you really missing out on much in the first place?

Now that Facebook has shown willingness to tamper with your newsfeed and manipulate your emotions, there’s no telling where they’ll go. Say your friend from high school liked a post from a restaurant two years ago. Maybe you’ll now see an ad from them as well. And maybe Facebook will try to manipulate you so you’re feeling happy when you see it.

Considering Facebook scientists conducted this study back in 2012, it’s fair to assume that they’ve been doing this kind of thing for years and we’re none the wiser. Maybe they’re sitting back in their volcano lair stroking a white cat and laughing at us, their little lab rats, as we click ‘like’ and give them data to sell.

As long as they take a photo of their white cat and post it, I bet we’ll have no problem with it at all.




This entry was posted on July 5, 2014 in lifestyle and technology.