
Facebook Experiments: The Ick Factor

Facebook is facing a storm of criticism after an academic paper revealed that they conducted an experiment, in conjunction with researchers at Cornell University and the University of California, San Francisco, into how users' moods are affected by the posts of their friends.

Over the course of a week, the Facebook news feeds of 689,003 users were tweaked so that stories were prioritised based on the presence of keywords associated with positive or negative feelings. The users’ own subsequent posts were then analysed for positive or negative feelings in an effort to establish how “emotional contagion” occurs in social networks.
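To make the mechanism a little more concrete, here is a minimal sketch in Python of how posts might be scored for emotional tone by counting positive and negative keywords and then reprioritised. The word lists and sample posts are made up for illustration; the actual study used established word-count tools, not this toy scorer.

    # Illustrative sketch only: a toy keyword-based tone score, not the
    # word lists or ranking machinery the study actually used.
    POSITIVE = {"happy", "great", "love", "wonderful"}   # hypothetical list
    NEGATIVE = {"sad", "awful", "hate", "terrible"}      # hypothetical list

    def emotional_score(post: str) -> int:
        """Positive words minus negative words in a post."""
        words = post.lower().split()
        return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

    # Hypothetical feed: sort to boost positive posts; flip the sign
    # (or sort ascending) to boost negative ones instead.
    feed = [
        "Had a wonderful day at the beach",
        "Traffic was awful and I hate Mondays",
        "Meeting notes attached",
    ]
    prioritised = sorted(feed, key=emotional_score, reverse=True)
    print(prioritised)

The same kind of scoring, run over users' own subsequent posts, is what lets researchers ask whether the tone of what you read shifts the tone of what you write.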

On the face of it, it sounds like worthwhile research. So why are people upset about it?

Facebook already filters your news feed: they have to. Otherwise it would pile up faster than you could read it. Everything you see comes from the people you friend and follow, and the pages you like, but how you interact with those posts determines how much you see from their authors in future.

So there’s an algorithm at work. Facebook calls it EdgeRank. Like any algorithm, they test it and refine it:

With so many stories, there is a good chance people would miss something they wanted to see if we displayed a continuous, unranked stream of information. Our ranking isn’t perfect, but in our tests, when we stop ranking and instead show posts in chronological order, the number of stories people read and the likes and comments they make decrease.
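As a rough illustration of why a ranked feed beats a chronological one for engagement, here is a toy model in Python that orders stories by a simple affinity-and-recency score instead of by timestamp. This is only a sketch under my own assumptions; EdgeRank's real signals and weights have never been published in full.

    from dataclasses import dataclass

    # Toy feed ranking: score stories by how often the viewer has interacted
    # with the author, decayed by age. Illustration only, not EdgeRank.
    @dataclass
    class Story:
        author: str
        age_hours: float
        text: str

    past_interactions = {"alice": 12, "bob": 1, "carol": 5}  # hypothetical counts

    def score(story: Story) -> float:
        affinity = past_interactions.get(story.author, 0)
        return affinity / (1.0 + story.age_hours)  # older stories count less

    stories = [
        Story("bob", 0.5, "lunch photo"),
        Story("alice", 6.0, "holiday album"),
        Story("carol", 2.0, "new job announcement"),
    ]

    chronological = sorted(stories, key=lambda s: s.age_hours)
    ranked = sorted(stories, key=score, reverse=True)
    print([s.author for s in chronological])  # ['bob', 'carol', 'alice']
    print([s.author for s in ranked])         # ['alice', 'carol', 'bob']

In the chronological feed the newest but least interesting post comes first; in the ranked feed the authors you engage with most float to the top, which is exactly the behaviour Facebook says keeps people reading, liking and commenting.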

So far so good. Experimentation and data-driven business are par for the course these days. Find out what your customers like and give it to them. Everybody wins, no?

We’re used to marketers experimenting on us. Special offers, free samples, competitions, surveys: there’s an accepted quid pro quo. We signal our preferences; they tweak their products, presentation, advertising and so on to sell us more stuff. Marc Andreessen says the Facebook experiment is no different:

https://twitter.com/pmarca/status/483024580554932224

He’s got a point. And here’s another one:

https://twitter.com/pmarca/status/483020875776540672

Emotional manipulation is the key phrase. Somehow we’re OK with it when it’s to sell us something, or to get us to root for a character in a drama, or to engage in a political campaign. But the idea of psychological experimentation for academic purposes carries an additional ‘ick factor’: scientists experimenting on human subjects, treating people as ‘lab rats’.

The Stanford Prison Experiment is an extreme example of the ethical pitfalls involved, yet the participants in that study were adult volunteers. In this case, while Facebook have argued that their Terms of Service permit this kind of research, we find it disquieting to think that people’s emotional reactions were ‘studied’ in this way, without their explicit consent.

It can be argued that such consent would introduce a selection bias by including only those people who were aware that they might be studied. But it’s a challenge which is not unique to psychology. If we can conduct human trials of drugs and treatments in an ethical way, shouldn’t we be able to do so for social media as well?

In principle, I’m in favour of this kind of research, conducted openly, using the highest academic standards. If Facebook has the potential to manipulate society, then society needs to be informed about that in order to ensure that it is regulated and supervised. Put simply, I’m happier to see this kind of research published than to think it might be conducted, and applied, in secret.
