All design involves manipulation: crafting solutions that deliberately influence our decisions and actions. But is there a line that shouldn’t be crossed when it comes to using people in an experiment to test whether their emotions can be altered?

There has been a media storm over the weekend after news became public that Facebook had been conducting an experiment to see if different types of news inserted into activity streams would influence the emotions of readers.

The research explored whether a well-known real-world phenomenon – that emotions can be contagious – also occurs online. When surrounded by happiness, we are more likely to be happy. When surrounded by misery, we are more likely to feel sad. Pretty normal human stuff in the real world. Does it also apply digitally?

This is what Facebook did to apply the theory online:

For one week, some users saw fewer posts with negative emotional words than usual, while others saw fewer posts with positive ones

And the result?

People were more likely to use positive words in Facebook posts if they had been exposed to fewer negative posts throughout the week, and vice versa. The effect was significant, though modest

Reactions across social media channels and news outlets have ranged from outrage to ‘meh’.

Manipulation surrounds us in our daily activities. To design something is to manipulate behaviour towards a desired outcome, be it for personal, social or commercial gain. For a visual tour of how we both manipulate our environment and are manipulated by it to positive effect, I recommend ‘Thoughtless Acts’ by Jane Fulton Suri and IDEO. The featured image at the start of this post is just one example of both forces in action: designers lay curved paths to encourage travel across a certain space, only for people to design a shortcut across the grass. Manipulation is not a bad word.

Manipulation has taken on an added dimension in the digital world as A/B testing has become commonplace as a method to continuously improve website design. Split your audience in half, present a different screen to each half and see which version works best. Implement the statistical winner. Rinse and repeat.
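In practice, the “statistical winner” step usually means running a significance test on the two variants. Here’s a minimal sketch of that idea using a two-proportion z-test in Python (standard library only); the visitor and conversion counts are made up purely for illustration:

```python
# Minimal A/B test sketch: compare conversion rates of two page
# variants with a two-proportion z-test (stdlib only).
from math import sqrt, erf

def ab_test(conv_a, n_a, conv_b, n_b):
    """Return the z-score and two-sided p-value for the difference
    in conversion rate between variant A and variant B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return z, p_value

# Hypothetical traffic split: 10,000 visitors per variant.
z, p = ab_test(conv_a=520, n_a=10_000, conv_b=580, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # ship B only if p is below your threshold
```

With these made-up numbers the difference doesn’t clear the conventional 0.05 bar, which is exactly the point of the test: it tells you whether the winner is statistical or just noise.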

What feels wrong about what Facebook did is the word ‘experiment’. This wasn’t about directly improving the site design for Facebook’s benefit, it was about seeing if the behaviour of visitors could be influenced to inform future design. It’s a fine line, but to deliberately make people feel bad without them even knowing they are part of an experiment feels unethical.

Imagine somebody prone to depression and feeling particularly low one day. They decide to go somewhere that they normally enjoy to try to shake it off. What if, normally, Facebook is a channel that makes them happy? It’s where they chat with friends and catch up on news and gossip that they are interested in. And what if, for one badly-timed week, their personal activity stream was deliberately manipulated without them knowing and populated with negative words for an experiment?

— Update —

Facebook has responded to some of the media criticism, and whilst I don’t doubt that the researchers meant no harm, they can’t claim the experiment was harmless either.

Here is a quote from the response:

…at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it — the result was that people produced an average of one fewer emotional word, per thousand words, over the following week

Well that makes it all OK then.

Er, no it does not.
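It’s worth pausing on why such a tiny average effect can still be called “statistically significant”: with an enormous sample, the standard error shrinks until almost any difference clears the bar. A back-of-envelope sketch, where the per-user standard deviation and sample sizes are assumptions of mine, not figures from the study:

```python
# Back-of-envelope: why a tiny average effect can still be
# "statistically significant" when the sample is huge.
# All numbers below are illustrative assumptions, not study data.
from math import sqrt

effect = 0.001   # one fewer emotional word per thousand words
sd = 0.05        # assumed per-user std dev of emotional-word rate

for n in (1_000, 100_000, 300_000):   # users per condition
    se = sd * sqrt(2 / n)             # std error of the difference in means
    z = effect / se
    print(f"n = {n:>7}: z = {z:.2f}")
```

At small samples the effect is invisible; at hundreds of thousands of users the z-score is enormous. Significance here measures detectability, not importance, and certainly not harmlessness.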

A few years ago, Steve Blank wrote an excellent article explaining why a seemingly small and perfectly rational idea can have a much bigger impact than intended. Here’s a brief summary of the story:

…a young successful company was growing and hired a new CFO, who became concerned at the amount of money being spent providing free snacks and sodas to staff and made a recommendation to the board:

“We’re too big for that now… But we’ll sell them soda ‘cheap.’”

To save $10,000 or so, they unintentionally launched an exodus of their best engineers

Expecting well-compensated technical experts to fork out $0.50 for a can of soda sounds perfectly reasonable. If somebody would quit over such a small matter, you’re better off without them, right?

Wrong. Because that one small action wasn’t the first. It was the final one. It was the wake-up call. It was the culmination of a series of small incremental adjustments to the treatment of employees that led some to decide “enough is enough, this company isn’t the place it used to be”.

And that’s why the Facebook experiment was wrong. The researchers may consider their actions to have had only minimal significance. But for just one person, that negativity could have been the final action, not the first.


Featured image: ‘Shortcut’ kindly shared on Flickr by Kai Schreiber

