Emotions Experiment: Facebook Caught In Lie Over Data Use Policy?

When controversy hit Facebook earlier this week for secretly carrying out an experiment that sought to elicit negative and positive emotional responses from users, they pointed critics to their Data Use Policy, which listed research as a possibility:

“… we may use the information we receive about you for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”

However, it has now emerged that the social network didn't add that specific term until four months after the experiment had taken place.

The question of informed consent just got a little louder.

Tucking away unethical practices behind broad terms in a user agreement that nobody reads is bad enough in itself, but in this case users were given no possible way of knowing that anything like the experiment could have been taking place. "Research" was simply not in the user agreement at the time.

There is also a clear distinction between regular A/B testing that many websites and advertisers carry out and what Facebook was doing.

The public have a reasonable expectation that when they buy a newspaper or go online, they will be shown adverts. It is also usually quite apparent when they're viewing one: in Facebook's case, ads sit at the side of the page or are labelled "sponsored post" in the news feed.

A/B testing is the process of showing different ad placements or ad variants to different groups of users and measuring which performs best (see the sketch below). It most certainly isn't the aim of A/B testing to elicit mass negative emotions from people. It's not the same thing in intent or execution.
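For context, here is a minimal, generic sketch of what conventional A/B testing looks like. It is written in Python with made-up numbers and illustrative function names, and is not Facebook's actual code: users are randomly split into two groups, each group sees a different ad placement, and the site simply measures which placement gets more clicks.

```python
# Generic A/B test sketch (illustrative only, not Facebook's system):
# split users into two groups, show each group a different ad placement,
# then compare click-through rates.

import random

def assign_group(user_id, seed=42):
    """Deterministically assign a user to group 'A' or 'B'."""
    random.seed(f"{seed}-{user_id}")
    return "A" if random.random() < 0.5 else "B"

def click_through_rate(clicks, impressions):
    """Fraction of ad impressions that led to a click."""
    return clicks / impressions if impressions else 0.0

# Hypothetical results: group A saw the ad in the sidebar,
# group B saw it as a sponsored post in the news feed.
results = {
    "A": {"clicks": 120, "impressions": 10_000},
    "B": {"clicks": 185, "impressions": 10_000},
}

for group, data in results.items():
    ctr = click_through_rate(data["clicks"], data["impressions"])
    print(f"Group {group}: CTR = {ctr:.2%}")
# Whichever variant has the higher click-through rate "wins" the test.
```

The point of the comparison is narrow and commercial: which layout sells better. Deliberately skewing the emotional content of people's feeds to see how it changes their mood is a different exercise altogether.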

UK Investigates Facebook

Facebook have been so heavily criticised over the experiment, which altered nearly 700,000 people's news feeds to show either more negative or more positive content, that the UK's Information Commissioner's Office (ICO) has launched an investigation and will be liaising with its equivalent in Ireland, where Facebook's European headquarters is based.

The ICO wants to determine whether Facebook violated data protection laws.

While a fine of up to £500,000 could be imposed, that is unlikely to dent Facebook's billions in revenue. The negative press, however, may push them to be more upfront in future.

The Real Purpose of the Experiment?

Amidst the commotion, what hasn't actually been determined is the purpose of the experiment. PR lines about it helping Facebook improve the user experience are a bit too generic for many people's liking, and the concept of "emotional contagion" has long been understood. Perhaps the real question is why Facebook would want to know how it can manipulate people's emotions. Is it purely for scientific understanding, or is there a profit motive?

We also cannot ignore the fact that federally funded researchers from Cornell and UCSF played a major role as well. Cornell University’s Jeffrey T. Hancock has previously carried out research for the Pentagon’s Minerva Initiative, which is funding universities to model the dynamics, risks and tipping points of large-scale civil unrest across the world. Wouldn’t understanding how people’s emotions are impacted by social media play a role in that?

Whatever the purpose of the experiment, one thing is clear: in future, people want to know when they're being experimented on!

Author: Keelan Balderson

Keelan Balderson is a Blogger, Youtuber and Podcaster from the UK, with an avid interest in social media, alternative politics, and pro wrestling.
