Monday, June 30, 2014

Facebook Apologizes for Manipulating People - BBC


Facebook emotion experiment sparks criticism

Image caption: Facebook said it conducted the study to gauge users' response to content


Facebook is facing criticism after it emerged it had conducted a psychology experiment on nearly 700,000 users without their knowledge.

The test saw Facebook "manipulate" news feeds to control which emotional expressions the users were exposed to.

The research was done in collaboration with two US universities to gauge if "exposure to emotions led people to change their own posting behaviours".

Facebook said there was "no unnecessary collection of people's data".

"None of the data used was associated with a specific person's Facebook account," the social networking giant added.

Cornell University and the University of California at San Francisco were involved in the study.

Ability to manipulate?

But some have criticised the way the research was conducted and raised concerns over the impact such studies could have. 

"Let's call the Facebook experiment what it is: a symptom of a much wider failure to think about ethics, power and consent on platforms," Kate Crawford posted on Twitter.

Lauren Weinstein tweeted: "Facebook secretly experiments on users to try make them sad. What could go wrong?"

Meanwhile, Labour MP Jim Sheridan, a member of the Commons media select committee, has called for an investigation into the matter.

"This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be to protect people," he was quoted as saying by The Guardian newspaper.

"They are manipulating material from people's personal lives and I am worried about the ability of Facebook and others to manipulate people's thoughts in politics or other areas.

"If people are being thought-controlled in this kind of way there needs to be protection and they at least need to know about it."

However, Katherine Sledge Moore, a psychology professor at Elmhurst College, Illinois, said: "Based on what Facebook does with their newsfeed all of the time and based on what we've agreed to by joining Facebook, this study really isn't that out of the ordinary."

"The results are not even that alarming or exciting."

'Very sorry'
The research was conducted on 689,000 Facebook users over a period of one week in 2012.

According to the report on the study: "The experiment manipulated the extent to which people were exposed to emotional expressions in their News Feed".

The study found that users who had fewer negative stories in their news feed were less likely to write a negative post, and vice versa.

Adam Kramer of Facebook, who co-authored the report on the research, said: "We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out".

"At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook."

However, he admitted that the firm did not "clearly state our motivations in the paper".

"I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused."


