It already knows where you live, how often you’re online and who you like (all those name searches get stored, you know), but now Facebook has caused even more controversy by manipulating how you feel.
A study released earlier this week notes how, as part of an “emotional contagion” experiment, Facebook filtered the comments, videos, pictures and web links that appeared on users’ news feeds, in order to test whether other people’s moods affected their own. While some users saw fewer posts containing happy words like “love”, “sweet” and “nice”, their counterparts saw fewer terms with negative connotations such as “nasty”, “ugly” and “hurt”. Rather than “positive emotional content” encouraging users to feel jealous and depressed (as most people suggest), happy posts actually encouraged their viewers to feel the same, and vice versa.
Facebook data scientist Adam Kramer has defended the study’s potential breach of ethical and legal guidelines by likening it to the advertising algorithms the site already runs, stating that “at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect [something]”. Although apparently perfectly legal under the terms and conditions we signed when we jumped on the Facebook bandwagon aged 13, Facebook is adamant that there was no invasion of privacy, as all the manipulation was done by machines and no actual human beings saw anyone’s posts. But with 689,000 users unknowingly participating in the psychology experiment, questions over what else Facebook controls, or will do with such valuable social information, have provoked outrage. What if Facebook has been filtering negative posts out of its 800,000 daily visitors’ timelines since 2012 so that people feel happier and more inclined to return? Although the data is over two years old, the revelations will no doubt hinder the social network’s repeated attempts to insist it will protect users’ privacy, especially on top of other recent scandals. In fact, many people have already taken to Twitter (Facebook’s arch enemy) to express their disgust:
Worse still, Kramer’s Facebook announcement only added fuel to the fire, giving the public backlash an even bigger platform. NYU journalism professor Jay Rosen, for example, asked what interest the United States military had in the research after it was cited as funding the project.
But one thing is for sure: emotions are contagious - even if you’re only seeing them as you scroll down your computer screen. So next time you need cheering up, instead of seeking schadenfreude on your frenemies’ Facebook pages, remember that negativity is infectious, and you’re probably better off logging off.