We all know Facebook controls and contrives content, especially through ads and the news feed. But in a paper published in the Proceedings of the National Academy of Sciences, Facebook scientists acknowledged how, and how much, they’ve been manipulating content for just under 700,000 users. Their publication, “Experimental evidence of massive-scale emotional contagion through social networks,” pretty much says it all: scientists wanted to see whether the positive or negative tone of users’ posts would affect later posts by users who had seen them. Essentially, they wanted to know whether someone’s state of mind can be spread via social networks. I’m sure it’s not surprising that they learned that, absolutely, mood is contagious on Facebook.
Scientists controlled which posts popped up on users’ feeds, causing the posts seen by those 700K random viewers to be either particularly positive or particularly negative. Then they tracked the updates of those 700K viewers over the next week to see whether their subsequent posts were similarly positive or negative. The Facebook scientists found that the fewer positive posts that ended up in someone’s feed, the fewer positive posts those users in turn produced, and the same went for negative posts. If you’ve ever been inundated with whiny Facebook posts in your feed, you know that alone is enough to irritate. Although I’m sometimes irritated by the “today was the best day ever!” posts too. Hell, I’m just generally irritated by Facebook, but I haven’t disabled my account yet, so I really have no right to complain.
We can complain about the clandestine manipulation of users’ emotions, though. This is clearly valuable information for Facebook to have, and it will unfortunately likely shape future techniques in deep learning and the other ad- and newsfeed-based algorithms the company runs. The study concluded that “in-person interaction and nonverbal cues are not strictly necessary for emotional contagion,” and honestly, the fact that Facebook uses a word more associated with Ebola than with social media says something about how it regards its users — as unwitting test subjects.
If you clicked the terms-of-service box when signing up for your Facebook account, you gave the company permission to use you as a test subject by using your information for advertising, as well as for “testing, research and service improvement.” And since computers culled the negative and positive posts, no actual scientists looked at anyone’s personal information, so Facebook is, like, super respectful of our privacy.
In the study, Facebook also refers to earlier research indicating that this effect is magnified over the long term. One study, which followed nearly 5,000 people from 1983 to 2003, concluded that “people’s happiness depends on the happiness of others with whom they are connected.” That’s great when it comes to the spread of happiness, but it isn’t so great on the flip side: longer-term sadness can also be transmitted via social networks.
And once Facebook knows our habits — do we turn to retail therapy during tough times? Do we become incredibly generous on payday? — it will be better able to manipulate our actions, as well as our emotions. Still, we’ll continue to let Facebook do it, which means we can’t put all of the blame on the company — just most of it.