Why Facebook has been experimenting on you behind your back
Details emerged Monday about an experiment Facebook conducted on its users without their knowledge and what that means for the future of the social network. - photo by Shutterstock

You may have been used as a test subject in a recent Facebook experiment without ever noticing.
Last week, the social networking company revealed that it purposefully manipulated statuses people saw on their news feeds, The New York Times reported.
Thousands of users selected at random were shown positive and negative posts as a way for Facebook to conduct a psychological study on its users, Vindu Goel wrote for The Times.
“The company says users consent to this kind of manipulation when they agree to its terms of service,” Goel wrote. “But in the quick judgment of the Internet, that argument was not universally accepted.”
Goel explained that Twitter and Facebook users alike spoke out against the manipulation, to the point that the study's leader, Adam D. I. Kramer, a Facebook researcher, issued an apology on Facebook.
“Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone,” he said in his apology post. “I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.”
So what did the study actually find?
The Atlantic’s Robinson Meyer explained that the study found that exposure to negative posts on Facebook tends to produce more negative statuses, and exposure to positive posts tends to produce more positive ones.
This means “that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness,” Meyer reported on the study.
Facebook can then use this information to create new ways to get more people to post and engage on its social network, Meyer wrote.
“(F)rom this study alone Facebook knows at least one knob to tweak to get users to post more words on Facebook,” according to Meyer.
But there were problems with the study. John M. Grohol, a psychology expert, wrote on his blog that Facebook puts too much stock in the tools it uses.
“What the Facebook researchers clearly show, in my opinion, is that they put too much faith in the tools they’re using without understanding — and discussing — the tools’ significant limitations,” Grohol wrote.
So is Facebook done with this sort of experiment?
David Auerbach, in a tongue-in-cheek post for Slate, doesn’t think so.
Email: hscribner@deseretdigital.com
Twitter: @herbscribner