I wasn’t surprised to see Slate publish the click-baity article “Facebook’s Unethical Experiment” about Facebook’s controversial study, in which researchers showed that increasing positive messages in a user’s news feed led that user to post more positive messages. But I’ve been shocked by how strongly the entire internet has come down against this research, and I’m really concerned about the stifling effects of this misinformed outrage.

Every website you visit is manipulating you, and most are running controlled studies right now to figure out how to manipulate you better. Overall, this is good for you. Google constantly uses experimental groups to test subtle changes in the ranking of results, to make them more relevant to you and to make you happier. Amazon endlessly experiments on you to see if it can get you to buy more, but this isn’t usually a bad thing: its recommendation engine is amazing and often truly useful. Remember when sign-up flows used to have a million steps and credit card forms were clunky? Through controlled experiments, countless companies have learned that bad UI makes users unhappy and is bad for business.

So why are people in such a frenzy about this Facebook study? If it’s because it measured the effect of manipulating emotion directly, that seems misguided, even crazy. It would be a great thing if websites tried to measure their effect on emotion more directly; maybe they could then optimize for long-term user happiness rather than click-through rates. Some people are angry because they think academics should be held to a higher standard than companies. But in this case, holding academics to a “higher standard” just forces corporations to never collaborate with them and makes these interesting and potentially important results unpublishable. This study was a minor variation on something every consumer internet company does all the time, and it is nothing like the Stanford prison experiment.
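The controlled experiments described above mostly boil down to the same mechanism: randomly bucket users into a control group and a treatment group, show each group a different variant, and compare an outcome metric. Here is a minimal sketch in Python; the function names, the salt, and the made-up “positive post rate” outcome are all hypothetical, not taken from any company’s actual system:

```python
import hashlib
from statistics import mean

def assign_variant(user_id, salt="feed-exp"):
    """Deterministically bucket a user into 'control' or 'treatment'.

    Hashing user_id together with a salt keeps each user in the same
    group across sessions; changing the salt starts a fresh experiment.
    """
    digest = hashlib.sha256(f"{salt}:{user_id}".encode()).hexdigest()
    return "treatment" if int(digest, 16) % 2 else "control"

def effect_size(outcomes):
    """Difference in mean outcome (e.g. rate of positive posts)
    between the treatment and control groups."""
    return mean(outcomes["treatment"]) - mean(outcomes["control"])

# Toy usage: bucket 1,000 users, then compare an invented outcome metric.
outcomes = {"control": [], "treatment": []}
for user_id in range(1000):
    group = assign_variant(user_id)
    # In a real experiment this would be a measured behavior, not a
    # constant; 0.32 vs. 0.30 here is purely illustrative.
    outcomes[group].append(0.32 if group == "treatment" else 0.30)

print(f"estimated lift: {effect_size(outcomes):.3f}")
```

A real system would add statistical significance testing before acting on the difference, but the core loop — deterministic assignment, measure, compare — is this simple, which is why experimentation is so pervasive.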
If journalists want to worry about technology companies manipulating people, they should worry about game companies that use Skinner-box-style reinforcement and endless experimentation to hook people into playing over and over. Facebook investigating whether positive messages in a news feed lead to positive feelings is a good thing. I hope they keep looking into it, although after the media’s reaction, I’m sure they will never share their results publicly again. Important research will remain locked inside large companies rather than shared across research institutions.