THIS CASE WILL BE USED IN THE NYC/NJ & LONG ISLAND REGIONAL COMPETITIONS
Across the globe, Facebook users utilize the platform in a variety of ways, and more than a third of adults report regularly using Facebook as a news source. Behind each user's news feed is an algorithm that controls what that user will or will not see. The algorithm is based on a collection of factors—including which types of posts a user interacts with and what their Facebook friends are posting about. At one level, this process is practical. When a given user opens their news feed, there are thousands of posts that Facebook could show them, and sifting through such a large number would be overwhelming. So, the algorithm narrows the feed to a mere hundred or so posts, selecting those that will presumably keep the user coming back for more.
However, some types of algorithm tinkering seem different. In 2012, Facebook intentionally altered the news feed algorithms of hundreds of thousands of users in order to conduct a psychological experiment. The experiment was designed to measure whether emotional states are contagious via social media networks, as they can be in in-person interactions. By changing the number of positive or negative posts that users would see, researchers concluded that, indeed, emotional states are contagious via a social media network. The experiment's findings are informative, but many have questioned whether Facebook was morally justified in conducting such an experiment in the way it did.
Facebook withheld from hundreds of thousands of users the information that the emotional tone of their news feeds was being directly and intentionally altered. Moreover, those users were unaware that they were the subjects of a psychological experiment designed to affect their moods. However, Facebook users consent to the intentional alteration of their news feeds when they agree to the terms of service. So, defenders argue that Facebook had the requisite permission of its users to include them in the psychological experiment, regardless of whether the users were explicitly aware of their participation in the experiment or their consent to it.
- Are social media companies like Facebook ever morally permitted to conduct psychological research on their users without the direct knowledge of those users?
- To what extent, if any, does the tacit consent of social media users—i.e., their agreement to the terms and conditions of utilizing a social media platform—grant social media companies the moral permission to conduct psychological experiments on them?
- Under what circumstances, if any, might social media companies have a moral obligation to intentionally alter their algorithms or to modify what certain users see on their news feeds?
This is case #1 from the 2021-2022 Regional HSEB Case packet, developed by the Parr Center for Ethics.