Facebook and Instagram Use Algorithms That Promote Compulsive Use
by @metaeatsbrains


by Save the Kids From Meta, November 1st, 2023

Too Long; Didn't Read

Meta, the parent company of Facebook and Instagram, employs recommendation algorithms designed to keep users compulsively engaged. These algorithms draw on signals from users, including overt actions like liking posts and unconscious actions like lingering on content. They manipulate users by sequencing content unpredictably and presenting psychologically gripping material. By triggering dopamine releases, the algorithms keep users engaged and encourage addictive behavior. Meta does not adequately disclose these practices, exposing users, especially young ones, to potential harm.


The United States v Meta Platforms Court Filing October 24, 2023 is part of HackerNoon’s Legal PDF Series. You can jump to any part in this filing here. This is part 17 of 100.

3. Meta’s Recommendation Algorithms encourage compulsive use, which Meta does not disclose.

151. Instagram and Facebook employ Recommendation Algorithms that curate content from the main feeds and other parts of the Platforms.


152. The Recommendation Algorithms use data points, or “signals,” harvested from individual users to choose and/or arrange each new piece of content to display to a user. Such signals include, but are not limited to, overt actions like Liking a post or following a page, as well as unconscious actions such as lingering on—but not otherwise engaging with—certain content or visiting but not following another user’s page.
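To make the distinction between overt and unconscious signals concrete, here is a minimal, purely illustrative sketch of signal-weighted content scoring. This is not Meta's code; the signal names, weights, and scoring function are all hypothetical, invented only to show how explicit actions (Likes) and passive behavior (dwell time) could both feed a ranking score.

```python
from dataclasses import dataclass, field

@dataclass
class UserSignals:
    """Hypothetical container for the two kinds of signals described above."""
    liked_topics: set = field(default_factory=set)     # overt: Likes, follows
    dwell_seconds: dict = field(default_factory=dict)  # unconscious: lingering

def score_post(post_topic: str, signals: UserSignals) -> float:
    """Toy relevance score: an overt signal outweighs a passive one.

    The weights (2.0 and 0.1) are arbitrary illustration values.
    """
    score = 0.0
    if post_topic in signals.liked_topics:
        score += 2.0                                         # explicit engagement
    score += 0.1 * signals.dwell_seconds.get(post_topic, 0)  # lingering, unengaged
    return score

# A user who Liked fitness content and lingered 10s on fashion content:
signals = UserSignals(liked_topics={"fitness"}, dwell_seconds={"fashion": 10})
ranked = sorted(["fitness", "fashion", "news"],
                key=lambda t: score_post(t, signals), reverse=True)
print(ranked)  # the Liked topic ranks first, the lingered-on topic second
```

Note that in this sketch the user never chose to see more fashion content; merely pausing on it was enough to promote it, which is the point the filing makes about unconscious signals.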


153. Meta employs Recommendation Algorithms universally across its Social Media Platforms, including the Instagram Platform’s Main Feed (the scrolling presentation of content immediately visible upon opening the app) and Explore Feed (another scrolling presentation of algorithmically curated content that can be guided by a user’s text input in a search field).


154. Meta designed its Recommendation Algorithms to maximize youth engagement in several ways but did not disclose these engagement-maximization features to the public—instead representing that these algorithms were intended to benefit the user.


155. First, Meta designed the Recommendation Algorithms to present material to young users in an unpredictable sequence rather than displaying posts chronologically.


156. Specifically, Meta’s Recommendation Algorithms display content to young users through a sequencing method referred to by psychologists as “variable reinforcement schedules” or “variable reward schedules.”


157. As Dr. Mark D. Griffiths, Distinguished Professor of Behavioral Addiction at Nottingham Trent University, explains:


The rewards [experienced on social media platforms]—which may be physiological, psychological and/or social—can be infrequent but even the anticipation of one of these rewards can be psychologically and/or physiologically pleasing. The rewards are what psychologists refer to as variable reinforcement schedules and is one of the main reasons why social media users repeatedly check their screens. Social media sites are ‘chock-a-block’ with unpredictable rewards. Habitual social media users never know if their next message or notification will be the one that makes them feel really good. In short, random rewards keep individuals responding for longer and has been found in other activities such as the playing of slot machines and video games.[9]


158. Because they do not work in a predictable pattern, these “variable reinforcement schedules” trigger a release of dopamine, a neurotransmitter released by the brain in response to certain stimuli. Dopamine, commonly “seen to be the ‘pleasure chemical,’” is released in anticipation of a potential reward. However, dopamine neurons fire for only a relatively short period of time, and after dopamine is released, an “individual can become disheartened and disengaged.”[10]


159. As researchers Rasan Burhan and Jalal Moradzadeh explain, the variable reinforcement schedules baked into social media platforms like Instagram can lead to “addiction with dopamine implicated”:


[T]he user can be kept in a loop. Essentially, that’s how the social media apps exploit these innate systems. The way this comes about is through a term referred to as Variable Reward Schedules. This works by positive stimuli being provided at random intervals. By users checking their phones for notifications and updates at periodic intervals for something that could be intrinsically rewarding. Most of the time it’s a neutral stimuli, but on occasion there may be a positive stimuli leading to the rewarding dopamine release hence keeping the user in the feedback loop.[11]


160. [Redacted]


161. By algorithmically serving content to young users according to variable reward schedules, Meta manipulates dopamine releases in its young users, inducing them to engage repeatedly with its Platforms—much like a gambler at a slot machine.
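The variable reward schedule the filing describes can be simulated in a few lines. The sketch below is an assumption-laden illustration, not anything from Meta: the reward probability and the two-category feed are invented solely to show the slot-machine pattern, where most refreshes yield nothing and the payoff arrives at unpredictable positions.

```python
import random

def variable_reward_feed(n_items: int, reward_prob: float = 0.3,
                         seed: int = 0) -> list:
    """Simulate a variable-ratio schedule over a feed of n_items.

    Each item is independently 'rewarding' with probability reward_prob,
    so rewards arrive at unpredictable intervals -- the pattern the filing
    compares to a slot machine. All parameters here are illustrative.
    """
    rng = random.Random(seed)  # seeded so the simulation is repeatable
    return ["rewarding" if rng.random() < reward_prob else "neutral"
            for _ in range(n_items)]

feed = variable_reward_feed(20)
positions = [i for i, item in enumerate(feed) if item == "rewarding"]
print(f"{len(positions)} rewarding items at unpredictable positions {positions}")
```

Because the rewarding items cannot be predicted, the only way for a user to find the next one is to keep scrolling, which is precisely the behavioral hook described in paragraphs 157 through 159.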


162-164. [Redacted]


165. Nonetheless, and as illustrated above, as recently as 2020, Meta continued to intentionally design its Platforms to manipulate dopamine responses in its young users to maximize time spent on its Platforms. Meta did not disclose that its algorithms were designed to capitalize on young users’ dopamine responses and create an addictive cycle of engagement.


166. Second, Meta uses data harvested from its users to target user engagement on an individual level via its Recommendation Algorithms—making continued engagement even more difficult for young users to resist.


167. In a June 8, 2021 public blog post on Instagram’s website, Mosseri stated that Meta collects and supplies its Recommendation Algorithms with thousands of “signals” across Instagram’s Feed and Stories, including “[y]our activity” and “[y]our history of interacting with someone.” Mosseri’s post explained that the collection of “[y]our activity . . . helps us understand what you might be interested in . . .” and the collection of “[y]our history of interacting with someone . . . gives us a sense of how interested you are generally in seeing posts from a particular person.”


168. Similarly, Facebook’s Vice President of Global Affairs wrote in Medium on March 31, 2021, about Facebook’s Recommendation Algorithms: “The goal is to make sure you see what you find most meaningful—not to keep you glued to your smartphone for hours on end. You can think about this sort of like a spam filter in your inbox: it helps filter out content you won’t find meaningful or relevant, and prioritizes content you will.”


169. Likewise, Meta’s terms of service on data collection state that Meta uses user data to “[p]rovide, personalize and improve our Products,” “[p]rovide measurement, analytics, and other business services,” “[p]romote safety, integrity and security,” “[c]ommunicate with you,” and “[r]esearch and innovate for social good.”


170. In reality, though, Meta tracks and logs the behavior of millions of young users and utilizes that data to refine and strengthen the features that induce young users’ compulsive Social Media Platform use.


171. As young users engage with Meta’s Social Media Platforms, they are unwittingly training Meta’s Recommendation Algorithms to provide the particular flow of content, notifications, and features that will most effectively keep them online.
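The feedback loop in paragraph 171, where ordinary use silently refines the model that targets the user, resembles a simple online weight update. The sketch below is a hypothetical illustration under assumed parameters (topic names, learning rate, dwell-time signal), not a description of Meta's actual training pipeline.

```python
def update_weights(weights: dict, topic: str, dwell_seconds: float,
                   lr: float = 0.1) -> dict:
    """Toy online update: each interaction nudges the profile toward
    whatever held the user's attention longest. lr is an arbitrary
    illustrative learning rate."""
    weights = dict(weights)  # copy so each session's profile is explicit
    weights[topic] = weights.get(topic, 0.0) + lr * dwell_seconds
    return weights

# Each scroll session trains the profile without any deliberate user input:
weights = {}
for topic, dwell in [("body_image", 45), ("sports", 5), ("body_image", 60)]:
    weights = update_weights(weights, topic, dwell)

top_topic = max(weights, key=weights.get)
print(top_topic)  # the most attention-holding topic now dominates the profile
```

The user made no explicit choice here; lingering alone reshaped the profile, so the next session's feed skews further toward the content that held attention, closing the loop the filing describes.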


172. Again, Meta does not disclose to consumers that it is weaponizing young users’ data to capture and keep their attention.


173. Meta admits in its Privacy Policy that it uses data provided by its young users for purposes other than facilitating meaningful social experiences, such as “improv[ing] our Products . . . includ[ing] personalizing features, content and recommendations, such as your Facebook Feed, Instagram feed, Stories, and ads.”


174. This includes using young users’ data to “[t]est out new products and features to see if they work” and to “[g]et feedback on our ideas for products or features.”


175. But Meta’s representations about its Recommendation Algorithms do not effectively apprise young users of the reality that Meta is harvesting vast amounts of personal data to train its Recommendation Algorithms to induce them to keep using the Platforms.


176. Third, the Recommendation Algorithms increase young users’ engagement by periodically presenting those users with psychologically and emotionally gripping content, including content related to eating disorders, violent content, content encouraging negative self-perception and body image issues, bullying content, and other categories of content known by Meta to provoke intense reactions.


177. Meta’s Recommendation Algorithms are optimized to promote user engagement. Serving harmful or disturbing content has been shown to keep young users on the Platforms longer. Accordingly, the Recommendation Algorithms predictably and routinely present young users with psychologically and emotionally distressing content that induces them to spend increased time on the Social Media Platforms. And, once a user has interacted with such harmful content, the Recommendation Algorithm feeds that user additional similar content.


178-182. [Redacted]


183. Again, though, Meta’s public statements regarding its algorithms’ amplification of distressing and problematic content did not reflect Meta’s true awareness of these problems.


184. In fact, Meta has strongly denied that its Social Media Platforms amplify extreme, distressing, or problematic content.


185. For example, on September 30, 2021, Davis denied that Meta promotes harmful content, such as content promoting eating disorders to youth, when she testified before Congress, stating, “we do not direct people towards content that promotes eating disorders. That actually violates our policies, and we remove that content when we become aware of it. We actually use AI to find content like that and remove it.”


186. [Redacted]


187. Likewise, in a June 8, 2021 post on the Instagram website, titled “Shedding More Light on How Instagram Works,” Mosseri describes Meta’s Recommendation Algorithms by providing examples of benign content recommendations (e.g., “if you’re interested in dumplings you might see posts about related topics, like gyoza and dim sum . . .”). The post provides no accompanying examples or warnings disclosing that the Recommendation Algorithms also tend to suggest content that is dangerous or harmful for young users.


188. The Instagram website also boasts that “[a]t Instagram, we have guidelines that govern what content we recommend to people” and specifies that Instagram “avoid[s] making recommendations that may be inappropriate for younger viewers . . . . We use technology to detect both content and accounts that don’t meet these Recommendations Guidelines and to help us avoid recommending them. As always, content that goes against our Community Guidelines will be removed from Instagram.”


189. A parent or young user encountering these and similar communications by Meta could reasonably understand Meta to be representing that its Recommendation Algorithms do not promote content to young users that violates Meta’s Recommendation Guidelines or is otherwise dangerous or inappropriate for young users.


190. But as explained above, Meta does increase young users’ engagement with its Platforms by periodically presenting them with psychologically and emotionally gripping content, including content related to eating disorders, violent content, content encouraging negative self-perception and body image issues, bullying content, and other categories of content known by Meta to provoke intense reactions from users.




[9] Mark D. Griffiths, Adolescent Social Networking: How Do Social Media Operators Facilitate Habitual Use?, 36 Educ. & Health J. 66, 67 (2018), http://archive.today/cPgJ1 (internal references omitted).


[10] Rasan Burhan & Jalal Moradzadeh, Neurotransmitter Dopamine (DA) and its Role in the Development of Social Media Addiction, 11 J. Neurology & Neurophysiology 1, 1 (2020), http://archive.today/kxldL.

[11] Id. at 1-2.



Continue Reading Here.


About HackerNoon Legal PDF Series: We bring you the most important technical and insightful public domain court case filings.


This court case, 4:23-cv-05448, retrieved on October 25, 2023, from Washingtonpost.com, is part of the public domain. The court-created documents are works of the federal government and, under copyright law, are automatically placed in the public domain and may be shared without legal restriction.