
Disinformation Echo-Chambers on Facebook: Results

Too Long; Didn't Read

Recent events have brought to light the negative effects of social media platforms, leading to the creation of echo chambers, where users are exposed only to content that aligns with their existing beliefs.

This paper is available on arxiv under CC BY-SA 4.0 DEED license.

Authors:

(1) Mathias-Felipe de-Lima-Santos, Faculty of Humanities, University of Amsterdam, Institute of Science and Technology, Federal University of São Paulo;

(2) Wilson Ceron, Institute of Science and Technology, Federal University of São Paulo.


3. Results

Within our dataset, we identified 1,504 of the 3,912 Facebook groups (nearly 38.5%) as displaying indications of coordinated activity; that is, these groups engaged in near-simultaneous sharing of identical content. The results also underscore that these orchestrated attempts to manipulate public discourse span groups with political designations. The concern is heightened by the nature of these posts, which comprise false or misleading information.
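The paper does not publish its detection code, but the core idea of flagging near-simultaneous sharing of identical content can be sketched as follows. All field names, the 60-second window, and the input format are illustrative assumptions, not the authors' actual parameters.

```python
from collections import defaultdict
from datetime import datetime, timedelta

def find_coordinated_pairs(posts, window_seconds=60):
    """Flag pairs of groups that posted identical content within a short
    time window of each other.

    `posts` is a list of dicts with keys 'group', 'content_id', and
    'timestamp' (a datetime). These names and the window size are
    illustrative assumptions, not the paper's exact method.
    """
    # Bucket all shares of the same piece of content together.
    by_content = defaultdict(list)
    for p in posts:
        by_content[p["content_id"]].append(p)

    coordinated = set()
    for shares in by_content.values():
        shares.sort(key=lambda p: p["timestamp"])
        for i, a in enumerate(shares):
            for b in shares[i + 1:]:
                # Shares are time-sorted, so once one falls outside the
                # window, all later ones do too.
                if b["timestamp"] - a["timestamp"] > timedelta(seconds=window_seconds):
                    break
                if a["group"] != b["group"]:
                    coordinated.add(frozenset((a["group"], b["group"])))
    return coordinated
```

Pairs flagged this way can then be treated as edges of the co-sharing network analyzed in the figures below.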


The correlation between political Facebook groups and specific political behaviors introduces challenges to community cohesion and trust dynamics. A substantial body of literature on politics and social media explores the potential impact of echo chambers on individuals’ behaviors and how these might undermine efforts to uphold democratic values [76,77]. These online groups, in particular, exhibit indications of selective exposure, ideological segmentation, and political polarization, and in our sample they often adopt political labels [49]. This compounds existing issues: such groups occupy a privileged position in scientific communication, thereby endangering public health and hindering efforts to manage the coronavirus pandemic. These Facebook groups serve as a tangible example of the intricate and interconnected nature of disinformation rhetoric, making empirical analysis in isolation a complex endeavor. For example, past research has highlighted the penetration of political disinformation narratives in the COVID-19 discourse during the first waves of the pandemic in Brazil [21].


Figure 2. This graph features only Facebook groups with degrees exceeding 100, i.e., groups that shared a minimum of 100 coordinated posts. Remarkably, a significant portion of these groups have adopted political titles.


Our method successfully identified certain groups that exhibited stronger associations in disseminating these disinformation campaigns than others. As depicted in Figure 2, the Facebook groups highlighted in pink (117 nodes in total) form what we term a particularly robust “disinformation echo chamber.” Within it, inauthentic actors appear to swiftly and repeatedly amplify inappropriate content, at a notably higher frequency than in other groups, as evidenced by a clustering coefficient of 0.85. This indicates the propensity of nodes within this network to cluster together, forming triangles and robust community structures [78,79].
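For readers unfamiliar with the clustering coefficient cited here, a minimal pure-Python sketch shows how it is computed: for each node, it is the fraction of that node's neighbour pairs that are themselves connected (each closed pair forms a triangle). The toy adjacency below is illustrative only and is not the paper's data.

```python
def local_clustering(adj, node):
    """Local clustering coefficient of `node`: among all pairs of its
    neighbours, the fraction that are themselves connected."""
    nbrs = list(adj[node])
    k = len(nbrs)
    if k < 2:
        return 0.0  # fewer than two neighbours: no pairs to close
    links = sum(
        1
        for i in range(k)
        for j in range(i + 1, k)
        if nbrs[j] in adj[nbrs[i]]
    )
    return 2 * links / (k * (k - 1))

# Toy co-sharing network: nodes are groups, an edge means two groups
# repeatedly shared identical content near-simultaneously (illustrative).
adj = {
    "g1": {"g2", "g3"},
    "g2": {"g1", "g3"},
    "g3": {"g1", "g2", "g4"},      # g1-g2-g3 form a closed triangle
    "g4": {"g3"},                  # dangling spoke, coefficient 0
}
avg = sum(local_clustering(adj, n) for n in adj) / len(adj)
```

A value near 1, such as the 0.85 reported for the pink community, means almost every pair of co-sharing partners also co-share with each other, i.e., a tightly closed community.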


Additionally, the magenta nodes comprise 257 Facebook groups that showcase coordinated behavior. These groups also exhibit a high clustering coefficient (0.83), indicating a strong community structure. Lastly, the blue nodes represent 150 Facebook groups in which multiple actors appear to make concerted efforts to enhance the visibility of specific content through coordinated activities. This community exhibits a more robust structure than the magenta one (clustering coefficient of 0.84), albeit with fewer nodes. Our analysis further revealed smaller communities that also display traces of activities aimed at artificially boosting the popularity of certain online content. Consequently, these Facebook groups, which likely emerge from orchestrated communication dynamics intended to disseminate messages to wide audiences, can be likened to “disinformation echo chambers.”


Figure 3 underscores how the five most frequently shared narratives are extensively propagated among these Facebook groups. Housing potentially inauthentic actors, these online communities appear to amplify this problematic content in an effort to elevate its visibility. This points to a potential causal connection between the spread of disinformation and the presence of online echo chambers [68].


In essence, when a network of groups within an online media environment engages in nearly simultaneous and recurrent sharing of disinformation narratives, the emergence of “disinformation echo chambers” becomes apparent.


Figure 3. This graph illustrates the five most widely shared instances of disinformation content, highlighting their interconnectedness across various groups (dark blue nodes at the center). The pink edges signify the videos shared within these groups, the blue edges represent memes/photos, and the green edges signify URLs.