Authors: (1) Filipo Sharevski, DePaul University; (2) Benjamin Kessell, DePaul University.

Table of Links

Abstract and Introduction
2 Internet Activism and Social Media
2.1 Hashtag Activism
2.2 Hacktivism
3 Internet Activism and Misinformation
3.1 Grassroots Misinformation Operations
3.2 Mainstream Misinformation Operations
4 Hacktivism and Misinformation
4.1 Research Questions and 4.2 Sample
4.3 Methods and Instrumentation
4.4 Hacktivists’ Profiles
5 Misinformation Conceptualization and 5.1 Antecedents to Misinformation
5.2 Mental Models of Misinformation
6 Active Countering of Misinformation and 6.1 Leaking, Doxing, and Deplatforming
6.2 Anti-Misinformation “Ops”
7 Misinformation Evolution and 7.1 Counter-Misinformation Tactics
7.2 Misinformation Literacy
7.3 Misinformation Hacktivism
8 Discussion
8.1 Implications
8.2 Ethical Considerations
8.3 Limitations and 8.4 Future Work
9 Conclusion and References

5 Misinformation Conceptualization

Evidence shows that social media users conceptualize misinformation through more than one model, going beyond a narrow focus on inherently fallacious information [109].
Beyond just fake news, misinformation is equally conceptualized as a form of political (counter)argumentation in which facts do selectively appear in alternative narratives relative to political and ideological contexts, often taken out of context with speculative intentions. Misinformation is also seen as external propaganda that includes manufactured facts and factoids disseminated and amplified online with division-creating intentions. Given the radical transformation of trolling and memes over time, our first research question aimed to learn the hacktivists’ take on this transformation in the context of the competing conceptualizations amongst ordinary social media users.

5.1 Antecedents to Misinformation

The participants in our sample agree that trolling and meme dissemination has been hijacked for nefarious purposes, lamenting that what was a “deliberate action mostly for laughs, now is an automated operation for keeping people tribalistic and resistant to opposing views” [P13]. The use of “sock puppets for running forum raids in the old days of hacktivism” [P4], unfortunately, was not a serious enough threat for social media to implement “strict policies of who and how can participate in the public discourses early on” [P1], as doing so ran counter to their business model of “monetizing every possible engagement on their platforms” [P14].

Mainstream social media companies were accused of directly enabling the “information disorder,” as their models of engagement pushed “less educational content the more an issue was important and demanded action” [P14].
This disorder played into the hands of the neoliberal elites, media outlets, and news organizations run by “billionaires detached from reality to gain further control over public spaces,” as P1 put it. In the view of our participants, misinformation “has always been there,” and they pointed to the combination of “self-proclamation of expertise online, cultivating followers, and playing on confirmation bias” as the recipe the hacktivists themselves showed works well in seeding misinformation:

“For example, look at the [redacted]. He said he was a founding member of Anonymous and lots of people believed him. He has spoken at conferences about it and even got jobs because of it. Literally dig slightly into that and it’s clear that no one in the Anonymous community can vouch for the guy and there’s no evidence of him being linked. So, people are just too lazy to check stuff out because this guy is kinda selling a story that fits with what they think so it must be true” [P3].

5.2 Mental Models of Misinformation

The predominant mental model of misinformation amongst the hacktivists in our sample was the political (counter)argumentation, where information is disseminated on social media for the sake of furthering a political argument or agenda [109].
In the original version of trolling and meme sharing, misinformation was seen as an alternative expression of disagreement, revolt, or ridicule without any context, but contemporary trolling and memes are brought into the political context as ready-made content for the expression of political attitudes [90]. Although fact-checking is widely available (and even suggested to users when content is moderated on social media [108]), the political appropriation of misinformation thrives because “people won’t fact check things and perpetuate them as long as these things align with their political ideology” [P2]. The reason most social media users “fall for misinformation” is “plain ignorance and stubbornness to hear anything contrary to their own political opinions” [P3], which results from “a serious lack of, at least in the U.S, critical thinking education in schools” [P2].

In the view of the majority of hacktivists in our study, “both sides of the political spectrum spread misinformation and it further enables political polarization” [P13]. While they acknowledge that “the misinformation on social media is often identified with right-wing opinions” [P6], hacktivists recognize that “we overuse the terms misinformation and disinformation to describe anything that is not a leftist opinion or fact” [P7]. They point to misinformation “stickiness,” where repeated exposure to speculative and false statements makes them appear truthful [66], becoming the main theme of every social media discourse.
For example, P3 refers to the Biden laptop saga [44], which in their view “has been politically disinfoed [sic] to death to the point that the laptop leaks are irrelevant and can’t be trusted as an evidence.”

Misinformation as political (counter)argumentation bothers the hacktivists as it conflicts with the all information should be free postulate, which in turn forces mainstream social media platforms to “restrict the flow of information” [P10]. Misinformation, in the view of P10, should not be restricted because “people are entitled to see both sides of a proverbial political coin so the platforms must allow them to do so, otherwise by only showing heads or tails people will speculate about what’s on the other side and assume the worst.” The restriction of information on platforms also conflicts with the mistrust of authority and promote decentralization hacker postulate because it “self-appoints the elites to define what constitutes ‘truth’” [P14]. It also forces “people to become rather tribalistic and a priori suspicious of people with different views” [P].
The “political tribalism” on social media [2], in turn, makes it “easier to demonize people with different opinions and political attitudes and avoid scrutinizing the like-minded ones” [P2], which plays directly into the hands of the “misinformers.”

As for the “misinformers,” our participants unequivocally identified the state-sponsored “appropriators” who hijacked the original hacktivist playbook to spread external propaganda on social media. That nation-states enjoyed a reputation for promulgating disinformation in the past was not news to the hacktivists (e.g., “Russia has always been really good at it” [P2]); what instead caught them aback was the “audacity and the sophistication” [P4] of utilizing trolling and memes on such a massive scale [134]. Reflecting on this shift in online operations, P3 believes that “disinfo ops [sic] and hacking our intellectual property is all these nation-states are left with because they can’t beat us militarily or economically.” Not necessarily neoliberal, but nonetheless authoritarian, the elites behind the external propaganda equally conflict with the mistrust of authority and promote decentralization hacker postulate because theirs is a “blatant effort to control the social media turf and the mass of population spending their time there” [P15].
The external propaganda nature of disinformation also conflicts with the all information should be free hacker postulate in the view of the hackers in our sample because it “overshadows and complicated [sic] an access to other more factual or useful information” [P2].

This paper is available on arxiv under CC BY 4.0 DEED license.