
Interface and Data Biopolitics in the Age of Hyperconnectivity: Interface and Data Biopolitics

by HyperConnectivity August 29th, 2024

Too Long; Didn't Read

The scenario described in the previous sections has important impacts on the “knowability”, “readability”, accessibility and usability of the world. These implications, together with the systematicity and opaqueness of the scenario, call for the emergence of new areas of scientific, technological and humanistic investigation.

Author:

(1) Salvatore Iaconesi, ISIA Design Florence; corresponding author ([email protected]).

Abstract and 1. A Hymn

2. Asymmetry

3. Bubbles, Guinea Pigs

4. Interface and Data Biopolitics

5. Conclusions: Implications for Design and References

4. Interface and Data Biopolitics

The scenario described in the previous sections has important impacts on the “knowability”, “readability”, accessibility and usability of the world, both in how people use it and interact with it, and in how they are able to design it.


These implications, together with the systematicity and opaqueness of the scenario, call for the emergence of new areas of scientific, technological and humanistic investigation, which can be defined as Interface and Data Biopolitics.


There are multiple definitions for the term “biopolitics”: Kjellén's organicist view and his description of the “civil war between social groups” (Lemke, 2011); the political application of bioethics (Hughes, 2004); the interplay between biology and political science (Blank, 2001); Hardt and Negri’s (2005) anti-capitalist insurrection through daily life and the body; Foucault’s (1997) “biopower”, through governments and organizations applying political power to all aspects of human life; and many more.


We refer here mainly to Foucault’s definition, which describes biopolitics as “a new technology of power...[that] exists at a different level, on a different scale, and [that] has a different bearing area, and makes use of very different instruments” (Foucault, 1997, p. 242).


In his analysis Foucault referred mainly to nation states and institutions, so his observations need to be adapted before they can be applied to today’s globalized, financial, digital economies and political apparatuses of power. For example, the rise of large corporations that match the power, influence, and reach of nation states, the changing role of money, its virtualization, and the “financialization of life” (Lapavitsas, 2013) all need to be integrated into such frameworks.


Fundamentally, biopolitics can be defined as the study of how systems leverage as many manifestations as possible of our daily lives, activities, relations and bodies to exercise power and control over their users and participants, in both explicit and implicit ways.


As demonstrated in the previous sections, today’s scenarios of Hyperconnectivity bring about multiple forms of biopolitically relevant contexts. Online and application interfaces, biometrics, wearable computing, IoT, social media and, in general, all human activities with a direct or indirect digital information counterpart generate data that large operators harvest and process in order to influence our actions, behaviors, beliefs and perceptions and, thus, to exercise power.


The shift to the digital sphere also provokes a shift from “biopower” to “neuropower” (Väliaho, 2014), as the medium for control shifts from body to mind.


For example, the elements forming an interface exercise a certain degree of power over their users. If only options A and B are available on an interface, the user will not be able to adopt option C. In many cases the user will not even be able to perceive that option C is possible. Hence the interface, its designer, and the ideology and strategy that come with both have a degree of authoritarian agency over the user.


While registering for online services, users are often asked to select their gender to characterize their online profile. If, for example, only the “male” and “female” options are available, all other options are excluded, which can prove problematic, upsetting and troubling for those who feel neither “male” nor “female”. The business requirements of the operator, who needs to tag users with predefined categories that can conveniently be commercialized to marketing and advertising partners, take precedence.
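
As a minimal sketch of how such a constraint gets encoded, consider a hypothetical registration schema (the type, field names and function below are invented for illustration, not any real service’s API). The excluded option is not merely discouraged; it is literally unrepresentable:

```typescript
// Hypothetical registration schema: the option space is fixed at design time.
// Any identity outside the enum cannot be expressed or stored at all.
type Gender = "male" | "female"; // option C does not exist in the type system

interface RegistrationForm {
  username: string;
  gender: Gender; // required field: the user must pick one of the two
}

function register(form: RegistrationForm): void {
  // Downstream marketing systems receive only the predefined categories,
  // regardless of how the user actually identifies.
  console.log(`Tagging user ${form.username} with segment: ${form.gender}`);
}

// register({ username: "sam", gender: "nonbinary" });
// ^ rejected by the compiler: "nonbinary" is not assignable to type Gender
```

The option space is closed at design time; whatever the user actually is, only one of the operator’s predefined categories ever reaches its marketing partners.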


In another scenario, a wearable biometric device could record data for health purposes. For example, a reading between 1.5 and 1.8 for a certain bodily value could indicate a “healthy” condition. Users with a readout of 1.6 would be considered “healthy”, perhaps corroborated by a reassuring green light on the device or in the associated application. If, for any plausible reason, the “healthy” threshold were changed to a 1.7–2.0 range, the same users would be described as “not healthy”. The light would turn red, perhaps accompanied by a message: “visit your doctor!” The users’ bodies would be the same. They would not feel an additional headache or pain in some other part of their body. By the simple variation of a parameter their status would change, accompanied by a series of authoritative notifications.
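
The dynamic can be sketched in a few lines, reusing the 1.5–1.8 and 1.7–2.0 ranges from the example above; the function and value names are invented for illustration:

```typescript
// Illustrative only: the ranges come from the example in the text;
// the names are hypothetical.
interface HealthRange {
  min: number;
  max: number;
}

function classify(reading: number, range: HealthRange): "healthy" | "not healthy" {
  return reading >= range.min && reading <= range.max ? "healthy" : "not healthy";
}

const reading = 1.6; // the user's body does not change

console.log(classify(reading, { min: 1.5, max: 1.8 })); // "healthy"     -> green light
console.log(classify(reading, { min: 1.7, max: 2.0 })); // "not healthy" -> red light, "visit your doctor!"
```

The only thing that changed between the two lines is a parameter the user never sees and cannot contest.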


This is a very powerful condition. Even simpler, less radical and more common examples show how the operators that own these platforms, systems, devices and services have a direct ability to exercise power through their asymmetric capacity to capture, process and visualize data, and through designing interfaces in certain ways.


With the Internet of Things, this scenario manifests ubiquitously, affecting appliances, our homes through domotics, our schools, offices, stores and, potentially, the public, private and intimate spaces and contexts of our lives. As Pasquinelli (2008) puts it: “it is impossible to destroy the machine, as we ourselves have become the machine.”


On top of that, the power asymmetry also manifests itself in another way. While it is users who generate data and information by using interfaces, services and products, this data is not available to them, nor do they have the possibility to perceive the full spectrum of its implications (Blanke et al., 2015).


As of today, most online services offer users the opportunity to download their own data (for example through “Google Takeout”). But these options are misleading, because they let users download their “content”, not the data, information and knowledge that was generated by processing it. For example, there is no way for users to know which marketing categories they have been classified into, or which of their actions led to their being classified in such ways.
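
The asymmetry can be pictured as two record types: the export a user can download versus the derived profile the operator keeps. This is a hypothetical sketch; the field names do not correspond to any real service’s data model:

```typescript
// What a takeout-style export typically contains: the user's own content.
interface UserExport {
  posts: string[];
  photos: string[];
  contacts: string[];
}

// What the operator derives from that content and does NOT export
// (field names are hypothetical, for illustration).
interface DerivedProfile {
  marketingSegments: string[];   // categories commercialized to advertisers
  inferredInterests: string[];   // produced by processing the content
  classificationTrail: string[]; // which actions triggered which labels
}

// The user-facing endpoint returns only the first type; the second stays opaque.
function takeout(userId: string): UserExport {
  console.log(`Packaging content export for ${userId}`);
  return { posts: [], photos: [], contacts: [] };
}
```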


For example, let’s pretend that Facebook introduced the category of “potential terrorists” because its machine learning processes discovered a pattern in the frequency with which radical extremists use the letter ‘f’ in their messages. If certain users, by complete chance, wrote messages with the same ‘f’ frequency, they would be classified as “potential terrorists”. They would know nothing about it, yet it could have implications for their freedoms and rights. This is, of course, a paradoxical example, intended only to make the dynamics of the phenomenon clear.
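
The paradoxical classifier can be written out in a few lines to make those dynamics concrete; the threshold and labels below are as arbitrary as in the example itself:

```typescript
// Deliberately absurd classifier, mirroring the paradoxical example above:
// it flags users purely on how often the letter 'f' appears in their messages.
function fFrequency(message: string): number {
  const fs = (message.match(/f/gi) ?? []).length;
  return message.length > 0 ? fs / message.length : 0;
}

const SUSPICIOUS_THRESHOLD = 0.05; // arbitrary, as in the text

function classifyUser(messages: string[]): "potential terrorist" | "ordinary user" {
  const avg = messages.reduce((sum, m) => sum + fFrequency(m), 0) / messages.length;
  // A user who by pure chance writes f-heavy messages crosses the line,
  // is labeled, and is never told why -- or that it happened at all.
  return avg > SUSPICIOUS_THRESHOLD ? "potential terrorist" : "ordinary user";
}

console.log(classifyUser(["fluffy waffles for breakfast"])); // flagged by chance
```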


Moreover, all this data capturing and processing is designed, as stated in the previous section, to deal with relevance and attention, thus producing information, knowledge and relations bubbles. While these processes are useful in a scenario of information overload, they also progressively lock difference out of users’ perception: the more we are exposed to content we “potentially like” and to “people we potentially agree with”, the more “otherness” disappears from our reach. This brings a series of negative effects, such as diminished sensibility to and acceptance of diversity (Bozdag, van den Hoven, 2015), rising levels of cognitive bias (Bozdag, 2013), diminished tolerance, social separation (Ford, 2012), and more.
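
The self-reinforcing character of such filtering can be shown with a toy model that reduces each item to a single “viewpoint” score; this is a sketch under that assumption, not any platform’s actual algorithm:

```typescript
// Toy model of a relevance filter's feedback loop.
interface Item {
  id: string;
  viewpoint: number; // position on some opinion axis, in [-1, 1]
}

// Rank by similarity to the user's current profile and keep the top k.
function recommend(profile: number, pool: Item[], k: number): Item[] {
  return [...pool]
    .sort((a, b) => Math.abs(a.viewpoint - profile) - Math.abs(b.viewpoint - profile))
    .slice(0, k);
}

// Each round, the profile drifts toward what was shown, and the next round
// filters against the updated profile: "otherness" shrinks monotonically.
function simulate(profile: number, pool: Item[], rounds: number): number {
  for (let i = 0; i < rounds; i++) {
    const shown = recommend(profile, pool, 3);
    const mean = shown.reduce((s, it) => s + it.viewpoint, 0) / shown.length;
    profile = 0.8 * profile + 0.2 * mean; // engagement pulls the profile toward shown content
  }
  return profile;
}
```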


This paper is available on arxiv under CC BY 4.0 DEED license.