Interface and Data Biopolitics in the Age of Hyperconnectivity: Conclusions and References


Author:

(1) Salvatore Iaconesi, ISIA Design Florence, Corresponding author ([email protected]).

Abstract and 1. A Hymn

2. Asymmetry

3. Bubbles, Guinea Pigs

4. Interface and Data Biopolitics

5. Conclusions: Implications for Design and References

5. Conclusions: Implications for Design

The scenarios described in this article pose great challenges for Designers and, most important of all, for Design Education.


At a first level of inspection, it is easy to see that all of these situations and configurations of power schemes, practices and behaviors sit at the border of what is covered by laws, regulations, habits and customs. They are at the same time familiar and new, unexpected, unforeseen, unsought. Confronting these issues requires trans-disciplinary approaches, because no single discipline alone can cover all of the knowledge, attitudes and perspectives needed to grasp and understand them.


The possibilities and opportunities to meaningfully deal with the issues presented in the article emerge only at the intersections between law, psychology, culture, philosophy, sociology, ethics and other sciences, humanities and practices.


This fact represents an important opportunity for design, which can act as a convenient, practical and methodologically sound interconnector among disciplines and approaches.


For this, it is of the utmost importance that Design curricula natively host such trans-disciplinary approaches, not only combining disciplines, as is common practice in multi-disciplinarity, but traversing them, generating not only contaminations but also methodological boundary shifts.


The same state of necessity applies to the topics of openness, transparency and access. As seen in the previous sections of the article, power asymmetries most often manifest themselves through a lack of openness, transparency and access.


Interoperability, data openness and accessibility, open licensing schemes, open formats, open access to APIs: these are all practices that make it possible to confront these problems.
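As a minimal, concrete illustration of what open formats and open licensing can look like in practice, the following sketch (an illustrative example added here, not taken from the original paper) exports a small dataset as CSV together with a JSON descriptor, loosely modelled on common data-package descriptors, that declares its license, provenance and structure; the dataset, field names and license choice are assumptions made only for the example.

```python
# Illustrative sketch only: publish a small dataset in open formats (CSV + JSON)
# with an explicit, machine-readable license declaration. The dataset, field
# names and license choice are assumptions made for the example.
import csv
import json

records = [
    {"sensor_id": "s-001", "timestamp": "2016-12-26T10:00:00Z", "value": 21.4},
    {"sensor_id": "s-002", "timestamp": "2016-12-26T10:00:00Z", "value": 19.8},
]

# 1. The data itself, in an open, widely readable format (CSV).
with open("readings.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["sensor_id", "timestamp", "value"])
    writer.writeheader()
    writer.writerows(records)

# 2. A descriptor that makes license, provenance and structure explicit.
descriptor = {
    "name": "example-sensor-readings",
    "license": "CC-BY-4.0",  # open licensing scheme, stated up front
    "sources": [{"title": "Illustrative example"}],
    "resources": [{
        "path": "readings.csv",
        "format": "csv",
        "fields": ["sensor_id", "timestamp", "value"],
    }],
}
with open("datapackage.json", "w") as f:
    json.dump(descriptor, f, indent=2)
```

The point of such a descriptor is that reuse no longer depends on privileged access or private agreements: anyone can read the data, its structure and the terms under which it may be used.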


These topics should be a standard part of any form of design education, highlighting not only that they enable the ethical approaches necessary to resolve the issues described in this article, but also that they represent potential competitive advantages for any organization, as well as opportunities to create meaningful action.


The necessity of openness, transparency and access paves the way to another axis for innovation in Design Education: the evolution of the ways in which Design confronts the Public, Private and Intimate Spheres.


As seen in the previous sections, it is now practically impossible to determine the boundaries of these spheres. Content harvesting, sensors, analytics and algorithms know no boundaries: data and information that appears to be private or even intimate is captured, intercepted, inferred and diverted, producing results for marketing, advertising, or even for surveillance and control.


In designing these ecosystems, every possible effort must be made to clearly and transparently define the boundaries of public, private and intimate spaces, as well as the rights and freedoms granted within each of them. This is a complex process, which involves the aforementioned trans-disciplinary approaches as well as considerations regarding current business models, legislation, human rights, and (often national and international) security.


There is no simple way to confront this type of problem. Rather, it must be dealt with through complex approaches, combining not only different disciplines and practices, but also society as a whole. Here again lies a potential role for design, which can rediscover its humanistic and social elements and act as an interconnector between multiple types of agencies. This is also an evolutionary opportunity for design education practices, in which this modality can be implemented directly in the learning process, by opening it up to the city, the territory and its inhabitants.
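As a minimal sketch of what clearly and transparently defining these boundaries could mean inside a designed system, the example below (an illustrative assumption added here, not a proposal from the paper) declares the public, private and intimate spheres in machine-readable form: the data each sphere contains, the uses permitted within it, and the rights granted to the person it concerns.

```python
# Illustrative sketch only: one possible way to make the boundaries of the
# public, private and intimate spheres, and the rights granted within each,
# explicit and machine-readable. Sphere names, data categories and permitted
# uses are assumptions for the sake of the example, not a standard.
SPHERE_POLICY = {
    "public": {
        "data": ["published_posts", "public_profile"],
        "permitted_uses": ["display", "search", "aggregate_statistics"],
        "user_rights": ["access", "rectification", "deletion"],
    },
    "private": {
        "data": ["direct_messages", "contact_lists", "location_history"],
        "permitted_uses": ["service_delivery"],  # no marketing, no resale
        "user_rights": ["access", "rectification", "deletion", "export"],
    },
    "intimate": {
        "data": ["health_signals", "inferred_emotional_state"],
        "permitted_uses": [],  # collection requires explicit, revocable consent
        "user_rights": ["access", "deletion", "refusal_of_inference"],
    },
}


def is_use_permitted(sphere: str, use: str) -> bool:
    """Return True only if the declared policy explicitly allows this use."""
    return use in SPHERE_POLICY.get(sphere, {}).get("permitted_uses", [])


# Example: marketing on private data is rejected by the declared policy.
assert not is_use_permitted("private", "marketing")
```

However schematic, a declaration of this kind makes the boundaries inspectable and contestable, rather than leaving them implicit in the behavior of harvesting and analytics infrastructures.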


This brings us to the next relevant pattern: participation, inclusion and social engagement. Opening up data and processes and using open licenses and formats are necessary, but not sufficient. If these actions are not matched by cultural, social and philosophical ones, they remain ineffective in society. Open Data, as of now, remains a tool for the few: for those researchers, engineers and designers who mediate it for others. For these types of action to become relevant for society, design processes must include patterns of active participation, inclusion and social engagement. This notion must be built into design processes and education, and every possible effort must be made to inject these ideas into the strategies of the businesses, organizations and, in general, clients who commission the designs.


All of this leads to the concluding argument of this article: the necessity for design to embrace every possible strategy and action that promotes human dignity, freedom and joy, avoiding the atomization and loneliness which have become typical of the years we live in.


The risk society (Beck, 1992) has brought on


“[...] a mad, Kafkaesque infrastructure of assessments, monitoring, measuring, surveillance and audits, centrally directed and rigidly planned, whose purpose is to reward the winners and punish the losers. It destroys autonomy, enterprise, innovation and loyalty, and breeds frustration, envy and fear. Through a magnificent paradox, it has led to the revival of a grand old Soviet tradition known in Russian as tufta. It means falsification of statistics to meet the diktats of unaccountable power.” (Monbiot, 2014)


All this is fundamental to current models that insist on comparison, evaluation and quantification.


Design practice and education can, instead, have a positive role in this, acting as a complex, inclusive and critical interconnector, promoting human dignity, joy and freedom.

References

Behar, R. (2004) Never Heard Of Acxiom? Chances Are It's Heard Of You. Retrieved December 26, 2016, from http://archive.fortune.com/magazines/fortune/fortune_archive/2004/02/23/362182/index.htm


Beck, U. (1992) Risk Society, Towards a New Modernity. London: Sage Publications.


Blank, R. (2001). Biology and Political Science. Psychology Press.


Blanke, T., Greenway, G., Cote, M., & Pybus, J. (2015). Mining mobile youth cultures. In Proceedings - 2014 IEEE International Conference on Big Data, IEEE Big Data 2014.


Booth, R. (2014) Facebook reveals news feed experiment to control emotions. The Guardian. Retrieved December 26, 2016 from https://www.theguardian.com/technology/2014/jun/29/facebook-users-emotions-news-feeds


boyd, d., Crawford, K. (2012) Critical questions for big data: Provocations for a cultural, technological, and scholarly phenomenon. Information, Communication & Society 15(5).


Bozdag, E. (2013) Bias in algorithmic filtering and personalization. Ethics and Information Technology 15(3), pp 209-227.


Bozdag, E., van den Hoven, J. (2015) Breaking the filter bubble: democracy and design. Springer. Retrieved December 26, 2016 from http://link.springer.com/article/10.1007/s10676-015-9380-y


Bratton, B. H, Jeremijenko, N. (2008) Is Information Visualization Bad for You? Situated Technologies Pamphlets 3: Suspicious Images, Latent Interfaces. New York: The Architectural League of New York.


Davenport, T. H., Beck, J. C. (2013) The Attention Economy: Understanding the New Currency of Business. Cambridge: Harvard Business Press.


Ford, E. (2012) What's in Your Filter Bubble? Or, How Has the Internet Censored You Today? Retrieved December 26, 2016 from http://pdxscholar.library.pdx.edu/cgi/viewcontent.cgi?article=1078&context=ulib_fac


Foucault, M. (1997) Society Must Be Defended: Lectures at the Collège de France, 1975-1976. New York: St. Martin's Press.


Hardt, M., Negri, A. (2005) Multitude: War and Democracy in the Age of Empire. Hamish Hamilton.


Hill, K. (2014) 10 Other Facebook Experiments On Users, Rated On A Highly-Scientific WTF Scale. Forbes Magazine. Retrieved December 26, 2016 from http://www.forbes.com/sites/kashmirhill/2014/07/10/facebook-experiments-on-users


Hughes, J. (2004). Citizen Cyborg: Why Democratic Societies Must Respond to the Redesigned Human of the Future. Westview Press.


Kramer, A. D. I., Guillory, J. E, Hancock, J. T. (2014) Experimental evidence of massive-scale emotional contagion through social networks. PNAS 111(24).


Lafrance, A. (2014) Why Can't Americans Find Out What Big Data Knows About Them? The Atlantic. Retrieved December 26, 2016 from http://www.theatlantic.com/technology/archive/2014/05/why-americans-cant-find-out-what-big-data-knows-about-them/371758/


Lapavitsas, C. (2013) The Financialization of Life. Retrieved December 26, 2016 from https://www.youtube.com/watch?v=QsXmi58N3CA


Lemke, T. (2011) Biopolitics: An Advanced Introduction. New York: NYU Press.


Lessig, L. (2006) Code. New York: Basic Books.


Madrigal, A. C. (2012) Bruce Sterling on Why It Stopped Making Sense to Talk About 'The Internet' in 2012. The Atlantic. Retrieved December 26, 2016 from http://www.theatlantic.com/technology/archive/2012/12/bruce-sterling-on-why-it-stopped-making-sense-to-talk-about-the-internet-in-2012/266674/


Menkveld, A. J., Yueshen, B. Z. (2013) Anatomy of the Flash Crash. Retrieved December 26, 2016 from http://vu.nl/nl/Images/SSRN-id2243520_tcm289-336546.pdf


Monbiot, G. (2014) Sick of this market-driven world? You should be. The Guardian. Retrieved December 26, 2016 from https://www.theguardian.com/commentisfree/2014/aug/05/neoliberalism-mental-health-rich-poverty-economy


O'Reilly, C. A. III (1980) Individuals and Information Overload in Organizations: Is More Necessarily Better? Academy of Management Journal 23(4).


Pariser, E. (2011) Beware online "Filter Bubbles". Retrieved December 26, 2016 from https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles


Pasquinelli, M. (2008) Animal Spirits, A Bestiary of the Commons. Rotterdam: Nai Publishers.


Rushe, D. (2014) Facebook sorry – almost – for secret psychological experiment on users. The Guardian. Retrieved December 26, 2016 from https://www.theguardian.com/technology/2014/oct/02/facebook-sorry-secret-psychological-experiment-users


Singer, N. (2012) Mapping, and Sharing, the Consumer Genome. New York Times. Retrieved December 26, 2016, from http://www.nytimes.com/2012/06/17/technology/acxiom-the-quiet-giant-of-consumer-database-marketing.html


Teng, V. (2013) Hymn of Acxiom. Retrieved December 26, 2016, from https://www.youtube.com/watch?v=0mvrKfcOnDg


The Private NSA: See What Acxiom Knows About You (2013). Retrieved December 26, 2016, from http://www.tomsguide.com/us/how-to-axicom,review-1894.html


Tufekci, Z. (2015) Algorithmic harms beyond Facebook and Google: Emergent Challenges of Computational Agency. Colorado Technology Law Journal. Retrieved December 26, 2016 from http://ctlj.colorado.edu/wp-content/uploads/2015/08/Tufekci-final.pdf


Väliaho, P. (2014) Biopolitical Screens: Image, Power, and the Neoliberal Brain. Cambridge: MIT Press.


Verhaeghe, P. (2014) What About Me?: the struggle for identity in a market-based society. London: Scribe Publications.


Weber, R. H. (2010) Internet of Things – New security and privacy challenges. Computer Law & Security Review 26(1).


Wellman, B. (2001) Physical Place and Cyberplace: The Rise of Personalized Networking. International Journal of Urban and Regional Research 25(2).


White, G. (2016) Big Data and Ethics: Examining the Grey Areas of Big Data Analytics. Issues in Information Systems 17(4).


Zittrain, J. (2014) Facebook Could Decide an Election Without Anyone Ever Finding Out. New Republic. Retrieved December 26, 2016 from https://newrepublic.com/article/117878/information-fiduciary-solution-facebook-digital-gerrymandering


This paper is available on arXiv under a CC BY 4.0 license.