
Are Brain-Computer Interfaces at Risk of Mass Cyberattacks?

by The Sociable, August 11th, 2024


If one faulty software update can shut down huge swaths of society, imagine what could happen in a transhumanist future when BCIs require their own updates for the hive mind.


Disruptive chaos ensued on the morning of July 19, 2024, when the digital realm infected the physical world due to what was reported to be a coding error in a CrowdStrike update affecting Windows users.


Airports, banks, healthcare services, and major news networks were all disrupted due to “a defect found in a single content update for Windows hosts,” and “not a security incident or cyberattack,” according to a statement from cybersecurity firm CrowdStrike.


If just one software update can cause this much disruption without the need for a malicious cyberattack, imagine what could happen in a transhumanist future when brain-computer interfaces (BCIs) like Neuralink become prevalent and require software updates of their own.

Instead of waking up to seemingly telepathic abilities, you discover that your brain chip is giving you the blue screen of death, like the ones seen in airports all over the globe.


As the software continues to glitch, your neurons try to make sense of the faulty information pulsating through the neural mesh of your BCI, and the digital signals begin to resemble the physical chaos of passengers scrambling to self-organize in an attempt to find order in a sea of failing circuitry.


The only thing you can hope for, if indeed you have the brain capacity to hope at all, is that the company that made your brain chip, and the companies that develop its software, can put you back online before you dissociate completely.


Maybe you get lucky and the issue is identified and resolved relatively quickly, but what if it isn’t?


“A lack of cybersecurity has become a clear and immediate danger to our society worldwide”

Klaus Schwab, Cyber Polygon, 2021


Over time, hardware fails and software needs upgrades.


If centralization and consolidation within digital infrastructure continues, and if proper steps aren’t taken to ensure that BCIs can’t override our physical and cognitive abilities or eliminate our free will, then as a species we may truly fall into the hive mind and into the abyss.


We are now moving from the Internet of Bodies to the Internet of Brains.


A recent RAND Corporation report says that “An ‘internet of bodies’ may also ultimately lead to an ‘internet of brains’, i.e. human brains connected to the internet to facilitate direct brain-to-brain communication and enable access to online data networks.”

Additionally, “Advances in brain-computer interfaces may translate into developing brain-to-brain communication technologies, leading to entirely new modes of interpersonal communication.”


This is the anticipated digital hive mind.


Elon Musk says that one day we’re going to be able to upload and download memories between humans and machines.

What happens when a thought or a memory that isn’t your own enters your consciousness? Would you even know it?


“Human augmentation could undermine end users’ cognitive safety and influence their ability to assess the credibility of information they receive while using human augmentation technologies”

RAND Corporation, March 2024


On January 30, 2019 I had a dream that I visited a psychiatric hospital where the patients were being treated for maladies in their brain chips.


I know the date because I wrote and published a story the very next morning called “The Übermensch in the Cuckoo’s Nest: Malware in AI-human Hybrids.”


In that dream I saw catatonic patients drooling, sitting, staring into some unknown cyber abyss.


I saw men, women, and children flinging themselves against padded walls, flailing on rolling chairs, and writhing on checkered floors — not being able to make sense of what was passing through their brains — clutching at howling demons whizzing in mid-air, smelling things that weren’t really there, and seeing things that would terrify them into despair.


In the story I wrote the morning after the dream, I pondered, “What would happen if malware was introduced to the human body? […] Hospitals of the future would have to remedy this but how?”


Patients wouldn’t know the difference between what was “real” and what was an AI-generated hallucination.


When I awoke, I concluded that if and when BCIs became mainstream, countermeasures would probably have already been put into place.

If not, “Our hospitals may be treating patients with chronic implant disorders whose illnesses may look like scenes coming out of Dante’s Inferno or The Book of the Dead.”


My dream from almost six years ago may just be imagination and fantasy, but the questions of who owns the hardware, where the signals are coming from, and where they’re going are the subject of very real discussions taking place among top scientists, ethicists, and researchers.


“What you think, what you feel — it’s all just data — data that in large patterns can be decoded using artificial intelligence”

Dr. Nita Farahany, WEF Annual Meeting, 2023


Speaking at the 2023 World Economic Forum (WEF) Annual Meeting in Davos, Duke University’s Dr. Nita Farahany gave a presentation on the battle for your brain, in which she said:


“Artificial intelligence has enabled advances in decoding brain activity in ways we never before thought possible.


“What you think, what you feel — it’s all just data — data that in large patterns can be decoded using artificial intelligence.”

“We can pick up emotional states — like are you happy or sad or angry,” she added.


“We can pick up and decode faces that you’re seeing in your mind — simple shapes, numbers, your PIN number to your bank account.”


“Geopolitical instability makes a catastrophic cyber event likely in the next two years”

Jeremy Jurgens, WEF Annual Meeting, 2023


At the 2020 WEF Annual Meeting, there was a discussion called “When Humans Become Cyborgs,” in which the panelists attempted to tackle some of the very large ethical questions surrounding bodily integrity and who actually owns the brain-computer interface.


The panelists noted that military officers were very concerned over such issues as:


  • Do I own my own implant?
  • Does my implant become part of me?
  • What happens when I leave the military?
  • Who pays for my implant?
  • Does my implant get removed?
  • Do I get to keep my implant for life?
  • Does my implant get upgraded? Who pays for that?


Apart from addressing who owns the hardware, another panelist pondered where the intimate BCI data would go:


“If our brains are connected, and you record for instance, what you are possibly thinking, what areas of your brain are being stimulated […] what you are feeling, and so on — this data is going to be stored somewhere.”


“We all know, but still pay insufficient attention to, the frightening scenario of a comprehensive cyber attack, which would bring a complete halt to the power supply, transportation, hospital services, our society as a whole”

Klaus Schwab, Cyber Polygon, 2020


Beginning in 2021, we saw how booster shots became normalized.


What are boosters if not software upgrades for the body?


Coming back full circle to the CrowdStrike and Microsoft moment of July 19, 2024, what would happen if this type of faulty software update were to occur in the era of transhumanism?


How would a coding error, a bad bug, a rogue algorithm, or a piece of malware affect our cognitive abilities, our mental states, our physical bodies, our souls?


One day we may have superhuman abilities plugged into the hive matrix; the next, we may be reduced to drooling zombies hallucinating on someone else’s tech.


Of course, this is not likely to happen, and our technology will likely improve enough to prevent such mayhem from ever becoming feasible, or even to warrant the scenarios I have put forth.


But hey! If anything, it could at least make for a very interesting work of science fiction.


Tim Hinchliffe, Editor, The Sociable