Author & futurist writing about QC, AI & other interesting things
‘War is what happens when language fails.’
— Margaret Atwood
We live in unprecedented times — at least in Europe and the West: apart from the blip that was the war in the Balkans in the mid-1990s, we have been conflict-free since 1945. Okay, so the Cold War was a ‘war’ of threats, dilly-dallying and second-guessing the strength of the other side, not to mention the near-miss of all-out nuclear war during the Cuban Missile Crisis. But ignoring all that — and the continual wars in the Middle East, Africa and Asia — we should count ourselves very lucky.
‘We are all a great deal luckier than we realize, we usually get what we want — or near enough.’
– Roald Dahl
Violence always happens somewhere else, on someone else’s doorstep — or worse still, at least for Americans and the British, in interventionist wars in Iraq and Afghanistan.
In recent years, however, war has come more in the shape of terrorist attacks and bombings — and who can forget that sorrowful day in September 2001?
Yet throughout all this, throughout all the pain and torment we humans seem to be so good at creating, comes more of it.
Oozing out of the jingoistic belch of our own self-proclamations.
We are natural distillers of war, artisans of atrocity.
Did you know that the global arms trade as of 2016 was verging on the $90 billion mark?
‘Listen up — there’s no war that will end all wars.’
— Haruki Murakami
But listen to this: senior military officials and generals predict that autonomous weapons systems (AWS) will be used universally within the next few years, while global spending will exceed $180 billion by 2020 — more than double the spend in 2016.
Sadly, countries — faced with the threat from rogue states such as North Korea and Iran — are spending more year on year on their defense budgets to counter it. In the future, especially with the onset of technology that will be both cheaper and more efficient than what we have today, that spending is set to climb to unbelievable heights.
This is where AI technologies in the defense arena will have a monumental part to play in the coming years.
For the good or the bad.
It comes as no shock that politicians and military chiefs of staff around the world have been keeping their eyes on how robots have been developing and, at the current rate, how they will be able to assist their armies in the wars of the future.
Presently, AI technologies do a lot of good in conflict zones, where they are used in aerial surveillance devices and bomb disposal robots, saving countless soldiers’ lives and supplying intelligence agencies with crucial information on enemy positions and potential battlefield strategies.
‘Sometimes you have to pick the gun up to put the gun down.’
— Malcolm X
Somehow, though, autonomous weapons systems — or, more disturbingly, ‘lethal autonomous weapons’ (LAWs), as many like to call them — will be designed for more sinister purposes.
Reports by the US Department of Defense claim that AWS will save many lives once fully integrated into the military: once targets have been selected and earmarked, no human intervention is needed, reducing army personnel’s active engagement in war zones and lowering the mortality rate.
If only the generals of the British, French, German and Russian armies — as well as the other brave armies that fought during the First World War — had had such technology, then maybe the human cost would have been much lower.
At the moment China, the United States, Russia and Israel are in a race to develop and implement the use of AWS in the form of automated jets, ships and even robot tanks.
Opponents say that AWS, when fully operational, will go against the Geneva Conventions: leaving the decision over whether a human lives or dies to the AI inside a robot is, they believe, morally objectionable.
Academics in AI and other influential campaigners have brought their case to the UN to stop the development of AWS into the mainstream of military development. So far they have seen limited results: in 2018 the US, along with a handful of other countries, blocked talks on a wholesale ban on AWS. Noel Sharkey, a professor of artificial intelligence at the University of Sheffield in England and a leading figure in the Campaign to Stop Killer Robots, is one of those fighting against the scaling up of AWS. He had this to say:
“I am here to help resist the application of AI and robotics on strictly economic grounds without consideration of human responsibility and societal impact.”
An international treaty, if it comes to fruition in the coming years, would only apply in military conflict zones. When it comes to policing and other internal control systems within a given country’s borders, the use of AWS would still be allowed. This is very worrying indeed for those countries with a poor track record of policing and surveillance of their citizens, like China, Iran, North Korea and the former Soviet republics of Central Asia.
‘War may sometimes be a necessary evil. But no matter how necessary, it is always an evil, never a good. We will not learn how to live together in peace by killing each other’s children.’
— Jimmy Carter
What of the people in those countries if their governments still have access to these technologies, both for monitoring and for killing?
It was only three decades ago that Saddam Hussein massacred thousands of Kurds in Iraq. What could Kim Jong-un of North Korea or Sooronbay Jeenbekov of Kyrgyzstan do to their own people with autonomous weapons systems, given half a chance?
Or the repressive government of Nicolás Maduro in Venezuela?
It doesn’t bear thinking about.
Others, neither for nor against, propose a middle way: so-called ‘less than lethal weapons’. These, unsurprisingly, do what they say on the tin. Companies are already producing this hybrid AI weapon technology in the form of drones armed with paintballs and taser guns.
This is all well and good, yet what happens if any of this technology — be it LAWs, AWS or the innocuous-sounding ‘less than lethal weapons’ — gets into the wrong hands? What then?
Do we just fight fire with fire and use these autonomous weapons systems against themselves?
Machine on machine. Robot versus robot. AI contra AI.
Battleground chess. Wargames on a grander scale than anything witnessed in history.
Terrorist groups like Al-Qaeda, ISIS, the IRA, ETA and even FARC in Colombia have cast a dark shadow, causing mayhem and heartache to many thousands of people over the decades. To be sure, if these groups gained access to AWS — by stealing them or through the black-market trade in the technology — most cynics would bet their bottom dollar they wouldn’t think twice about using them for ill will to boost their deranged causes.
Whatever angles and opinions all this stirs up, one thing is for certain — the future will be dominated by AWS. And how we use that technology is entirely up to us.
War, huh, yeah!
What is it good for?