Lethal technology raises profound ethical questions
Attack of the Clones is a space opera, a fantasy ostensibly untroubled by science. Yet the much maligned second episode in the Star Wars series has become an accidental forecast: high-powered robot warfare is real. Artificial intelligence is at its most potent when borne by advanced killing machines.
With the global media myopically focused on the pandemic, many will have missed news of a more sinister threat. In a key development last summer, Pakistani terrorists used drones to attack the Indian Air Force base at Jammu, a site of strategic importance a few miles from the disputed border. It was thought to be the first example of a drone attack on an Indian military base. It is unlikely to be the last.
The robots deployed at Jammu were basic compared with a new breed of flying assassin. The Turkish-made Kargu-2 can allegedly track and kill specific targets using facial recognition and AI. These airborne killers operate independently of human control. A United Nations report reveals that drones of this model have mounted self-directed attacks on human targets, identifying, locating, and attacking retreating military units without at any point requiring data connectivity to human operators at base. Once deployed, the drones are free to go about their core business – death and destruction delivered by flying device.
The ethics behind these developments are shudderingly stark. At what point are human beings applying judgment? The fire-and-forget functionality of the technology means the red button might be pressed just once – at launch. While some armies might demand further checks and balances, the technology itself requires no such second-guessing. At the point of no return – the strike – the drones are capable of independent action. Giving the final order to destroy – a crucial responsibility of military leaders – is, potentially, unnecessary.
Readers of this column can rest assured that I’m no Luddite. Technology, robots included, does many things better than humans. The era of groceries – even emergency medical supplies – delivered same-day to your doorstep by miniature helicopter is drawing near. It is a natural progression: drones are mere extensions of the domestic technology that has been outstripping military innovation for decades. They are essentially smartphones with propellers.
Yet the danger lies in the philosophical paradox of imbuing drones with humanlike intelligence while simultaneously diminishing their need for human control. The risk is open-ended. Unlike manned armies, drones furnished with sufficient AI to be able to scope, command and execute a mission autonomously are straightforwardly replicable. Off-the-shelf robots will be available to anyone, from rogue militaries to terrorists like those in Pakistan, to teenage nerds with a grievance.
Aside from the risk of emergent tech falling into the wrong hands, there is another problem. It’s not even reliable. AI still struggles to identify simple objects, let alone people. An image-recognition system trained to identify an apple as a fruit was tricked into identifying it as an iPod, simply by taping to the apple a piece of paper with the word ‘iPod’ printed on it. In Hong Kong, protesters used facepaint to bewilder the government’s facial-recognition efforts. Something as mundane as rain, snow or fog confounds the sensing technology in the most advanced cars.
Human judgment matters. It is imperfect. But it is accountable. It stands a better chance of recognizing an apple on a foggy day than much of the latest technology. Moreover, humans exercise ethical choice over whether gunning down a retreating convoy is morally acceptable. To an AI, a target is a target.
Yet what chance a rethink? The outlook looks bleak. The US and China have both resisted calls for a ban on the development and production of fully autonomous weapons. Their reluctance is tacit encouragement for weapons-makers – and governments – to craft a future where tough choices are devolved to things that cannot sweat, cry, or bleed.