Asian military forces are rolling out killer robots for a range of battlefield roles as diplomats wrestle with legal semantics in efforts to control artificial intelligence (AI)-powered weapons that many fear could trigger the next arms race.
Experts at a session of the UN Convention on Certain Conventional Weapons (CCW) in Geneva this week, attended by delegates from more than 80 nations, could not even agree on how to define the weapons. They have been struggling to answer the same question since 2013.
Scientists say the issue can’t wait. In the past two years about 23,000 researchers and technologists have signed an open letter calling for a moratorium on the development and use of lethal autonomous weapons systems (LAWS), including physicist Stephen Hawking, Elon Musk of Tesla and Apple’s Steve Wozniak.
“If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow,” the letter says, in a reference to the Russian-made assault rifle that is found in most global hotspots.
Pakistan is the only Asian country to make a clear commitment on the issue: in 2013, it became the first signatory of a petition by the Campaign to Stop Killer Robots that has since been backed by 21 other nations.
A Stockholm International Peace Research Institute (SIPRI) database lists 381 LAWS, of which 195 are unarmed, 175 are weaponized and the rest are of uncertain status.
Targeting is a function of 130 of the armed systems, while mobility is a function of 184 unarmed and 83 armed systems. Other functions include intelligence, interoperability and health management.
Technologies range from automated sentry and anti-missile systems to machine-learning algorithms (used to track cell phone messages) and a range of robotic weapons, including tanks, submarines and surface ships.
A sticking point is that many AI systems also have civilian uses, and there are concerns that a moratorium could affect valuable industry research.
South Korea, one of three Asian countries with the technological ability to develop LAWS, along with Japan and China, stated in a 2015 UN position paper that LAWS discussions “should not be carried out in a way that can hamper research and development of robot technology for civilian use.”
Another concern is that there is no clear distinction between autonomous weapons systems and those that are simply automated and used primarily for defense; automated systems have been used globally since the 1970s.
The SIPRI defines autonomy as “the ability of a machine to perform an intended task without human intervention”, but most existing weapons still rely on humans.
It will be difficult to regulate LAWS until a definition has been accepted, as there is no specific reference to these weapons in the Geneva Conventions or the CCW framework.
So far only the US and the United Kingdom have endorsed the use of LAWS, arguing that existing human rights law is sufficient; other countries want safeguards on the way LAWS are deployed rather than an outright ban.
Japan said in its 2015 UN position paper that it “has no plan to develop robots with humans out of the loop, which may be capable of committing murder.” China, believed to have spent at least US$215 billion on all weapons in 2015, issued a position paper last December that appeared to support some form of global limits on the deployment of LAWS.
“Such systems cannot effectively distinguish between soldiers and civilians and can easily cause indiscriminate killing or wounding of the innocent. Consequently, pending an appropriate solution, we call on states to exercise caution in their use and especially to prevent their indiscriminate use against civilians,” said the statement by China’s UN delegation.
By October this year China had softened its position, calling only for the global community to “abide by the UN Charter and armed conflict law while using LAWS, respect the sovereignty and territorial integrity of other countries, and attach importance to the humanitarian consequences and other issues possibly caused by these weapons.”
China’s shift came after the US had cited these weapons as a linchpin of its revised Defense Innovation Initiative. Known as the Third Offset Strategy, the initiative embraces LAWS as a way of maintaining the US military’s technological edge and trimming spiraling defense overheads.
“In this way it would continue to outmatch Russia, China, Iran or North Korea, even if those countries were to catch up with the USA in the development of high-end weapon technologies…,” the SIPRI said, adding that Russia and China had both started to “formulate possible reactions.”
Russia is expected to step up research into LAWS, but this will remain a secondary objective behind its efforts to offset America’s dominance in conventional warfare through wider deployment of nuclear weapons.
So far, China has focused on matching key US innovations. The SIPRI said it still lags the US in “many of the more fundamental technology areas”, including unmanned aerial vehicles, data links and sensors.
Border tensions, especially on the Korean Peninsula and in the Himalayas between India and Pakistan, will encourage the spread of LAWS in Asia. India has technology gaps, but is believed to be considering using sensors and automated sentry systems in difficult terrains like the Siachen Glacier.
South Korea is a frontrunner in the development and export of active protection systems – weapons that shield armored vehicles – and robotic sentries. Sentry systems are barred from the tense Demilitarized Zone shared with the North, but reportedly have been deployed elsewhere.
Most LAWS used in Asia are defensive, especially automated air and maritime defense systems deployed by countries like South Korea, Taiwan, Singapore, Pakistan and Thailand. But there is a shift toward more “offensive” weapons that could intensify the moratorium debate.
India, China and South Korea have all purchased the Harpy Air Defense Suppression System, an Israeli-made drone with a “loitering” capability of nine hours. Loitering weapons are the only offensive weapons known to be capable of acquiring and engaging targets completely autonomously.
Loitering LAWS currently require human intervention to set the loitering time, the location of deployment and the category of targets to be attacked. But this may not be the case with the next generation of the technology.