The use of fully autonomous weapons in a theater of war would violate international law, campaigners and experts say, renewing long-standing calls for a ban on "killer robots".
These AI-powered guns, planes, ships and tanks could fight future wars without being subject to any human control, as high-tech countries invest more in such weapons and edge toward full autonomy.
Twenty-six countries explicitly support a ban on fully autonomous weapons, with Austria, Belgium and China recently joining the call, alongside thousands of scientists and experts in artificial intelligence and more than 20 Nobel Peace Prize laureates.
In a new report, the Campaign to Stop Killer Robots, a global coalition of NGOs, said that fully autonomous weapons would violate the Martens Clause, an established provision of international humanitarian law.
The clause requires that emerging technologies be judged by the "principles of humanity" and the "dictates of public conscience" when they are not already covered by other treaty provisions.
"Permitting the development and use of killer robots would undermine established moral and legal standards," said Bonnie Docherty, senior arms researcher at Human Rights Watch, which coordinates the Campaign to Stop Killer Robots. "Countries should work together to prohibit these weapon systems before they proliferate around the world.
"The groundswell of opposition from scientists, faith leaders, technology companies, non-governmental groups and ordinary citizens shows that the public understands that killer robots cross a moral threshold. Their concerns, shared by many governments, demand an immediate response."
More than 70 governments will meet for the sixth time at the UN in Geneva on 27 August to discuss the challenges raised by fully autonomous weapons.
The talks were formalized under a major disarmament treaty in 2017, but they have yet to focus on a specific goal, and there has been widespread frustration among campaigners at the glacial pace of the process.
However, if states recommend that negotiations begin in 2019, this would pave the way for their formal approval in November; at the last meeting, in April, almost every country agreed that some form of human control over the use of force should be maintained.
"The idea of delegating life-and-death decisions to cold, compassionless machines without empathy or understanding cannot comply with the Martens Clause, and it makes my blood run cold," said Noel Sharkey, a robotics expert who has warned about the realities of robot warfare since 2007 and acts as a spokesman for the Campaign to Stop Killer Robots.
"We expect more European countries to step up, and the hope is that the key word 'negotiation' will make it into next year's mandate. There is already a growing consensus that human control of weapon systems in conflict is crucial.
"Some states would prefer to move from a ban protocol to a protocol requiring a positive commitment to ensure meaningful human control, but both come down to the same point of humanitarian law," he added.
Fully autonomous weapons do not yet exist, but senior military officials have said that the use of such devices – which would select and engage targets without meaningful human control – could be widespread in warfare within a matter of years.
At least 381 partially autonomous weapon and military robotics systems have been deployed or are under development in 12 states, including France, Israel, Russia, the UK and the US.
Russia has reportedly opposed a ban on fully autonomous weapon systems and has joined several other countries – including the US – in trying to block any future negotiations.
Automated systems, such as the mechanized sentries in the Korean demilitarized zone and Israel's Iron Dome, have already been deployed, but they cannot act fully autonomously.
Research by the International Data Corporation has suggested that global spending on robotics will double from $91.5 billion (£71.8 billion) in 2016 to $188 billion by 2020, bringing full autonomy closer to realization.