Fear of "killer robots" in the war of the future



Will machines soon take over human decision-making in war? Critics paint a "Terminator"-like future in which "killer robots" open fire on their own – with the risk of fatal consequences. Next week, the UN will discuss autonomous weapons, including a common definition.

The image shows robots from the movie "Terminator". But autonomous weapons like these will not be used in the wars of the future, according to Daniel Nord of the Foreign Ministry, who represents Sweden at the UN meeting on the issue. Photo: Claudio Bresciani / TT

In the 1980s, a robot in the form of Arnold Schwarzenegger terrified film audiences as it tried to exterminate humanity on behalf of the computer system Skynet. What "Terminator" depicts is a horror scenario, but weapons resembling so-called killer robots already exist today: weapons whose computer systems decide on their own to carry out an attack.

"We portray autonomous weapons, killer robots, as something that will only exist in the future, but they will become real," Nic Dawes, deputy director of Human Rights Watch (HRW), tells TT.

He calls it one of the most important issues of our time.

HRW is one of the organizations demanding a global ban on autonomous weapons. One argument is that such weapons violate international law, because no human makes the decision to open fire.

"The question is also who will be held responsible if someone is killed by a killer robot. Is it the programmer?"

Several countries, including the United States and China, currently have such weapons under development, and once they are used in combat it will be too late, critics argue. They are backed by researchers and business leaders, including Elon Musk.

"If we do not stop now, we will probably see these weapons in use within the next decade," Dawes says, warning of fatal consequences if the machines make unpredictable and incorrect decisions.

The issue of autonomous weapons is now on the UN's table. On Monday, representatives from 70 countries will meet in Geneva to discuss how autonomous weapons should be defined, whether rules are needed and, if so, what they should look like. Twenty-six governments support a ban, according to HRW, but no decisions have yet been taken. Several countries, including the United States and Russia, oppose one.

Skeptics of a ban point to the difficulty of defining what an autonomous weapon actually is.

"HRW's definition also covers systems that are already in use, such as uncontroversial guided missiles," says Martin Hagström, research director at the Swedish Defence Research Agency (FOI).

He believes rules for autonomous weapons cannot be based on the technology used; instead, regulation should govern how countries test the systems they develop. The software in what could one day become fully autonomous weapons is comparable to that of self-driving cars, which is subject to strict review rules.

"Like all weapons, these systems, if they are developed, must comply with the international laws of war – that is Sweden's basic position," says Daniel Nord of the Foreign Ministry, who represents Sweden at the UN meeting.

So far, the countries have not reached agreement. According to Nord, however, they do agree on rejecting weapons systems that lack sufficient human control.

"Humans must remain in the decision-making loop. You cannot have a 'Terminator' where you press a button and it goes out and shoots people," he says.

Proponents often argue that machine-controlled weapons can be safer because the human factor is removed. But that is only partly true, Nic Dawes believes.

"Even if these weapons are more precise, they can be used in extremely dangerous ways and cause far, far greater damage," he says.

Facts

Autonomous weapons

Lethal autonomous weapons systems (LAWS) are weapons that can identify a target and attack it. The next step is fully autonomous weapons, in which humans are not involved in the decision-making process at all; these are not yet in use. An example would be a drone that can determine on its own that a man of a certain age, who appears to be carrying a weapon and moving like a soldier, is a combatant – and then decide to open fire. Calculations in the drone's computer "brain" lead to that decision.

So far, no international definition of autonomous weapons has been agreed upon.

Sweden's position is that all weapons systems developed must be compatible with international law and must be reviewed under the so-called "Article 36 process", which tests weapons against international humanitarian law.

The meeting in Geneva is the sixth time the UN has formally discussed autonomous weapons. The hope is to reach an agreement in 2019.

Sources: Swedish Defence Research Agency (FOI), UN
