The Austrian government is calling for a system of international ethics on the use of killer robots and drones in combat. Vienna says it wants humans, and not algorithms, to decide on matters of life and death.
Vienna is embarking on a diplomatic initiative to draw up an ethical framework for the use of killer robots on the battlefields of the future.
Foreign Minister Alexander Schallenberg said similar standards should be adopted as those established for landmines and cluster weapons.
"We have to create rules before killer robots reach the battlefield of this Earth," Schallenberg told the German newspaper Welt am Sonntag.
He said the Austrian government was planning a conference in Vienna in 2021 "to initiate a process that will hopefully lead to an international convention on the use of artificial intelligence on battlefields."
Until now, Schallenberg said, the theme has not been sufficiently addressed at a diplomatic level. "With this conference, we want to create a movement between states, experts, and nongovernmental organizations like the Red Cross," he said.
Decisions on life and death
Outlining the need for a framework, Schallenberg said the types of decisions made in combat scenarios should not be left to artificial intelligence.
"The decision on life and death should ultimately be made by a person with his entire moral-ethical understanding and not an algorithm of zeros and ones," said the minister.
The use of lethal autonomous weapons has been under discussion by the United Nations since 2015, within the framework of the Convention on Certain Conventional Weapons.
Pioneering countries in the field of autonomous weapons systems — Russia, the United States and Israel — reject a binding ban under international law. These military heavyweights face a group of states that are demanding binding legislation as part of the NGO-led Campaign to Stop Killer Robots.
As part of the campaign, a total of 30 countries, along with the European Parliament, are calling for a full ban on the use of killer robots.