Losing Humanity: The case against killer robots

by Mike Hamer
20 November 2012




Released yesterday by Human Rights Watch (HRW), the report Losing Humanity: The Case against Killer Robots (herein: the report), along with an associated press release and video (below), warns of increasing governmental interest in autonomous, “killer” robots, and of the dangers such technology would pose if ever allowed onto the battlefield.

As mentioned on Robohub before (Rise of the robots: The robotization of today’s military), today’s military is becoming increasingly robotic. In its current iteration, all military strikes carried out by robots are authorized by a human operator, meaning that military decisions – decisions to end human lives – are supported by and traceable through a human chain of accountability.

However, as documented by HRW, governments are showing increasing interest in automating this decision process, thus removing the accountable human from the chain. This raises the question: who is liable if an autonomous robot kills someone? HRW argues that autonomous robots “would inherently lack human qualities that provide legal and non-legal checks on the killing of civilians”. We’ve discussed some of these ethical issues with Noel Sharkey (also featured in the above video) and Ronald Arkin in our two-part series on robot ethics (part 1, part 2).

Precursor technologies (discussed in detail on page 4 of the report) are already being developed and, in some cases, deployed on the battlefield. These systems can identify and target a threat, but wait for human permission before engaging. Examples include South Korea’s sentry turrets (with a similar installation being carried out by Israel), the US military’s missile defense systems, and various autonomous aircraft currently in development.

The US Government has confirmed its interest in autonomous military robots through the release of military development roadmaps. The Unmanned Systems Integrated Roadmap FY2011-2036, released by the US Department of Defense, states that the department “envisions unmanned systems seamlessly operating with manned systems while gradually reducing the degree of human control and decision making required for the unmanned portion of the force structure”.

Similar visions are expressed by the US Army, who state a “current goal of supervised autonomy, but with an ultimate goal of full autonomy”; the US Navy, who state that “while admittedly futuristic in vision, one can conceive of scenarios where UUVs sense, track, identify, target, and destroy an enemy – all autonomously”; and the US Air Force, who foresee that:

Increasingly humans will no longer be “in the loop” but rather “on the loop” – monitoring the execution of certain decisions. Simultaneously, advances in AI will enable systems to make combat decisions and act within legal and policy constraints without necessarily requiring human input.

These excerpts, all taken directly from unclassified military development roadmaps, show a clear governmental interest in reducing dependence on humans, with the ultimate goal of autonomous decision making and execution. Thankfully, these roadmaps do acknowledge that the legal, ethical and political ramifications of fully autonomous “lethal systems” have “not yet been fully addressed and resolved”.

The report addresses these legal and ethical questions and makes the following recommendations:

To All States:

  • Prohibit the development, production, and use of fully autonomous weapons through an international legally binding instrument.
  • Adopt national laws and policies to prohibit the development, production, and use of fully autonomous weapons.
  • Commence reviews of technologies and components that could lead to fully autonomous weapons. These reviews should take place at the very beginning of the development process and continue throughout the development and testing phases.

To Roboticists and Others Involved in the Development of Robotic Weapons:

  • Establish a professional code of conduct governing the research and development of autonomous robotic weapons, especially those capable of becoming fully autonomous, in order to ensure that legal and ethical concerns about their use in armed conflict are adequately considered at all stages of technological development.

In closing, I would like to leave you with an extract from page 8 of the report:

In training their troops to kill enemy forces, armed forces often attempt “to produce something close to a ‘robot psychology,’ in which what would otherwise seem horrifying acts can be carried out coldly.” This desensitizing process may be necessary to help soldiers carry out combat operations and cope with the horrors of war, yet it illustrates that robots are held up as the ultimate killing machines.

I personally believe we are at the beginning of the era of robotics, during which robots will become increasingly present in our everyday lives. I only hope that this era, and the robots we develop, will be remembered for more than simply being the ultimate killing machines.

 




