Open Roboethics initiative delivers statement to United Nations CCW

17 November 2015




On Friday, November 13th, AJung Moon from the Open Roboethics initiative delivered a statement at the United Nations Convention on Certain Conventional Weapons (CCW) Meeting of States Parties.

So what was this meeting about? And why was ORi there?

The CCW actually has a long name that is much more descriptive of what it is all about: The Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, as amended on 21 December 2001.

Put simply, the CCW is where different countries come together to review the excessively injurious or indiscriminate nature of certain weapons and discuss their prohibition or restriction. It is also the forum in which a coalition of non-governmental organizations, called the Campaign to Stop Killer Robots, brought up the issue of lethal autonomous weapons systems (LAWS). The Campaign has been advocating for an international ban on LAWS: fully autonomous weapons that can target and make life/death decisions without human intervention.

The CCW Meeting of States Parties is an annual meeting where all the countries subject to the weapons convention come together to “review the status and operations of the Convention and its protocols”. At this year’s meeting, part of the agenda was to decide whether and what kind of discussions they should have about the issue of LAWS next year. Should the States take the issue more seriously and formally start discussing a ban?

What do people think about lethal autonomous weapons? Should they be developed or used? Would people think differently about the use of these weapons over remotely operated alternatives, such as drones, if their own country were attacking another country? Results from our latest public opinion survey suggest that most people are reluctant to endorse the use of such technologies in waging war.

As a research-driven think tank that has been investigating roboethics-related issues in a bottom-up manner (i.e., we reach out to you, the community of future stakeholders of robotics technologies, to explore roboethics-related questions), the Open Roboethics initiative was there to inform decision makers of the public’s opinion on the issue.

Below is the statement AJung delivered on behalf of ORi.

At the end of the day, the CCW meeting concluded with the decision to continue discussing the topic informally. There will be an informal meeting of experts on LAWS from April 11th to 15th of next year, similar to the informal meeting of experts held earlier this year. While this important issue is being discussed, ORi hopes to continue to inform decision makers and to enable stakeholders of the technology (that’s all of us around the world, in our books) to chime in on the conversation.

Public Opinion on the Ethics and Governance of Lethal Autonomous Weapons Systems: Open Roboethics initiative Statement for the UN Convention on Certain Conventional Weapons (CCW) Meeting of the States Parties, Geneva

November 13, 2015

Thank you, Mr. President.

Distinguished Delegates,

I am grateful to be speaking to you on behalf of the Open Roboethics initiative. The Open Roboethics initiative is a think tank based in Canada that takes stakeholder-inclusive approaches to investigating roboethics issues. What should a robot do? What decisions are we comfortable delegating to robots? These are some of the questions we have been exploring in the domains of self-driving vehicles, care robots, and lethal autonomous weapons systems, or LAWS.

I would like to share with you some of the key findings from a public opinion survey we conducted this year on the topic of LAWS. As the CCW continues its discussion of LAWS, I believe the Distinguished Delegates will find the results of our survey informative in moving the discussions forward, especially in consideration of the Martens Clause that underscores the importance of the public’s role in these discussions.

While existing public opinion surveys on this topic have mainly been limited to English-speaking populations, our survey was launched in 14 different languages and attracted over 1,000 responses from 54 countries.

The following are some of our findings:

In general, when asked to think from the perspective of an aggressor, 71% of our participants indicated that they would rather have their country use remotely operated weapons systems instead of LAWS when waging war. From the perspective of a target of aggression, a majority also indicated that they would rather be under attack by remotely operated than by autonomous weapons systems.

When asked about an international ban across different types of lethal autonomous weapons for missions on land, in the air, and at sea, 67% of our participants indicated that all types of lethal autonomous systems should be internationally banned, while 14% said that none of these systems should be banned.

When asked about the development and use of LAWS, 85% of our participants were not in support of using lethal autonomous weapons for offensive purposes. In addition, a majority of our participants were also against the development of LAWS for both defensive and offensive purposes.

Given a list of common reasons for supporting the development and use of LAWS, the most supported reason was to save human military personnel from the physical harm of war. However, more participants indicated that there are no valid reasons for developing or using LAWS over a remotely operated alternative.

Given a list of reasons for rejecting the technology, the most support went to the assertion that humans should always be the ones to make life/death decisions. This particular principled reasoning has been echoed in other studies.

As acknowledged by scholars, experts, and civil society, there is no agreement on the definition and interpretation of the term “public conscience” referred to in the Martens Clause. It would also be inappropriate to treat the results of a single public opinion survey as an accurate measure of public conscience. However, it is also clear that we cannot fulfil the requirement of “the dictates of public conscience” without proactively listening to and taking into account the voice of the public.

Based on this year’s data, it is our conclusion that the public is reluctant to endorse the development and use of LAWS for waging war. However, at the moment, data on the perception of the technology from non-English-speaking countries remain scarce. Our results suggest that more systematic, international public engagement is necessary to support the requirements set out in the Martens Clause.

As part of our ongoing efforts to understand and engage the public on this challenging issue, we plan to continue conducting studies in this domain across the world. In the interest of time, I have shared with you only some of the findings from our study. I invite you to visit our website, www.openroboethics.org, for a more detailed report of our findings.

Thank you.





Open Roboethics Initiative is a roboethics think tank concerned with studying robotics-related design and policy issues.




