
Clearpath’s public stance on Killer Robots sets a precedent in corporate responsibility

by AJung Moon
20 August 2014




Last week the Waterloo-based Clearpath publicly pledged not to develop lethal autonomous weapons, otherwise known as “killer robots”. In an open letter to the public, Clearpath’s CTO Ryan Gariepy wrote in support of the Campaign to Stop Killer Robots, an international coalition spearheading efforts to stop lethal autonomous weapons systems from being developed and deployed. While the Campaign has garnered significant support since its launch, it has not previously had support from the for-profit robotics sector – making Clearpath’s public statement a noteworthy demonstration of corporate responsibility, particularly given the company’s background in military applications. In a short interview at Clearpath, Gariepy summarized the letter as follows:

“When it comes to lethal weapons systems, not only do we not build them, we don’t believe anybody should build them. That’s our big statement here.”

But that’s just a small part of the story.

Clearpath’s statement is important because it demonstrates how the robotics industry can actively participate in the process of designing the ethical and legal landscape of our robotic future. It essentially sets a precedent for what corporate social responsibility will mean as more robots start to roll out.

Gariepy says,

We do feel that it’s the right thing to do. We’ve looked at potential risks to the business, and we feel that though there is risk to our business, this statement supports what our team believes … and what our partners believe.

Believing in something is one thing, but making it publicly known is another. And on the surface, many robotics companies will have more to lose than to gain from making a public statement on this issue, especially those (such as Clearpath) with military contracts. However, compared to large publicly traded companies such as Samsung (which manufactures sentry robots) or iRobot (which makes battlefield robots in addition to robot vacuum cleaners), Clearpath may be young and nimble enough to set such a precedent without long-term negative consequences.

Only five years old, and not publicly traded, Clearpath made the decision to take a public stance on the Killer Robots issue based on the wishes of its 56 employees. According to Gariepy, the Canadian company has been following the issue closely since Christof Heyns (United Nations Special Rapporteur on extrajudicial, summary or arbitrary executions) published a report on this topic for the UN in April 2013. Right before Christmas of last year, Clearpath’s employees spent a Friday afternoon debating the topic of lethal autonomous weapons. This led to an internal poll to determine employee consensus on the issue, followed by the drafting of the public statement.

Clearpath is not saying that any and all military robots are bad, and they are very clear about this in the letter. In fact, developing robots for military applications is at the heart of the company, which started off with a handful of University of Waterloo Mechatronics Engineering graduates building a robot to clear minefields. Mine-clearing robots are less controversial than lethal autonomous weapons systems, pointing to a grey area between the different uses of military robotics and whether they result in a net positive for society. Yet Clearpath has drawn a line in that grey area – something no other robotics company has yet stepped up to do.

After its launch in 2013, the Campaign to Stop Killer Robots travelled to Canada to urge the Canadian government to support the cause. According to Paul Hannon, Executive Director of Mines Action Canada, the Canadian government’s Foreign Affairs and National Defence departments are still discussing the issue internally and have yet to formally declare Canada’s position. So I asked Hannon whether he thinks being a Canadian company would have made it easier for Clearpath:

Being the first commercial company in the world to make such a statement took a lot of courage I think, no matter where the company is based. It is clear from the statement that they have given it very serious thought and understand the impact it could have.

My personal view is that it is probably harder for a Canadian company than an American one, because so much R&D funding comes from outside of Canada, particularly the US. Canadian companies likely need to develop partnerships with other non-Canadian firms to access those sources of funding. Taking such a principled stand could impact current and potential partnerships, so more credit to them.

 

My intuition is that Clearpath’s attention to roboethics issues will prove to be business savvy in the long run. Robotics has had a difficult time gaining social acceptance – largely because of the ways it has been portrayed in dystopian science fiction, but also because of the inhumanity it can project onto the battlefield. As more robots start rolling out into familiar places such as our homes and offices, the call for roboethics will grow stronger, and this will surely impact the kinds of robots we will allow our governments to deploy in military applications. Public trust is at stake, and the companies that are transparent about where they stand on roboethics issues stand a greater chance of social acceptance.

Roboethics will be a natural avenue for companies to show their social responsibility, and I remain optimistic that we will see more positive movement in this direction. Google’s establishment of an Ethics Board following its acquisition of the AI company DeepMind is perhaps further evidence of this.

I second Hannon, who says:

Let’s hope that more companies from many countries follow their lead.

I have my fingers crossed.

A brief history of the Campaign to Stop Killer Robots

It has taken a long time in the history of military robotics to reach this point.

I first became interested in roboethics after learning about Samsung’s sentry robot, used to monitor the demilitarized zone between South and North Korea. That was almost a decade ago. Of course, long before that, military use of unmanned aerial vehicles (UAVs) was on the rise.

With the persistent and increasing use of UAVs in military applications, it became obvious that a natural next step in the technological evolution of remotely operated UAVs was to develop highly autonomous versions of them. The discussion of lethal autonomous weapons systems among academics became heated, as you’ll find in the 2009 two-part Robots Podcast interviews with Noel Sharkey and Ron Arkin.

A public call to ban the development and deployment of lethal autonomous weapons came more recently, when the International Committee for Robot Arms Control (ICRAC) – one of the key NGOs involved in the Campaign – drafted one in 2010. Back then, only a handful of academics and interested individuals signed up to support the cause.

ICRAC, along with Human Rights Watch, Article 36, and Mines Action Canada, among other well-known international non-governmental organizations, launched the Campaign in April 2013. This year the Campaign got the United Nations to pay attention to the topic through the Convention on Certain Conventional Weapons (CCW), which has 117 states parties (87 countries attended the last meeting).

Whether the CCW will continue to work on the topic of killer robots has yet to be decided, but the next annual meeting is scheduled for November of this year.





AJung Moon HRI researcher at McGill and publicity co-chair for the ICRA 2022 conference




