
Clearpath’s public stance on Killer Robots sets a precedent in corporate responsibility

by AJung Moon
20 August 2014




Last week the Waterloo-based Clearpath publicly pledged not to develop lethal autonomous weapons, otherwise known as “killer robots”. In an open letter to the public, Clearpath’s CTO Ryan Gariepy wrote in support of the Campaign to Stop Killer Robots, an international coalition spearheading efforts to stop lethal autonomous weapons systems from being developed and deployed. While the Campaign has garnered significant support since its launch, it had not previously had support from the for-profit robotics sector – making Clearpath’s public statement a noteworthy demonstration of corporate responsibility, particularly given the company’s background in military applications. In a short interview at Clearpath, Gariepy summarized the letter:

“When it comes to lethal weapons systems, not only do we not build them, we don’t believe anybody should build them. That’s our big statement here.”

But that’s just a small part of the story.

Clearpath’s statement is important because it demonstrates how the robotics industry can actively participate in shaping the ethical and legal landscape of our robotic future. It essentially sets a precedent for what corporate social responsibility will mean as more robots start to roll out.

Gariepy says,

We do feel that it’s the right thing to do. We’ve looked at potential risks to the business, and we feel that though there is risk to our business, this statement supports what our team believes … and what our partners believe.

Believing in something is one thing, but making it publicly known is another. And on the surface, many robotics companies will have more to lose than to gain from making a public statement on this issue, especially those (such as Clearpath) with military contracts. However, compared to large publicly traded companies such as Samsung (which manufactures sentry robots) or iRobot (which makes battlefield robots in addition to robot vacuum cleaners), Clearpath may be young and nimble enough to set such a precedent without long-term negative consequences.

Only five years old, and not publicly traded, Clearpath made the decision to take a public stance on the Killer Robots issue based on the wishes of its 56 employees. According to Gariepy, the Canadian company has been following the issue closely since Christof Heyns (United Nations Special Rapporteur on extrajudicial, summary or arbitrary executions) published a report on this topic for the UN in April 2013. Right before Christmas of last year, Clearpath’s employees spent a Friday afternoon debating the topic of lethal autonomous weapons. This led to an internal poll to determine employee consensus on the issue, followed by the drafting of the public statement.

Clearpath is not saying that any and all military robots are bad, and they are very clear about this in the letter. In fact, developing robots for military applications is at the heart of the company, which started off with a handful of University of Waterloo Mechatronics Engineering graduates building a robot to clear minefields. Mine-clearing robots are less controversial than lethal autonomous weapons systems, pointing to the existence of a grey area between the different uses of military robotics and whether they result in a net positive for society. Yet Clearpath has drawn a line in that grey area – something no other robotics company has yet stepped up to do.

After its launch in 2013, the Campaign to Stop Killer Robots travelled to Canada to urge the Canadian government to support the cause. According to Paul Hannon, Executive Director of Mines Action Canada, the Canadian government’s Foreign Affairs and National Defence departments are still discussing the issue internally and have yet to formally declare Canada’s position. So I asked Hannon whether he thinks being a Canadian company would have made it easier for Clearpath:

Being the first commercial company in the world to make such a statement took a lot of courage I think, no matter where the company is based. It is clear from the statement that they have given it very serious thought and understand the impact it could have.

My personal view is that it is probably harder for a Canadian company than an American one, because so much R&D funding comes from outside of Canada, particularly the US. Canadian companies likely need to develop partnerships with other non-Canadian firms to access those sources of funding. Taking such a principled stand could impact current and potential partnerships, so more credit to them.


My intuition is that Clearpath’s attention to roboethics issues will prove to be business savvy in the long run. Robotics has had a difficult time gaining social acceptance – largely because of the ways it has been portrayed in dystopian science fiction, but also because of the inhumanity it can project onto the battlefield. As more robots start rolling out into familiar places such as our homes and offices, the call for roboethics will grow stronger, and this will surely impact the kinds of robots we will allow our governments to deploy in military applications. Public trust is at stake, and the companies that are transparent about where they stand on roboethics issues stand a greater chance of social acceptance.

Roboethics will be a natural avenue for companies to show their social responsibility, and I remain optimistic that we will see more positive movement in this direction. Google’s establishment of an Ethics Board following its acquisition of the AI company DeepMind is perhaps further evidence of this.

I second Hannon who says,

Let’s hope that more companies from many countries follow their lead.

I have my fingers crossed.

A brief history of the Campaign to Stop Killer Robots

Military robotics has taken a long time to reach this point.

I first became interested in roboethics after learning about Samsung’s sentry robot, used to monitor the demilitarized zone between South and North Korea. That was almost a decade ago. Of course, long before that, military use of unmanned aerial vehicles (UAVs) was on the rise.

With the persistent and increased use of UAVs in military applications, it became obvious that a natural next step in the technological evolution of remotely operated UAVs was to develop highly autonomous versions of them. The discussion of lethal autonomous weapons systems among academics became heated, as you’ll find in the 2009 two-part Robots Podcast interviews with Noel Sharkey and Ron Arkin.

A public call to ban the development and deployment of lethal autonomous weapons came more recently, when the International Committee for Robot Arms Control (ICRAC) – one of the key NGOs involved in the Campaign – drafted one in 2010. Back then, only a handful of academics and interested individuals signed up to support their cause.

ICRAC, along with Human Rights Watch, Article 36, and Mines Action Canada, among other well-known international non-governmental organizations, launched the Campaign in April 2013. This year the Campaign got the United Nations to pay attention to the topic through the 117 states parties to the Convention on Certain Conventional Weapons (CCW), 87 of which attended the last meeting.

Whether the CCW will continue to work on the killer robots topic has yet to be decided, but the next annual meeting is scheduled for November of this year.





AJung Moon is an HRI researcher at McGill and publicity co-chair for the ICRA 2022 conference.




