
From literature to living rooms: Perceptions of robots in society

by Diana Marina Cooper
18 December 2014




As drones have become increasingly accessible, media outlets have been preoccupied with news stories that fuel our fears about the prospect of privacy invasion and physical harm. Although drones have only recently become mainstream, society has long been fixated on the need to regulate robots in order to protect itself from harm.

This dystopian view of robots originates in Golem literature and the Romantics. In 16th-century Jewish literature, Rabbi Loew of Prague created the Golem, a creature constructed from clay to protect the community from being expelled by the Holy Roman Emperor. Rabbi Loew would deactivate the Golem on Friday evenings in preparation for the Sabbath. One Friday, the Rabbi forgot to deactivate the Golem, and it became a violent monster that had to be destroyed. A similar theme emerged in Mary Shelley's Frankenstein, in which a man-made monster turned against its creator.

The blueprints outlined in Golem literature and the Romantics were further refined in the realm of science fiction. Writing just prior to the advent of the modern robotics industry, Isaac Asimov advanced three laws to negotiate the dangers of introducing robots into society. Asimov's Three Laws of Robotics provide that:

  • A robot may not injure a human being or, through inaction, allow a human being to come to harm;
  • A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law; and
  • A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

Asimov later added a Zeroth Law that would supersede the Three Laws: a robot may not harm humanity, or, by inaction, allow humanity to come to harm.

I posed the following question to Tony Dyson, designer of R2-D2, the brave and lovable droid whom many perceive as the true hero of Star Wars: As robots become increasingly autonomous, do you think we will need Asimov's laws? Here is what Dyson had to say:

I would love to say yes, all intelligent machines (autonomous robots) that are programmed to think for themselves must also have an overriding 'hard-wired' set of rules to work with. These should not be guidelines, but must be a set of laws, clearly defined by the ruling body. However, the practical problem is, as Rodney Brooks, co-founder of iRobot, has alluded to: 'People ask me about whether our robots follow Asimov's laws. There is a simple reason [they don't]: I can't build Asimov's laws in them.'

 

So we ask the question, do we face any danger from robots without Asimov’s laws? I don’t see our AI research progressing into ‘Skynet Terminator’ anytime soon, but I may be just saying that, as part of my evil plan – there is a good reason why I share the same name as the ‘Head Robotic Scientist’ in the film Terminator.



Why do we fear robots? The term robot comes from the Czech word robota, which means forced labour. Simply put, we create robots to serve and fulfill our needs. However, advances in artificial intelligence are bringing us closer to achieving autonomous robotics. If and when robots become truly autonomous, we fear that they will no longer serve us, or, worse, that they will turn against us and destroy us. The consequence of this fear is that we systematically resist technological advances that may prove beneficial. The debate has yet to be settled on whether robot surgeons will err less frequently than their human counterparts, or whether driverless cars will decrease the number of accidents on our roads. The point is that if we resist these advances, such questions will remain unanswered.

How can we move forward and change our perceptions of robots? In Japan, robots are highly integrated into society, which may reflect a different cultural outlook on human-robot interaction. For instance, in 2007, Japan's Ministry of Foreign Affairs designated Astro Boy as the nation's envoy for safe overseas travel. In North America, Hollywood could play an important role in shaping positive attitudes towards consumer drones and robots.

Earlier this year, Clive Thompson published an article in the Smithsonian titled "Why Do We Love R2-D2 and Not C-3PO?" Thompson explored how the design of robots affects our reaction to them, arguing that: "R2-D2 changed the mold. Roboticists now understand it's far more successful to make their contraptions look industrial—with just a touch of humanity. The room-cleaning Roomba looks like a big flat hockey puck, but its movements and beeps seem so “smart” that people who own them give them names."

And it appears that Hollywood does in fact inspire robot makers. iRobot co-founder Helen Greiner recently posted a note on Dyson's LinkedIn profile, stating: "Because of Tony's compelling emotive design, I fell in love with R2D2 when I was 11. This enabled my whole career in robotics from attending MIT to cofounding iRobot, the company that makes the Roomba vacuuming robot. I hope you see a little of R2D2 in your Roomba!"





Diana Marina Cooper is Vice President of Legal and Policy Affairs at PrecisionHawk.