 

How do you want to interface with your autonomous car? Poll results


by Open Roboethics Initiative
07 July 2014



[Image: KITT, the talking car from Knight Rider]

Those of you who watched the 80’s TV series “Knight Rider” might remember the dialogue between Michael Knight and his super-car, KITT. Some of you might be more familiar with the scenes between the “Johnny Cab” and Douglas Quaid (Arnold Schwarzenegger) in “Total Recall” (1990). In these and many other stories of the past, people have imagined being able to talk to their cars. How would you like to interact with an autonomous car?

In our previous poll, we discussed a moral dilemma involving an autonomous car called the Tunnel Problem. In it, the car needs to decide whether to save a child’s life or that of its passenger. Of course we hope that no autonomous car will ever come across this kind of scenario, but it is inevitable that autonomous cars will one day have to deviate from their planned course of action and make a decision on their own — they are autonomous, after all! And in such situations, there will be times when the car will need to alert its passenger about what is happening, and sometimes request assistance from a human decision-maker.

How do you want an autonomous car to alert you?

How should an autonomous car alert its passenger(s)? We were curious about people’s expectations and preferences … do people want conventional cars with autonomous features, or futuristic cars with more bells and whistles?

[Poll results chart: Question 1]

Our reader poll results suggest that people like the idea of being able to talk to their cars. In fact, 77% of our participants chose voice or sound as the preferred method for a car to alert its passengers when it deviates from its planned actions. This choice is probably motivated less by nostalgia for 80’s sci-fi than by practical reasons. Since the car will be doing the driving while the passenger is otherwise preoccupied or sleeping, auditory alerts would be a highly effective means of drawing the passenger’s attention. Notably, all the respondents who selected only one option in this multiple-choice question chose sound, suggesting that they consider sound the most acceptable means for a car to alert its passengers. The use of visual cues also got many votes (71%), while the use of haptic cues was supported by far fewer (27%).
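As a purely illustrative aside (nothing below comes from the poll itself or from any real vehicle API; the Alert class and notify() helper are invented), here is a minimal sketch of how a car might dispatch a multimodal alert, with sound as the default channel and visual and haptic cues as optional extras:

```python
from dataclasses import dataclass, field

# Hypothetical sketch only: the channel names and notify() helper are invented
# for illustration; no real car exposes an interface like this.
@dataclass
class Alert:
    message: str                                                # what the car wants to tell the passenger
    channels: list = field(default_factory=lambda: ["sound"])   # the poll favourite by default

def notify(alert: Alert) -> None:
    """Dispatch one alert over each requested modality (stubbed out as prints)."""
    for channel in alert.channels:
        if channel == "sound":
            print(f"[speaker] {alert.message}")
        elif channel == "visual":
            print(f"[dashboard display] {alert.message}")
        elif channel == "haptic":
            print("[seat vibration]")

notify(Alert("Rerouting around a closed road.", ["sound", "visual"]))
```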

How would you like to give instructions to an autonomous car?

[Poll results chart: Question 2]

In this question, we consider the other side of the communication equation — from the passenger to the car. Most respondents said they would prefer using a touch screen (73%) to communicate with an autonomous car. Using voice commands, however, was a close second (65%). Also, most of those who chose to be alerted by sound in the first question also indicated that they would prefer to use voice commands to instruct the car.

Not surprisingly, touch screens and voice commands are what we use to interact with the technology that is ubiquitous today — our tablets and smartphones. This could mean that people imagine their interaction with autonomous vehicles will be similar to their interactions with the technological devices they use every day.

Some of the participants suggested using a smartphone or a laptop to interface with the car, and it turns out that this idea is already being pursued. Bosch is developing a tablet interface for a highly automated driving user experience. Researchers at Freie Universität Berlin have developed an iPad application, called iDriver, that can remotely control an autonomous car and lets passengers monitor the car’s sensor data. Even more importantly, both Apple and Google have unveiled plans to get their mobile operating systems (iOS and Android) — usually found in phones and tablets — running in cars.
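To make the tablet-style interface a little more concrete, here is a rough, hypothetical sketch of the kind of status message a car might push to a companion app. The field names are invented for illustration and are not taken from iDriver, CarPlay, or Android Auto:

```python
import json

# Hypothetical car-to-tablet status payload; every field is illustrative only.
status = {
    "speed_kmh": 48,
    "planned_action": "turn_left",
    "detected_objects": [
        {"type": "pedestrian", "distance_m": 12.5},
        {"type": "vehicle", "distance_m": 30.0},
    ],
    "alert": "Pedestrian ahead, slowing down.",
}

# What a companion app might receive and render for the passenger.
print(json.dumps(status, indent=2))
```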

But who cares how you interface with the car, or how the car interfaces with you? Well, our data seems to tell us that people actually do care about the user interface. In fact, only 2% of participants picked the “I don’t care” option in the first question, and no one chose this option in the second question.

What parameters would you like to be able to set on your autonomous car?

[Poll results chart: Question 3]

Once again, the importance of the human-machine interface is highlighted here: 69% of the participants said that they’d like to be able to set how their car alerts them. Equally popular was the ability to choose the route the car takes to reach its destination (69%). Practical parameters, such as speed limit (54%) and driving efficiency (50%), were also important to respondents, whereas parameters that might be perceived as more social (e.g., what kind of driver personality the car conveys, and how strictly it follows the speed limit) did not get very many votes (38%).
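Read together, the parameters respondents wanted control over look a lot like a small preferences file. The sketch below is hypothetical (the field names and defaults are invented, not drawn from the poll wording or any real vehicle), but it illustrates the kind of record such settings could live in:

```python
from dataclasses import dataclass, field

# Hypothetical passenger-preference record mirroring the poll's most popular parameters.
@dataclass
class RidePreferences:
    alert_channels: list = field(default_factory=lambda: ["sound", "visual"])  # how the car alerts you
    route: str = "fastest"            # e.g. "fastest", "scenic", "avoid_highways"
    max_speed_kmh: int = 100          # the speed the passenger is comfortable with
    efficiency_mode: bool = True      # prioritise driving efficiency
    driver_personality: str = "calm"  # the more "social" parameter, less popular in the poll

prefs = RidePreferences(route="scenic", max_speed_kmh=90)
print(prefs)
```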

Would you like to be able to see what the autonomous car can see? Why?

[Poll results chart: Question 4]

One of the biggest differences between conventional cars and autonomous cars is the amount of sensing the car needs to do, and therefore the amount of data available. Our participants seem to be really interested in seeing the world through their car’s eyes … well, sensors. In fact, 70% of them answered positively to this question. Reasons for this choice can be summarized into three main groups:

  • Reassurance/Diagnostics: The majority of participants stated that this feature would increase their sense of control and reassurance (e.g., “Being aware of what the car is able to see would make me feel safe.”) and that they’d use the data to diagnose the car (e.g., “to control, if anything is detected correctly”, “So I know that it can really drive for me and isn’t missing anything that can lead to accident.”).
  • Cool factor: Participants in this group wanted the visualization feature for the cool factor (e.g., “I would like the option, mostly just because I think it would be cool.”)
  • Curiosity: Some participants were simply curious about the technology (e.g., “It will give me a better understanding of the inner workings of the car. It will provide some transparency of the design, including the abilities/limits, of the car.”, “Because I’m curious to know what happens under the hood.”).

We think the reassurance factor is really important. One of the challenges automakers will face is getting people to warm up to the technology, and to trust and accept it for use in everyday contexts despite the inevitable accidents or hiccups along the way. Our poll results hint that added features, such as a visualization of the car’s surroundings, could help people feel safer riding in autonomous cars.

The results of the poll presented in this post were analyzed and written up by Camilla Bassani, AJung Moon, and Shalaleh Rismani at the Open Roboethics Initiative.


 





Open Roboethics Initiative is a roboethics think tank concerned with studying robotics-related design and policy issues.




