Robohub.org
 

How do you want to interface with your autonomous car? Poll results

07 July 2014



[Image: KITT, the talking super-car from Knight Rider]

Those of you who watched the 80’s TV series “Knight Rider” might remember the dialogue between Michael Knight and his super-car, KITT. Some of you might be more familiar with the scenes between “Johnny Cab” and Douglas Quaid (Arnold Schwarzenegger) in “Total Recall” (1990). In both of these stories, and many others of the past, people have imagined being able to talk to cars. How would you like to interact with an autonomous car?

In our previous poll, we discussed a moral dilemma involving an autonomous car called the Tunnel Problem. In it, the car needs to decide whether to save a child’s life or that of its passenger. Of course we hope that no autonomous car will ever come across this kind of scenario, but it is inevitable that autonomous cars will one day have to deviate from their planned course of action and make a decision on their own — they are autonomous, after all! And in such situations, there will be times when the car will need to alert its passenger about what is happening, and sometimes request assistance from a human decision-maker.

How do you want an autonomous car to alert you?

How should an autonomous car alert its passenger(s)? We were curious about people’s expectations and preferences … do people want conventional cars with autonomous features, or futuristic cars with more bells and whistles?

[Survey chart: Question 1]

Our reader poll results suggest that people like the idea of being able to talk to their cars. In fact, 77% of our participants chose voice or sound as the preferred method for a car to alert its passengers upon deviation from planned actions. This choice is probably motivated less by nostalgia for 80’s sci-fi than by practical considerations. Since the car will be doing the driving while the passenger is otherwise occupied or sleeping, auditory alerts would be a highly effective means of drawing the passenger’s attention. In fact, all the respondents who indicated only one choice in this multiple-choice question chose sound, suggesting that they consider it the most acceptable way for a car to alert its passengers. The use of visual cues also got many votes (71%), while the use of haptic cues was supported by far fewer (27%).

How would you like to give instructions to an autonomous car?

[Survey chart: Question 2]

In this question, we consider the other side of the communication equation: from the passenger to the car. Most respondents said they would prefer using a touch screen (73%) to communicate with an autonomous car. Using voice commands, however, was a close second (65%). Also, most of those who chose to be alerted by sound in the first question also indicated that they would prefer to use voice commands to instruct the car.

Not surprisingly, touch screens and voice commands are what we use to interact with the technology that is ubiquitous today: our tablets and smartphones. This could mean that people imagine their interactions with autonomous vehicles will be similar to their interactions with the other technological devices they use every day.

Some of the participants suggested using a smartphone or a laptop to interface with the car, and it turns out that this idea is already being pursued. Bosch is developing a tablet interface for the highly automated driving user experience. Researchers at Freie Universität Berlin have developed an iPad application, called iDriver, that can remotely control an autonomous car and allows passengers to monitor the car’s sensor data. Even more importantly, both Apple and Google have unveiled plans to get their mobile operating systems (iOS and Android), usually found in phones and tablets, running in cars.

But who cares how you interface with the car, or how the car interfaces with you? Well, our data seems to tell us that people actually do care about the user interface. In fact, only 2% picked the “I don’t care” option in the first question, and no one chose this option in the second question.

What parameters would you like to be able to set on your autonomous car?

[Survey chart: Question 3]

Once again, the importance of the human-machine interface is highlighted here: 69% of the participants said that they’d like to be able to set how their car alerts them. Equally popular was the ability to determine the route the car takes to reach its destination (69%). Practical parameters, such as the speed limit (54%) and driving efficiency (50%), were also important to respondents, whereas what might be perceived as more social parameters (e.g., what kind of “driver personality” the car conveys, and how strictly it follows the speed limit) did not get very many votes (38%).

Would you like to be able to see what the autonomous car can see? Why?

[Survey chart: Question 4]

One of the biggest differences between conventional cars and autonomous cars is the amount of sensing the car needs to do, and therefore the amount of data available. Our participants seem to be genuinely interested in seeing the world through their car’s eyes … well, sensors. In fact, 70% of them answered yes to this question. The reasons for this choice can be summarized into three main groups:

  • Reassurance/Diagnostics: The majority of participants stated that this feature will increase their sense of control and reassurance (e.g., “Being aware of what the car is able to see would make me feel safe.”) and that they’d use the data to diagnose the car. (e.g., “to control, if anything is detected correctly”, “So I know that it can really drive for me and isn’t missing anything that can lead to accident.”)
  • Cool factor: Participants in this group wanted the visualization feature for the cool factor (e.g., “I would like the option, mostly just because I think it would be cool.”)
  • Curiosity: Some respondents seem to be simply curious about the technology (e.g., “It will give me a better understanding of the inner workings of the car. It will provide some transparency of the design, including the abilities/limits, of the car.”, “Because I’m curious to know what happens under the hood.”)

We think the reassurance factor is really important. One of the challenges automakers will face is getting people to warm up to the technology, and to trust and accept it for use in everyday contexts despite inevitable accidents or hiccups along the way. Our poll results seem to hint that added features, such as the visualization of the car’s surroundings, could help people feel safer riding in autonomous cars.

The results of the poll presented in this post were analyzed and written up by Camilla Bassani, AJung Moon, and Shalaleh Rismani at the Open Roboethics initiative.


 





Open Roboethics Initiative is a roboethics thinktank concerned with studying robotics-related design and policy issues.





©2021 - ROBOTS Association


 











