
Will motion sickness really be a barrier to self-driving cars?


by Brad Templeton
14 April 2015




Earlier this week I was sent some advance research from the University of Michigan about car sickness rates for car passengers. I found the research interesting, but wish it had covered some questions I think are more important, such as how carsickness changes with potentially new types of car seating, such as face-to-face or along the sides.

To my surprise, there was a huge rush of press coverage of the study, which concluded that 6 to 12% of car passengers get a bit queasy, especially when looking down in order to read or work. While it was worthwhile to work up those numbers, the overall revelation was in the “Duh” category for me, I guess because it happens to me on some roads and I presumed it was fairly common.

Oddly, most of the press was of the “this is going to be a barrier to self-driving cars” sort, while my reaction was, “wow, that happens to fewer people than I thought!”

Having always known the problem existed, I am interested in the statistics, but to me the much more interesting question is: what can be done about it?

For those who don’t like to ride facing backwards, the fact that so many passengers are not bothered is a good sign: they can simply switch seats with someone who is.

Some activities are clearly better than others. Staring down at your phone or computer in your lap is bad during turns and bumps, but staring up at a screen to watch a video, with your peripheral vision still connected to the surrounding environment, may be a choice that reduces the queasiness.

I am also interested in studying whether cues can help people reduce sickness. For example, the car will know about upcoming turns, and probably even upcoming bumps. It could issue tones to give you subtle cues about what’s coming, and when it might be time to pause and look up. It might even be the case that audio cues could substitute for visual cues in our plastic brains.
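None of this would be hard to prototype. Here is a minimal sketch, in Python and with entirely made-up route data and function names (no real vehicle API is assumed), of how a car that already knows its planned path might schedule a gentle tone shortly before each manoeuvre strong enough to matter:

import time

# Purely hypothetical data: upcoming manoeuvres from the car's planned route,
# each as (seconds_until_event, description, expected_lateral_force_in_g).
upcoming = [(4.0, "left turn", 0.25), (9.5, "speed bump", 0.15), (14.0, "roundabout", 0.30)]

LEAD_TIME = 2.0   # sound the cue this many seconds before the event
THRESHOLD = 0.2   # ignore manoeuvres too gentle to notice

def play_tone(description):
    # Stand-in for whatever chime the cabin audio system would actually play.
    print(f"*ding* {description} coming up -- good moment to glance out the window")

start = time.time()
for eta, description, force in sorted(upcoming):
    if force < THRESHOLD:
        continue                              # not worth interrupting the passenger
    wait = (start + eta - LEAD_TIME) - time.time()
    if wait > 0:
        time.sleep(wait)                      # idle until the cue moment
    play_tone(description)

The interesting research question is not the scheduling, which is trivial, but whether such cues actually reduce sickness or just become annoying.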

Another interesting thing to test would be having your tablet or phone deliberately tilt its display to give you the illusion that you are looking at the fixed world, or having a little “window” that shows you a real-world horizon so your eyes and inner ears can find something to agree on.
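As a rough illustration of the “window” idea (the numbers are invented, and real devices expose their accelerometers through platform-specific APIs), one way to think about it is that the on-screen horizon should be rotated by the angle between the device’s vertical axis and the force the inner ear actually feels:

import math

def horizon_angle_deg(lateral_accel, vertical_accel):
    # Angle to rotate an on-screen horizon line so it lines up with the
    # combined gravity-plus-cornering force, given the accelerometer
    # components in the screen plane (m/s^2). That combined force is what
    # the inner ear treats as "down" during a turn.
    return math.degrees(math.atan2(lateral_accel, vertical_accel))

# Example: ~2 m/s^2 of cornering force on top of ~9.8 m/s^2 of gravity
print(round(horizon_angle_deg(2.0, 9.8), 1))   # about 11.5 degrees of tilt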

More advanced would be a passenger pod on hydraulic struts, able to tilt with several degrees of freedom to counter the turns and bumps so that the forces you feel always push up and down, never side to side. With proper banking and tilting, you could go through a roundabout (often quite disconcerting when staring down) and only feel yourself get lighter and heavier.
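For a steady turn, the tilt that accomplishes this is the classic banking angle, tan(theta) = v^2 / (r * g); at that angle the cornering force is felt purely as a little extra weight. A back-of-the-envelope sketch with illustrative numbers:

import math

def bank_angle_deg(speed_mps, radius_m, g=9.81):
    # Tilt at which all of the cornering force pushes straight "down"
    # through the passenger's seat instead of sideways.
    return math.degrees(math.atan(speed_mps ** 2 / (radius_m * g)))

def apparent_weight_factor(speed_mps, radius_m, g=9.81):
    # How much heavier the passenger feels at that ideal bank angle.
    return math.hypot(1.0, speed_mps ** 2 / (radius_m * g))

# Example: 25 km/h (about 6.9 m/s) around a 15 m radius roundabout
v, r = 25 / 3.6, 15.0
print(round(bank_angle_deg(v, r), 1))           # ~18 degrees of pod tilt
print(round(apparent_weight_factor(v, r), 2))   # feels about 1.05x normal weight

Eighteen degrees is a lot of articulation to build into a pod, but the payoff is that a manoeuvre which normally churns the stomach becomes nothing more than a gentle press into the seat.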


This post originally appeared on Robocars.com.


 





Brad Templeton (Robocars.com) is an EFF board member, Singularity U faculty member, self-driving car consultant, and entrepreneur.







