Robohub.org
 

No, a Tesla didn’t predict an accident and brake for it


by Brad Templeton
11 January 2017




You may have seen a lot of press around a dashcam video of a car accident in the Netherlands. It shows a Tesla on Autopilot hitting the brakes around 1.4 seconds before a red car crashes hard into a black SUV that isn't visible from the dashcam's viewpoint. Much of the press reported that the Tesla predicted that the two cars would collide and, because of the imminent accident, hit the brakes to protect its occupants.

The accident is brutal but apparently nobody was hurt.

https://www.youtube.com/watch?v=lqqN5iRrAiM

The press speculation is incorrect. It gained some fuel because Elon Musk himself retweeted the report, but Tesla has in fact confirmed the alternate, more probable story, which does not involve any prediction of the future accident. In fact, the red car plays little or no role in what took place.

Tesla’s Autopilot uses radar as a key sensor. One great thing about radar is that it tells you how fast every radar target is going, as well as how far away it is. Automotive radar doesn’t tell you very accurately where a target is (roughly, it can tell what lane a target is in). Radar beams bounce off many things, including the road. That means a beam can bounce off the road under the car in front of you and hit a car ahead of it, even one you can’t see. Because the radar tells you “I see something in your lane 40m ahead going 20mph and something else 30m ahead going 60mph,” you know there are two different things.
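The idea above can be sketched in code. This is a hypothetical illustration, not Tesla's actual pipeline: two radar returns in the same lane are treated as separate objects when they differ clearly in range or speed. All names and thresholds are made up for the example.

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float    # distance to the target, in metres
    speed_mps: float  # target's own speed along the road, m/s

def distinct_targets(returns, min_range_gap=5.0, min_speed_gap=2.0):
    """Group returns that differ clearly in range or speed as separate objects."""
    targets = []
    for r in sorted(returns, key=lambda x: x.range_m):
        for t in targets:
            if (abs(t.range_m - r.range_m) < min_range_gap and
                    abs(t.speed_mps - r.speed_mps) < min_speed_gap):
                break  # close in both range and speed: same physical object
        else:
            targets.append(r)
    return targets

# "something 30m ahead going 60mph and something else 40m ahead going 20mph"
returns = [RadarReturn(30.0, 26.8), RadarReturn(40.0, 8.9)]
print(len(distinct_targets(returns)))  # two different vehicles
```

Because the two returns differ in both range and closing speed, they cannot be the same object, which is exactly the inference that lets the radar "see through" the car directly ahead.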

The Tesla radar saw just that: the black SUV was hitting the brakes (possibly for a dirt patch that appears in the video) and the red car wasn’t. Regardless of the red car, the Autopilot knew that if a car ahead was braking hard, it should also brake hard, and it did. Yes, it could perhaps also calculate that the red car, if it kept going, would hit the black car, but that isn’t really relevant: it is clear the Tesla should stop no matter what the red car does. Tesla reported on its blog that it was doing more with the radar, including tracking hidden vehicles with it. The ability of automotive radar to do this has been known for some time, and I have always presumed most teams take advantage of it. You don’t always get returns from hidden cars, but it’s worth using them when you do.
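The braking decision described here can be sketched as a simple time-headway check. This is a minimal, hypothetical sketch of the idea (brake when the gap to any in-lane target would close too fast), not any vendor's actual logic; the function name and threshold are assumptions.

```python
def should_brake(targets, own_speed_mps, min_headway_s=2.0):
    """Brake if the gap to any in-lane target closes faster than a safe headway.

    targets: list of (range_m, target_speed_mps) tuples for in-lane objects,
             including "hidden" ones seen via radar bounce under the lead car.
    """
    for range_m, target_speed_mps in targets:
        closing_mps = own_speed_mps - target_speed_mps
        if closing_mps > 0 and range_m / closing_mps < min_headway_s:
            return True  # time-to-close the gap is below the safe headway
    return False

# Hidden SUV 40m ahead braking to ~9 m/s while we do 30 m/s: brake hard.
print(should_brake([(40.0, 8.9)], own_speed_mps=30.0))  # True
```

Note that the decision doesn't depend on the red car at all, only on the hard-braking target ahead, which matches Tesla's account of what happened.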

In the future, we will see robocar systems predicting accidents, but I am not aware of any team having announced this. All robocars track the position and velocity of the objects around them, extrapolate those velocities, and predict where the objects will go. Such predictions could also detect that two vehicles will hit if they continue on their current courses, and that past a certain point the collision can no longer be avoided. If an imminent accident is predicted, it would make sense to know that and react to it in advance. A car might even predict a little of what will happen after the impact, though that is chaotic.

A system like that would outperform the Autopilot or any automatic emergency braking system. Presently, those systems largely track objects in their own lane. They don’t brake just because cars are stopped in adjacent lanes; if they did, they could not operate alongside traffic jams or carpool lanes moving at different speeds, nor cope with stalled cars on the side of the road.

However, a car in the lane to the right that saw what the Tesla saw would still be smart to brake. Tesla has not commented on this, but I presume its system would not have braked had it been in that lane, at least not before the accident. It might have braked afterwards, because other cars, like the red car, immediately swerved into the right lane.




Brad Templeton (Robocars.com) is an EFF board member, Singularity U faculty, a self-driving car consultant, and entrepreneur.









