The Uber car and Tesla’s Autopilot, both in the news for fatalities, are really two very different things. This table outlines the differences. Also, see below for some new details on why the Tesla crashed, and more.
Uber ATG Test | Tesla Autopilot
---|---
A prototype full robocar capable of unmanned operations on city streets. | A driver assist system for highways and expressways |
Designed for taxi service | Designed for privately owned and driven cars |
A full suite of high-end robocar sensors, including LIDAR | Production automotive sensors: cameras and radar
1 pedestrian fatality, other accidents unknown | Fatalities in Florida, China and California, plus other serious crashes without injury
Approximately 3 million miles of testing | As of late 2016: 300 million miles driven on Autopilot, 1.3 billion miles of data gathered
A prototype in testing which needs a human safety driver monitoring it | A production product overseen by the customer |
Designed to handle everything it might encounter on the road | Designed to handle only certain situations. Users are expressly warned it doesn’t handle major things like cross traffic, stop signs and traffic lights. |
Still in an early state, needing intervention every 13 miles on city streets | In production and rarely needing intervention on highways, though it would need it very frequently if driven on city streets
Needs a state license for testing, with rules requiring safety drivers | No government regulation needed, similar to the adaptive cruise control it is based on
Only Uber employees can get behind the wheel | Anybody can be behind the wheel |
Vehicle failed in a manner outside its design constraints; it should have readily detected and stopped for the pedestrian | Vehicles had incidents in ways expected under their design constraints
Vehicle was trusted too much by its safety driver, who took eyes off the road for 5 seconds | Vehicles trusted too much by their drivers, who took eyes off the road for 6 seconds or longer
Safety drivers get 3 weeks of training and are fired if caught using a phone | No training or penalties for customers, though the manual and screen describe proper operating procedures
Safety driver recorded by camera, no software warnings about inattention | Tesla drivers get visual, then audible alerts if they keep their hands off the wheel for too long
Criticism that the solo safety driver job is too hard and that inattention will happen | Criticism that drivers overtrust the system, regularly not looking at the road
Killed a bystander, though the vehicle had the right of way | Killed customers who were ignoring the monitoring requirements
NTSB Investigating | NTSB Investigating |
Each company faces a different challenge in fixing its problems. Uber needs to improve the quality of its self-drive software so that a basic failure like the one seen here becomes extremely unlikely. Perhaps even more importantly, it needs to revamp its safety driver system so that safety driver alertness is monitored and assured, including going back to two safety drivers in all situations. Further, it should consider some “safety driver assist” technology, such as using the system in the Volvo (or some other aftermarket system) to alert the safety drivers if it looks like something is going wrong. That’s not trivial (if the system beeps too much it gets ignored) but it can be done; a rough sketch of what that alerting logic might look like appears below.
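As an illustration only, here is a minimal sketch of such alerting logic, assuming a hypothetical per-frame attention signal and made-up thresholds (this is not Volvo’s, Uber’s, or any vendor’s actual system). The design point is escalation plus a cooldown: the quiet alert is rate-limited so it cannot become constant noise, while the loud alarm always fires on a long lapse.

```python
# Hypothetical sketch of escalating attention alerts for a safety driver.
# All names and thresholds are invented for illustration.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AttentionAlerter:
    visual_after_s: float = 2.0      # seconds eyes-off-road before a quiet visual alert
    audible_after_s: float = 4.0     # seconds eyes-off-road before a loud audible alarm
    visual_cooldown_s: float = 10.0  # minimum gap between visual alerts (avoids alarm fatigue)
    _eyes_off_since: Optional[float] = field(default=None, init=False)
    _last_visual_at: float = field(default=float("-inf"), init=False)

    def update(self, now: float, eyes_on_road: bool) -> Optional[str]:
        """Call once per frame; returns None, 'visual', or 'audible'."""
        if eyes_on_road:
            self._eyes_off_since = None
            return None
        if self._eyes_off_since is None:
            self._eyes_off_since = now
        elapsed = now - self._eyes_off_since
        if elapsed >= self.audible_after_s:
            return "audible"         # the loud alarm is never rate-limited
        if elapsed >= self.visual_after_s and now - self._last_visual_at >= self.visual_cooldown_s:
            self._last_visual_at = now
            return "visual"
        return None

# Example: a lapse of several seconds triggers a visual alert, then escalates to audible.
alerter = AttentionAlerter()
for t, looking in [(0.0, True), (1.0, False), (2.0, False), (3.5, False), (5.5, False), (6.0, True)]:
    print(t, alerter.update(t, looking))
```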
Tesla faces a more interesting challenge. Its claim is that, in spite of the accidents, the Autopilot is still a net win: because people who drive properly with Autopilot have half the accidents of people who drive without it, the total number of accidents is still lower, even when you include the accidents (including these fatalities) that come to those who disregard the warnings about how to use it properly.
That people disregard those warnings is obvious and hard to stop. Tesla argues, however, that turning off Autopilot because of them would make Tesla driving, and the world, less safe. Options exist to make people drive diligently with the Autopilot, but Tesla must not make the Autopilot so much less pleasant that people decide not to use it at all, even when they would have used it properly. That would actually make driving less safe if enough people did so.
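To see the shape of that argument, here is a back-of-envelope calculation with made-up numbers (these are not Tesla’s figures). Assume attentive Autopilot use halves the accident rate and inattentive use doubles it, then ask what fraction of Autopilot miles can be driven inattentively before the fleet-wide rate stops beating the no-Autopilot baseline.

```python
# Illustrative only: none of these rates are real Tesla statistics.
baseline = 1.0                      # accidents per unit distance without Autopilot
attentive_rate = 0.5 * baseline     # the claimed halving for proper, attentive use
inattentive_rate = 2.0 * baseline   # an assumed penalty for ignoring the warnings

for misuse_fraction in (0.05, 0.10, 0.20, 0.30, 0.40):
    fleet_rate = (1 - misuse_fraction) * attentive_rate + misuse_fraction * inattentive_rate
    verdict = "net win" if fleet_rate < baseline else "net loss"
    print(f"misuse {misuse_fraction:.0%}: fleet rate {fleet_rate:.2f} -> {verdict}")

# With these particular assumptions the break-even point is one third of
# miles misused: 0.5 + f * 1.5 = 1.0 gives f = 1/3.
```

The break-even point moves with the assumed penalty for misuse, which is exactly what makes the net-win claim hard to settle without real accident data.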
A theory, now given credence by some sample videos, suggests the Tesla was confused by the white lines which divide the road at an off-ramp, the expanding triangle known as the “gore.” As the triangle expands, a simple system might think those lines were the borders of a lane. Poor lane marking along the gore might even make the vehicle think the new “lane” is a continuation of the lane the car is in, making the car try to drive that lane, right into the barrier.
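To make that failure mode concrete, here is a toy model with invented geometry (not Tesla’s actual lane-keeping algorithm): a naive lane keeper that steers toward the midpoint of the two painted lines it believes bound its lane. If one of those lines is really the diverging edge of the gore, the midpoint it chases drifts out of the travel lane and into the gore.

```python
def lane_center(left_x: float, right_x: float) -> float:
    """A naive lane keeper aims for the midpoint of its two tracked lines."""
    return (left_x + right_x) / 2.0

# Lateral offsets in meters of the lines the car believes bound its lane,
# sampled every 10 m of travel.  The left line continues straight, but the
# line the car has locked onto as its right boundary is actually the gore
# edge diverging away, while the true lane line is faded or missing.
left_line  = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
right_line = [3.6, 3.6, 4.5, 5.5, 6.5, 7.5]
true_lane_center = 1.8   # midpoint of an ordinary 3.6 m lane

for step, (l, r) in enumerate(zip(left_line, right_line)):
    target = lane_center(l, r)
    drift = target - true_lane_center
    print(f"{step * 10:3d} m: steering target {target:.2f} m ({drift:+.2f} m off the true center)")

# The steering target drifts steadily into the widening gore, and nothing in
# this simple rule ever flags the impossibly widening "lane" as suspicious.
```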
This video, made by Tesla owners near Indiana, shows a Tesla doing this when the right line of the gore is very washed out compared to the left. At 85/101 (the site of the recent Tesla crash) the lines are mostly stronger, but there is a 30-40 foot gap in the right line which perhaps could trick a car into entering and following the gore. The gore at 85/101 also lacks the chevron “do not drive here” stripes often found at these gores. The Tesla is not good at stationary objects like the crumple barrier, but the barrier’s warning stripes are something that should be in the car’s classification database.
Once again, the Tesla is just a smart cruise control. It is going to make mistakes like this, which is why Tesla tells you that you have to keep watching. Perhaps crashes like this will make people do that.
The NTSB is angry that Tesla released any information. I was not aware they frowned on this. This may explain Uber’s silence during the NTSB investigation there.