
How much can customers test robocars?


by Brad Templeton
07 June 2016




Reports from Tesla suggest they are gathering massive amounts of driving data from logs in their cars — 780 million miles of driving, and as much as 100 million miles in autopilot mode. This contrasts with the 1.6 million miles of test operations at Google. Huge numbers, but what do they mean now, and in the future?

As I’ve written before, testing is one of the largest remaining challenges in robocar development: how do you prove to yourself, and then to others, that you’ve reached the desired safety goals? Logging tons of miles is an important component. If car companies can get their customers to do the testing for them, that can be a significant advantage. (As I wrote last week, another group that can get others to do their testing is companies like Uber, and even operators of large commercial and taxi fleets.) Lots of miles mean lots of testing, lots of learning, and lots of data.
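
To give a sense of the scale involved, here is a rough back-of-envelope sketch of my own (only the mileage figures come from the article above). It assumes US human drivers average roughly one fatal crash per 100 million miles, and it uses the statistical “rule of three”: observe zero events over N miles, and the 95% upper confidence bound on the rate is about 3/N.

```python
# Back-of-envelope: miles of driving needed to show, with ~95% confidence,
# that a robocar's fatal-crash rate is no worse than a human driver's.
# Assumption: humans average roughly 1 fatal crash per 100 million miles.

HUMAN_FATAL_RATE = 1 / 100_000_000  # fatal crashes per mile (approximate US figure)

# Rule of three: with zero observed events in N miles, the 95% upper
# confidence bound on the rate is about 3 / N, so we need N >= 3 / rate.
miles_needed = 3 / HUMAN_FATAL_RATE
print(f"Miles with zero fatalities needed: {miles_needed:,.0f}")  # 300,000,000

# The mileage figures quoted above, for comparison:
print(f"Tesla fleet (all modes): {780_000_000:>13,}")
print(f"Tesla autopilot mode:    {100_000_000:>13,}")
print(f"Google test fleet:       {1_600_000:>13,}")
```

By that crude measure, even 100 million autopilot miles are not yet enough to say anything statistically meaningful about fatality rates, which is exactly why getting customers and fleets to do the driving is so attractive.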

Does Tesla’s quick acquisition of so many miles mean they have lapped Google? The short answer is no, but it suggests a significant threat since Google is, for now, limited to testing with its small fleet and team of professional testing drivers.

Tesla is collecting a lot less data from its cars than Google, orders of magnitude less. First of all, Tesla has far fewer sensors and no LIDAR, and to the best of my knowledge from various sources I have spoken to, Tesla is only collecting a fraction of what those sensors gather. Collecting everything they gather would mean a huge data volume, not something you would send over the cell network, and even over home wifi the upload would be very noticeable. Instead, reports suggest Tesla is gathering data only on incidents and on road features the car did not expect or did not handle well. However, nothing stops them from logging more in the future, though they might want approval from owners to use all that bandwidth.
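
For a sense of why full sensor logging is impractical over a cell connection, here is a rough sketch; the camera resolution, frame rate and compression ratio are illustrative assumptions of mine, not Tesla specifications.

```python
# Rough estimate of raw data volume from a single forward camera, to show
# why a fleet cannot realistically upload everything its sensors gather.
# All parameters below are illustrative assumptions, not Tesla specs.

WIDTH, HEIGHT = 1280, 960   # assumed camera resolution (pixels)
FPS = 30                    # assumed frame rate
BYTES_PER_PIXEL = 1         # assumed 8-bit monochrome imager

bytes_per_second = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS
gb_per_hour = bytes_per_second * 3600 / 1e9
print(f"Raw video: ~{gb_per_hour:.0f} GB per hour of driving")        # ~133 GB/hour

# Even with heavy (100:1) video compression, an hour of driving is still
# over a gigabyte, per car, before counting radar and ultrasonic data.
print(f"Compressed ~100:1: ~{gb_per_hour * 1000 / 100:.0f} MB per hour")
```

Multiply that by a fleet of tens of thousands of cars and it is clear why only short snippets around incidents and anomalies get uploaded.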

Tesla intends to make a car for people to buy today. As such, it has no LIDAR, because a car today, and even the autopilot, can be done without LIDAR. Tomorrow’s LIDARs will be cheap, but today’s production LIDARs for cars are either simple or expensive. So while the real production door-to-door self-driving car will almost certainly use LIDAR, Tesla is unable or unwilling to test and develop with it. (Of course, they can also argue that in a few years neural networks will be good enough to eliminate the need for LIDAR. That’s not impossible, but it’s a risky bet. The first cars must be built in a safety-obsessed way, and you’re not going to release a car less safe than you could have made it just to save what will by then be only a few hundred dollars of cost.)

As noted, Google has been doing its driving with professional safety drivers, who also record a lot of data about each drive from the human perspective, something ordinary drivers will never do. That isn’t 100 times better, but it’s pretty significant.

Tesla is also taking a risk, and this has shown up in a few crashes. Their customers are beta testing a product that’s not yet entirely safe. In fact, it was a pretty bold move to do this, and it’s unlikely the big car companies would have turned their customers into beta testers, at least not until forced to by Tesla. If they do, the big automakers have even more customers than Tesla, and they can rack up even more miles of testing and data gathering.

When it comes to training neural networks, ordinary drivers can provide a lot of useful data. That’s why Comma.ai, which I wrote about earlier, is even asking volunteers to put a smartphone on their dash, facing out, to gather more training data. At present this app does not do much, but it will not be hard to make one that offers features like forward collision warning and lane departure warning for free, paid for by the data it gathers.
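
As a sketch of what such a data-gathering app might do, here is a minimal logging loop that pairs dashcam frames with vehicle state so the frames can later serve as training examples. The read_speed_and_steering() helper is a hypothetical stand-in; this is an illustration of the idea, not Comma.ai’s actual app.

```python
# Minimal sketch of a crowdsourced dashcam logger: save camera frames
# alongside vehicle state so they can later be used as neural-net
# training data (e.g. to learn lane keeping or collision warning).
import csv
import time

import cv2  # OpenCV, for camera capture


def read_speed_and_steering():
    # Hypothetical: a real app would read GPS speed from the phone and,
    # if available, steering angle from an OBD-II/CAN dongle.
    return 0.0, 0.0


def log_drive(duration_s=60, out_prefix="drive"):
    cam = cv2.VideoCapture(0)  # phone or dash camera
    with open(f"{out_prefix}_labels.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["timestamp", "frame_file", "speed_mps", "steering_deg"])
        start = time.time()
        i = 0
        while time.time() - start < duration_s:
            ok, frame = cam.read()
            if not ok:
                break
            speed, steering = read_speed_and_steering()
            frame_file = f"{out_prefix}_{i:06d}.jpg"
            cv2.imwrite(frame_file, frame)  # the frame is the training input
            writer.writerow([time.time(), frame_file, speed, steering])  # the labels
            i += 1
    cam.release()


if __name__ == "__main__":
    log_drive()
```

Every mile an ordinary driver logs this way becomes labelled training data at essentially zero marginal cost to whoever collects it.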

Watch me on Dateline NBC: On Assignment

On Sunday, June 5, at 7 pm (Eastern and Pacific), the news show Dateline NBC ran a segment on self-driving cars featuring Sebastian Thrun, Jay Leno and myself. I sat down for several hours with Harry Smith, but who knows how much actual airtime will be shown. Here is the promo for the episode, and another more specific one.





Brad Templeton, Robocars.com, is an EFF board member, Singularity U faculty, a self-driving car consultant, and entrepreneur.




