Human error as a cause of vehicle crashes

by Bryant Walker Smith
07 February 2014



Source: Wonderlane.

Some ninety percent of motor vehicle crashes are caused at least in part by human error. This intuitive claim is a fine place to start discussions about the safety potential of vehicle automation. (It is not an appropriate place to end these discussions. After all, humans can be amazing drivers, the performance of advanced automation systems is still unclear, automated vehicles can be misused, and automation shifts some error from driver to designer.) And since the claim is often made without sufficient citation, I’ve compiled several relevant sources.

(1) The most thorough analysis of crash causation, the Tri-Level Study of the Causes of Traffic Accidents published in 1979, found that “human errors and deficiencies” were a definite or probable cause in 90-93% of the incidents examined. The executive summary is here (see page vii), and the two-part final report is here and here. The study continues to be cited, and the Venn diagram on this page provides a useful summary.

(2) A UK study published in 1980 (summarized here at 88-89) likewise identifies driver error, pedestrian error, or impairment as the “main contributory factor[]” in 95% of the crashes examined.

(3) Another US study published in 2001 (available here) found that “a driver behavioral error caused or contributed to” 99% of the crashes investigated.

(4) An annual UK crash report (available here with updated data here in table RAS50002) identifies 78 potential contributing factors, most of which suggest human error. However, multiple factors were recorded for some crashes, and none were recorded for others.

(5) NHTSA’s 2008 National Motor Vehicle Crash Causation Survey is probably the primary source for the common assertion by NHTSA officials that “[h]uman error is the critical reason for 93% of crashes” (at least if “human error” and “driver error” are conflated). The 93% figure is absent from the report itself (probably intentionally) but calculable from the totals given on page 24.

(Note that this 2008 report identifies a single “critical reason” for each crash analyzed. This critical reason, which is “often the last failure in the causal chain,” is “an important element in the sequence of events leading up to a crash” but “may not be the cause of the crash” and does not “imply the assignment of fault to a vehicle, driver, or environment, in particular.” Indeed, several NHTSA officials have told me that they deliberately avoid any reference to causation in this context.)

One final note: Human error, which could mean many things, should certainly encompass drinking and texting. Please don’t do either.

This article was originally posted on CIS Blog on 18/12/2013: Human error as a cause of vehicle crashes.




Bryant Walker Smith is an expert on the legal aspects of autonomous driving and a fellow at Stanford Law School.
