NTSB News Release for Self-Driving Vehicle Fatality Highlights Human Errors and Downplays the Need for Advanced Technology that Considers Human Fallibility
Tom Harvey, CSP
Allied Safety Associates
The March 2018 fatal crash in Tempe, AZ, in which a self-driving experimental vehicle struck a pedestrian, continues to offer lessons to safety professionals and to influence the public at large. The “Safety Matters” article on page 13 of the July 2018 edition of PSJ, referencing the news release from NTSB’s preliminary investigation into the crash, is a perfect example of NTSB spreading the wrong safety message. The first part of the news release goes to great lengths to describe a human being human: imperfect, like we all are. NTSB cites five “faults” of the pedestrian, all of which an alert driver should easily anticipate and none of which should be considered a primary reason the person was killed. Yet that is where the presentation of the preliminary findings implies the blame belongs.
People wear dark clothes, push bicycles with missing reflectors, jaywalk in disregard of roadway signs (here, one instructing them to use a crosswalk 120 yards north) and have residual intoxicants in their systems. All of these “human factors” should reasonably be expected, but were not. Conversely, 1) the “smart” car detected the pedestrian six seconds before impact (roughly 120 yards at 43 mph) but misinterpreted the data inputs; 2) the car’s factory-equipped automatic emergency braking feature was disabled; and 3) the system depended on an in-car human operator, tasked with other duties while “operating” the vehicle, to intervene. These are critical design and engineering flaws. Further, the human operator was monitoring the self-driving interface, was not alerted to the impending danger and thus did not begin braking until after impact with the pedestrian. Unsafe actions by the pedestrian victim should not divert attention from the true causes: multiple engineering and system-design failures that did not account for reasonable human activity.
NTSB misleads the public by presenting these human factors first, as faults, when they are normal characteristics of human nature. The serious design flaws introduced later are exponentially more significant. Who among us would accept this inverted rationale if the victim were a loved one? How much longer will we accept “human error” as a leading cause, when to be human is to err? Safety strategies that try to “fix” human behaviors before fixing broken organizational systems, such as poor design and engineering, will continue to meet with marginal results. When organizations focus first on operational safety systems, human behavior naturally benefits and risks are mitigated. NTSB may not promote this strategy, but product liability settlements do: knowing that the smart-car company, not the human victim, would be held to full account, the parties settled the case within a week. Advanced safety technologies must account for human fallibility. NTSB should align with these realities, not against them.
(2018, June). NTSB Investigating Fatal Crash Involving Self-Driving Test Vehicle. Professional Safety Journal, Vol. 63, No. 7, p. 13.
(2018, May 24). Preliminary Report Released for Crash Involving Pedestrian, Uber Technologies, Inc., Test Vehicle. Retrieved from https://www.ntsb.gov/news/pressreleases/Pages/NR20180524.aspx
Additional information obtained from various news sources via the internet.