Report: Tesla Autopilot Crash Driver Ignored Warnings

by John Lister

A man who was killed when his semi-autonomous Tesla crashed had ignored multiple warnings issued by the vehicle, newly released documents show. The papers also suggest that subsequent changes to the way the car's Autopilot feature works should prevent a repeat of the incident.

Tesla's Autopilot is a brand name covering technology that's often classed as "driver assistance" rather than strictly self-driving. At the time, the feature comprised a cruise control that kept the car at a constant speed while taking account of nearby vehicles, and an auto-steer function that kept the car within its lane.

Joshua Brown died last year after crashing into a tractor-trailer at more than 70 miles per hour. The crash drew media attention as it was the first known fatal crash involving a car that used at least some form of self-driving technology. (Source: ntsb.gov)

The Washington Post notes that Tesla vehicles clocked up more than 130 million miles of driving under the Autopilot feature before the first fatal crash. By comparison, the United States suffers a fatal crash every 100 million miles driven, according to the Insurance Institute for Highway Safety. (Source: washingtonpost.com)

No Conclusions In Case

The National Transportation Safety Board has now issued a docket of documents containing the facts it gathered about the case. The docket doesn't contain any analysis, conclusions or recommendations; these will follow once the investigation is complete.

The documents show Brown drove for 41 minutes before the crash, with the Autopilot feature engaged for 37 of those minutes. While using Autopilot, Brown consistently kept his hands off the wheel, against Tesla's advice.

13 Warnings Before Crash

During this time, the car gave six audible warnings that Brown needed to put his hands back on the wheel. The car dashboard also gave seven visual warnings.

Since the crash, Tesla has made several changes to Autopilot. It now issues a warning if the driver has their hands off the wheel for more than a minute while traveling at 45 miles per hour or more; if the driver ignores three such warnings within a one-hour period, Autopilot switches off and can't be reactivated until the car has been stopped and parked.

Tesla has also removed the option to set Autopilot to a speed above the prevailing speed limit, the only exception being on freeways where a barrier separates lanes running in opposite directions.

What's Your Opinion?

Is driver assistance technology safe enough for use on public roads? Should Tesla have made it impossible to ignore safety warnings from the beginning? Is there any point in having cars that can at least partially drive themselves if drivers still have to keep their hands on the wheel?


Comments

Dennis Faas

In hindsight, I think Tesla could have done more to implement safety features that might have prevented the crash (as it has now done), rather than relying on audible alerts and dashboard warnings.

As for whether cars should only be able to partially drive themselves, I think this is more of a legal question. If cars were completely autonomous and a crash occurred, car companies would likely be held liable, with plenty of bad press and their safety called into question. If a car is only 'partially' autonomous, with the onus on the driver, then car companies are essentially off the hook. I think it will remain that way until it can be proven (if ever) that self-driving cars can never crash, though I'm not sure how that will pan out with a mix of autonomous and non-autonomous cars on the road.

ecash

WHO is responsible..

BEST thing they could have done... STOP the CAR and turn all warning lights on... because the IDIOT ISN'T PAYING ATTENTION.

I want an autonomous car... REALLY... I want to BLAME those responsible for the programming for my death.

I'm not driving, so I don't need a license or insurance.