TESLA AUTOPILOT LINKED TO 13 DEATHS, HUNDREDS OF CRASHES IN NEW INVESTIGATION

U.S. federal regulators are probing whether a December Tesla recall actually fixed the safety issues with Autopilot and FSD.

Even if you call your Level 2 Advanced Driver Assistance System (ADAS) Autopilot, and even if you clearly list what it can and cannot do, the system still needs close human monitoring, and it's on you to ensure drivers are actually providing it. That is the conclusion the National Highway Traffic Safety Administration (NHTSA) reached after wrapping up the investigation into Tesla's Autopilot that it opened in 2021. And it poses more questions and potential headaches for a technology Tesla CEO Elon Musk is staking the company's future on.

That just-concluded investigation, which examined nearly 1,000 crashes over five years, attributed hundreds of Tesla Autopilot incidents to driver misuse and the system's own lax approach to monitoring driver attention.

The agency examined 956 reported crashes that occurred between 2018 and August 2023 and found that misuse of the system caused 13 fatalities and "many more involving serious injuries." According to NHTSA's Office of Defects Investigation (ODI), "Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities," and "the prominence and scope of the system’s controls may be insufficient to prevent driver misuse."

In practice, this means the systems Tesla put in place to check that drivers were still paying attention to the road and keeping their hands on the wheel while the car was on Autopilot could be easily bypassed. In older Teslas, for instance, you could jam an orange or a ball into the steering wheel so that it applied enough pressure to trick the car into thinking your hands were still on the wheel. This was addressed in later models, but it was a big talking point back in 2018, around the time of the first fatal Autopilot crashes.

The NHTSA concluded that “this mismatch resulted in a critical safety gap between drivers’ expectations of [Autopilot’s] operating capabilities and the system’s true capabilities” and “this gap led to foreseeable misuse and avoidable crashes.”

Of the 956 crashes examined, officials identified Autopilot-related trends in about half of them.

"Of the remaining 467 crashes, ODI identified trends resulting in three categories," officials said, including "collisions in which the frontal plane of the Tesla struck another vehicle or obstacle with adequate time for an attentive driver to respond to avoid or mitigate the crash (211), roadway departures where Autosteer was inadvertently disengaged by the driver’s inputs (111), and roadway departures in low traction conditions such as wet roadways (145). ODI observed this pattern across all Tesla models and hardware versions."

This investigation was closed after a December recall in which Tesla modified Autopilot's functions.

But while this particular investigation has ended, Tesla is still under official scrutiny. On Thursday, the NHTSA launched another probe into Autopilot, which encompasses over 2 million Tesla vehicles and includes all five models sold by the manufacturer.

This new investigation is meant to ensure that the over-the-air “recall” fixes and updates that Tesla rolled out to its cars in December have done the job.


Meanwhile, issues surrounding the technology continue to make headlines. One Tesla driver was recently charged with vehicular homicide after their 2022 Model S, which they say had Autopilot on at the time, crashed into and killed a motorcyclist near Seattle.

According to the Associated Press, police have yet to confirm whether Autopilot was enabled at the time of the accident, but the driver admitted not only to having the automated driving system on but also to looking at their phone, not the road, when the crash happened.

That investigation is still ongoing, but even if it is confirmed that Autopilot was enabled, the chance that this lightens the driver's legal punishment is slim: you have to pay attention while driving, no matter how good your car's ADAS is.
