A jury in Miami, Florida, has found Tesla partly responsible for a fatal 2019 crash involving its Autopilot system, ordering the electric vehicle maker to pay a significant share of $329 million in damages to the victim's family and a survivor. The award includes both compensatory and punitive damages.
The jury awarded $129 million in compensatory damages and $200 million in punitive damages, assigning Tesla 33% of the responsibility for the crash. Tesla's share of the compensatory damages therefore comes to roughly $42.5 million. Punitive damages in cases like this are typically capped at three times the compensatory award.
Plaintiffs’ attorneys told CNBC on Friday that because the punitive damages were assessed solely against Tesla, they expect the company to pay the full $200 million, bringing Tesla’s total financial obligation to approximately $242.5 million.
Tesla has announced plans to appeal the verdict.
The plaintiffs had asked the jury for an award based on total damages of $345 million. The trial began on July 14 in the U.S. District Court for the Southern District of Florida.
At the center of the lawsuit is the question of who was responsible for the deadly crash in Key Largo, Florida. The incident involved George McGee, a Tesla Model S owner who was driving while using Tesla’s Enhanced Autopilot system, a partially automated driving technology.
During the drive, McGee dropped his mobile phone and reached down to retrieve it, believing that Enhanced Autopilot would brake automatically if an obstacle appeared ahead. Instead, his Model S ran through an intersection at just over 60 miles per hour and struck a parked car and its owners, who were standing beside it.
Naibel Benavides, 22, died at the scene; her body was found approximately 75 feet from the point of impact. Her boyfriend, Dillon Angulo, survived but suffered multiple broken bones, a traumatic brain injury, and lasting psychological trauma.
“Tesla designed Autopilot only for controlled access highways yet deliberately chose not to restrict drivers from using it elsewhere, alongside Elon Musk telling the world Autopilot drove better than humans,” said Brett Schreiber, counsel for the plaintiffs, in an emailed statement on Friday. “Tesla’s lies turned our roads into test tracks for their fundamentally flawed technology, putting everyday Americans like Naibel Benavides and Dillon Angulo in harm’s way.”
Following the verdict, the plaintiffs’ families embraced each other and their legal team, and Angulo was "visibly emotional" as he hugged his mother, NBC reported.
Tesla responded to CNBC with the following statement:
“Today’s verdict is wrong and only works to set back automotive safety and jeopardize Tesla’s and the entire industry’s efforts to develop and implement life-saving technology. We plan to appeal given the substantial errors of law and irregularities at trial.
Even though this jury found that the driver was overwhelmingly responsible for this tragic accident in 2019, the evidence has always shown that this driver was solely at fault because he was speeding, with his foot on the accelerator – which overrode Autopilot – as he rummaged for his dropped phone without his eyes on the road. To be clear, no car in 2019, and none today, would have prevented this crash.
This was never about Autopilot; it was a fiction concocted by plaintiffs’ lawyers blaming the car when the driver – from day one – admitted and accepted responsibility.”
The verdict comes as Tesla CEO Elon Musk continues trying to convince investors that Tesla can become a leader in autonomous vehicles and that its self-driving systems are safe enough to support fleets of robotaxis on public roads across the United States.
Tesla shares dipped 1.8% on Friday and are now down roughly 25% for the year, the steepest decline among major tech stocks.
This verdict could set an important precedent for ongoing and future lawsuits over Tesla’s Autopilot. Roughly a dozen similar cases are currently underway, all focused on incidents in which Autopilot or Tesla’s Full Self-Driving (FSD) system was engaged just before a fatal or serious crash.
The National Highway Traffic Safety Administration (NHTSA) opened an investigation in 2021 into possible safety defects in Tesla’s Autopilot. During that inquiry, Tesla implemented multiple over-the-air software updates aimed at addressing identified issues.
A second probe, currently ongoing, is examining whether Tesla’s “recall remedy” effectively resolves Autopilot’s problematic behavior, especially in situations involving stationary first responder vehicles.
Moreover, the NHTSA has cautioned Tesla that its social media messaging may mislead drivers into believing Tesla vehicles are capable of fully autonomous driving, despite the company’s owner manuals explicitly stating that drivers must maintain hands-on control and full attention at all times.
According to TeslaDeaths.com, a site tracking Tesla-related crashes, there have been at least 58 deaths in incidents where Autopilot was activated shortly before impact.