
Tesla ordered to pay $329 million in damages after fatal Autopilot crash, jury rules


A Miami jury has found that Tesla must pay $329 million in damages over a 2019 crash involving its Autopilot system that killed one person and seriously injured another. The award includes $129 million in compensatory damages and $200 million in punitive damages.

The trial, held in federal court in the Southern District of Florida, began on July 14 and centered on how responsibility for the fatal incident in Key Largo, Florida, should be divided between Tesla and the driver. George McGee, the owner of the Tesla Model S, was using the company's Enhanced Autopilot, a partially automated driving feature, when he dropped his phone and reached for it. He said he believed the system would brake automatically if an obstacle appeared. Instead, his car accelerated through an intersection at more than 60 mph, striking a parked vehicle and its occupants. Naibel Benavides, 22, died at the scene; her body was found 75 feet from the point of impact. Her boyfriend, Dillon Angulo, survived but suffered multiple fractures, a traumatic brain injury, and lasting psychological trauma.

Plaintiffs' attorneys argued that Tesla misled the public by promoting Autopilot as safer than human drivers even though the system was designed for controlled-access highways. They contended that Tesla failed to restrict Autopilot use outside highway environments and that Elon Musk's public statements created a false sense of safety. "Tesla's lies turned our roads into test tracks for their fundamentally flawed technology," attorney Brett Schreiber said in a statement. "Everyday Americans like Naibel Benavides and Dillon Angulo were put in harm's way."

The verdict brought emotional reactions from the victims' families, who embraced each other and their legal team. Angulo was visibly moved as he hugged his mother.

Tesla plans to appeal, calling the verdict "wrong" and warning that it could hinder progress in automotive safety technology. The company reiterated that its vehicles require active driver supervision and that Autopilot is not a fully autonomous system.

The verdict comes amid growing scrutiny of Tesla's self-driving ambitions. CEO Elon Musk is pushing to position Tesla as a leader in autonomous vehicles, including plans for robotaxi fleets. Tesla's stock fell 1.5% on the news and is down 25% for the year, the worst performance among major tech companies.

The case may influence other pending lawsuits: roughly a dozen similar cases are currently active, all involving crashes that caused death or injury while Autopilot or Full Self-Driving (FSD) was engaged. The National Highway Traffic Safety Administration (NHTSA) has been investigating Tesla's Autopilot since 2021 and has opened a second probe to assess whether recent software updates effectively addressed safety concerns, particularly the system's response to stationary emergency vehicles. The agency has also warned Tesla that its social media messaging may mislead drivers into believing the cars are capable of full autonomy. TeslaDeaths.com, a website tracking Tesla-related collisions, reports at least 58 deaths in which Autopilot was active just before impact.

The outcome of this case could have significant implications for how automakers market driver-assistance technologies and how regulators oversee them.
