Tesla’s Latest Legal Triumph: Second Major Victory in US Autopilot Trials

Tesla has won its first US trial over allegations that its Autopilot driver-assistance feature led to a fatal car crash. It is the company's second major victory this year, following a win in an earlier case in April.

In April, a Los Angeles resident lost her legal battle against Tesla. She had sued the company, alleging that its Autopilot was defective and caused her Model S to swerve into a curb, leaving her with severe injuries. During the trial, Tesla argued that it tells drivers its cars require human monitoring despite having “autopilot” and “full self-driving” capabilities, and it convinced the jury that driver distraction was to blame for the crash. Though the Los Angeles verdict was not binding on other cases, it helped Tesla hone its legal strategy for future trials, potentially paving the way for yesterday’s win.

Yesterday, in Tesla’s second victory, a jury determined that Tesla’s Autopilot system was not responsible for a fatal 2019 car crash. The case centered on allegations that the Autopilot feature caused a driver’s Model 3 to swerve off the road and collide with a tree, killing the driver and severely injuring his two passengers. The win marks another major victory for the carmaker, which is facing several similar lawsuits across the United States.

Let’s Break Down the Case

Two juries have now determined that Tesla’s Autopilot software is not defective. Elon Musk, Tesla’s CEO, has promoted Autopilot and the more advanced Full Self-Driving system as the next crucial step in his company’s future. However, since their roll-out, Tesla has come under intense regulatory and legal scrutiny.

The complaint, filed on February 10, 2020, alleged that Tesla’s Autopilot system caused car owner Micah Lee’s Model 3 to veer off the road at 65 miles per hour on a street east of Los Angeles. Lee crashed into a palm tree and his car burst into flames, killing him on impact. Two passengers were also seriously injured, including an eight-year-old who was disemboweled.

The surviving passengers filed a lawsuit against Tesla, claiming that its Autopilot software caused the crash. Tesla denied liability, arguing that Lee had consumed alcohol before driving and that the crash was the result of human error.

Legal Arguments

Central to the plaintiffs’ case was a 2017 internal safety analysis by Tesla identifying an “incorrect steering command” as a defect, involving an “excessive” steering wheel angle. The plaintiffs’ attorney, Jonathan Michaels, argued that the report showed Tesla predicted this type of crash was possible, but created a protocol instructing employees not to accept liability for the problem. Tesla’s attorney, Michael Carey, countered that the report did not identify a defect, but was intended to address any theoretical issues that might arise in the vehicle.

The attorney for the passengers insisted that Tesla was aware of the issue but released the Autopilot system while it was still experimental in order to increase the company’s market share, prioritizing profit over human life. Tesla’s attorneys countered that the cause of the crash was human error, and they asked the jury not to award damages solely because the victims sustained severe injuries.

After four days of deliberations, the jury found in favor of Tesla by a 9-3 vote, declining to find that the company’s Autopilot software was defective or that the vehicle involved in the crash had a manufacturing defect.

Tesla’s Autopilot Capabilities

Tesla’s website advertises Autopilot as a suite of driver-assistance features that comes standard with the purchase of a new car. Under its “Full Self-Driving Capability” description, the site states: “Your vehicle will be able to drive itself almost anywhere with minimal driver intervention.”

Tesla’s Autopilot function includes features that “automatically control the car’s speed, following distance, steering and some other driving actions, such as taking exits off a freeway.” According to the Washington Post, a 2018 Tesla Model 3 contains warnings about the software’s limitations and urges drivers to always pay attention, with their hands on the wheel and eyes on the road.

Product Liability or Driver Responsibility?

Tesla faces its next legal battle over its Autopilot system in Florida. In 2019, driver Jeremy Banner crashed into a semi-truck at 70 mph and was killed on impact. His family sued Tesla after the collision, alleging that the Autopilot malfunctioned.

This case raises a similar question as its predecessors: Is Tesla liable when things go wrong in a vehicle guided by Autopilot, or is the driver solely responsible? Banner’s family argues that Tesla should share in the responsibility because “the company’s marketing of Autopilot exaggerates its capabilities, creating a false sense of complacency that can lead to deadly crashes.” Tesla claims its Autopilot software is “safer than a human-operated vehicle,” but is this declaration deadly?

The Washington Post analyzed federal data and concluded that more than 700 crashes involving vehicles guided by Tesla’s Autopilot, 19 of them fatal, have occurred since Autopilot’s introduction in 2014.

Yesterday’s win was a huge victory for Tesla, as it was the company’s first trial involving a fatal car accident. However, the company still has a long way to go in proving to consumers that its Autopilot system is safe. One of Tesla’s biggest arguments is that although it advertises “autopilot” and “self-driving” capabilities in its vehicles, the cars still require “human monitoring.”

These arguments raise interesting questions surrounding product liability and marketing. Is it contradictory and irresponsible for Tesla to advertise “self-driving” cars, or should drivers know to maintain control of their vehicle at all times regardless of Autopilot? According to the juries so far, the responsibility lies with the driver.

Want to know more?

Interested in reading up on these cases and others like them? Head on over to Trellis.law, where you can see the actual court documents and receive live updates on cases as they unfold. Multiple lawsuits surrounding Tesla and its Autopilot function are being litigated in the courts, and Trellis will help you keep track of them.

Sources:

https://www.reuters.com/legal/tesla-autopilot-fatality-case-soon-reach-california-jury-2023-10-24/

https://www.reuters.com/business/autos-transportation/tesla-wins-autopilot-trial-involving-fatal-crash-2023-10-31/

https://www.reuters.com/business/autos-transportation/tesla-braces-its-first-trial-involving-autopilot-fatality-2023-08-28/

https://www.claimsjournal.com/news/national/2023/09/01/319048.htm

https://www.washingtonpost.com/technology/interactive/2023/tesla-autopilot-crash-analysis/

https://www.tesla.com/support/autopilot