Engineers Claim Tesla Failed to Address Autopilot Issues After Fatal Accident

According to engineers at Tesla Inc., the company did not address the limitations of its Autopilot system after a 2016 crash in Florida that killed a driver. Their testimony has come to light in a lawsuit filed by the family of a driver killed in a strikingly similar collision in 2019, a case now set to be decided by a jury.

The electric car maker made no changes to its driver-assistance technology to account for crossing traffic in the nearly three years between the two high-profile accidents, both of which killed Tesla drivers whose cars crashed into the side of trucks, according to depositions given by several engineers.

After years of touting self-driving as the way of the future, Tesla and CEO Elon Musk are under legal pressure from consumers, investors, regulators and federal prosecutors who question whether the company has overstated its progress toward self-driving vehicles over the last eight years.

Tesla is also in the crosshairs of multiple investigations by the National Highway Traffic Safety Administration into possible Autopilot failures linked to at least 17 deaths since June 2021.

Musk vs. the experts

The trial, set for October and the first for the company over a death blamed on Autopilot, pits Musk’s repeated claims that Teslas are the safest cars ever made against technology experts who are expected to testify that the company’s marketing has lulled drivers into a false sense of security.

Musk was excused by a Florida judge from being questioned in the case last year. But the billionaire CEO is “hands-on,” “very involved in defining the product” and “very involved in making certain decisions about how things should work” with Autopilot, according to 2020 testimony by Tesla’s former head of Autopilot software, Christopher “CJ” Moore, cited in the family’s amended complaint.

Lawyers representing Tesla did not immediately respond to requests for comment.

The automaker says it has been transparent about Autopilot’s limitations, including the challenge of detecting traffic crossing in front of its cars. Tesla warns in its owner’s manual and on in-car displays that drivers must remain alert and ready to take control of the vehicle at any time.

Tesla won its first trial over a non-fatal Autopilot crash earlier this year, when a Los Angeles jury cleared the company of wrongdoing after a woman claimed her Model S’s driver-assistance feature caused it to veer into the center median of a city street.

Tractor trailer

The case, set to go to a jury in Palm Beach County, Florida, was brought by the family of Jeremy Banner, a 50-year-old father of three who had engaged Autopilot 10 seconds before his Model 3 crashed into the underside of a tractor-trailer in 2019. A National Transportation Safety Board investigation found that Banner probably didn’t see the truck crossing the two-lane highway on his way to work. Autopilot apparently didn’t see it either.

Despite the company’s knowledge that cross traffic “exists or is possible,” Autopilot “was not designed to detect it” at the time, according to 2021 deposition testimony from company engineer Chris Payne presented in the lawsuit. Engineer Nicklas Gustafsson gave a similar account in a 2021 deposition.

Last week, Banner’s widow amended her complaint to seek punitive damages, raising the stakes for Tesla in the lawsuit. She claims the company should have reprogrammed Autopilot to turn off in dangerous conditions after Tesla driver Joshua Brown crashed into the side of a truck in 2016.

“There is evidence on record that Defendant Tesla engaged in willful misconduct and/or gross negligence in selling a vehicle equipped with an Autopilot system that Tesla knew was defective and that it knew caused a previous fatal accident,” Banner’s family said in the amended complaint.

One of the expert witnesses called by the Banner family is Mary “Missy” Cummings, who recently served as a consultant to the National Highway Traffic Safety Administration. Cummings, a Duke University professor and vocal critic of Autopilot, told the court that Tesla “is guilty of willful misconduct and gross negligence” for failing to test and improve Autopilot between the Brown and Banner crashes.

Tesla made “public statements that its Autopilot technology is far more powerful than it actually is,” Cummings wrote.

Trey Lytal, an attorney representing the Banner family, did not immediately respond to a request for comment.
