US regulator found Autopilot safety flaws in 2 million Tesla vehicles. (REUTERS)

Tesla Issues Recall for 2 Million Cars to Address Autopilot Safety Issues, Dealing a Major Setback to Elon Musk

Tesla Inc. is set to carry out its largest recall to date, covering over 2 million vehicles, after the leading US auto-safety regulator concluded that the company’s driver-assistance system, Autopilot, does not provide sufficient safeguards against misuse. The decision follows an extensive defect investigation that the National Highway Traffic Safety Administration intends to keep open in order to assess the effectiveness of Tesla’s proposed fixes. According to an NHTSA spokesperson, the investigation found that Tesla’s methods of ensuring driver engagement were inadequate.

“Automated technology holds great promise for improving safety, but only when implemented responsibly,” NHTSA said Wednesday. “Today’s action is an example of improving automated systems by prioritizing safety.”

Tesla said in its recall report that it plans to begin deploying over-the-air software that includes additional controls and alerts on or shortly after Dec. 12. The automaker’s shares were down 1.9% in New York trading at 10:00 a.m.

The recall is Tesla’s second this year involving its automated driving systems, which have come under increasing scrutiny after hundreds of crashes, some of them fatal. Although CEO Elon Musk has predicted for years that the automaker is close to full autonomy, both Autopilot and the beta features Tesla markets require a fully attentive driver with their hands on the wheel.

Autopilot has been a standard Tesla feature, so the recall affects most of the company’s vehicles on US roads. The system uses several cameras to monitor a vehicle’s surroundings, keep pace with surrounding traffic and help drivers stay in clearly marked lanes.

Tesla has been marketing higher-level functionality, which it calls FSD Beta, since late 2016. That feature package was recalled in February after NHTSA raised concerns about Teslas using the system in illegal or unpredictable ways, including exceeding speed limits and failing to yield or come to a full stop.

Late last year, Musk suggested on X, the social media platform formerly known as Twitter, that Tesla update FSD Beta to allow some drivers to disable alerts that tell them to put their hands on the steering wheel. NHTSA asked the company for more information days later.

NHTSA first opened an investigation of Autopilot after a fatal crash in 2016, but cleared the system early the following year. Its two ongoing defect probes, launched in August 2021 and February 2022, have involved Teslas crashing into first-responder vehicles and braking suddenly on highways.

The agency has also opened more than 50 special crash investigations involving Tesla cars in which Autopilot is suspected of playing a role, and its scrutiny has intensified under the Biden administration.

Regulatory scrutiny of Tesla’s driving systems extends beyond NHTSA. The company disclosed in January that it had received document requests from the Justice Department related to Autopilot and FSD Beta. Bloomberg also reported that month that the Securities and Exchange Commission was investigating Musk’s role in shaping Tesla’s self-driving claims.
