NHTSA Demands Additional Records from Tesla in the Autopilot Safety Investigation

As part of an investigation into the safety of Autopilot, Tesla must provide extensive additional records to the National Highway Traffic Safety Administration or risk heavy fines.

If Tesla fails to supply the federal agency with information about its advanced driver assistance systems, which are marketed as Autopilot, Full Self-Driving and FSD Beta options in the U.S., the company faces “civil penalties of up to $26,315 per violation per day,” with a maximum of $131,564,183 for a related series of daily violations, according to the NHTSA.

The agency opened its investigation into the safety of Autopilot in 2021, after identifying a series of crashes in which Tesla vehicles using the system collided with stopped first responder vehicles and road maintenance vehicles.

To date, none of Tesla’s driver assistance systems are autonomous, and the company’s cars cannot function as robotaxis like those operated by Cruise or Waymo. Instead, Tesla vehicles require a driver behind the wheel, ready to steer or brake at any time. Autopilot and FSD only control braking, steering and acceleration in limited circumstances.

Among other details, the federal vehicle safety authority wants information on which versions of Tesla’s software, hardware and other components have been installed in each car that was sold, leased or in use in the U.S. from model years 2014 to 2023, as well as the date when any Tesla vehicle was “admitted into the ‘Full-Self Driving beta’ program.”

The company’s FSD Beta includes driver assistance features that have been tested internally but not fully debugged. Rather than relying solely on expert safety drivers, as is the industry norm, Tesla enlists its customers as software and vehicle safety testers through the FSD Beta program.

Tesla previously conducted voluntary recalls of its cars due to issues with Autopilot and FSD Beta and promised to deliver over-the-air software updates that would remedy the issues.

A notice on the NHTSA website in February 2023 said Tesla’s FSD Beta driver assistance system may “allow the vehicle to act unsafe around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution.”

NHTSA data shows that Tesla vehicles equipped with the company’s driver assistance systems have been involved in more fatal collisions, 21 in total, than those of any other automaker with comparable technology.

According to a separate letter sent Thursday, the NHTSA is also reviewing a petition from an automotive safety researcher, Ronald Belt, who asked the agency to reopen an earlier probe to determine the underlying causes of “sudden unintended acceleration” events that have been reported to the NHTSA.

In a sudden unintended acceleration event, a vehicle that is stopped or traveling normally abruptly accelerates forward without driver input, potentially causing a crash.

Tesla’s vice president of vehicle engineering, Lars Moravy, did not immediately respond to a request for comment.