
Tesla’s Hidden Setting Lets Certain Drivers Use Autopilot and FSD Without Being Prompted to Keep Their Hands on the Wheel

Tesla can enable a hidden setting in its vehicles that allows a driver to use the company’s advanced driver assistance systems, marketed as Autopilot and Full Self-Driving, without keeping their hands on the steering wheel for extended periods. The discovery was made by the security researcher behind the handle “@GreentheOnly.”

When a Tesla vehicle has this mode enabled, it eliminates what owners refer to as the “nag.” The researcher has nicknamed the feature “Elon mode,” but he said that is not the company’s internal name for it.

Tesla does not offer a self-driving vehicle today. CEO Elon Musk has promised to deliver a self-driving car since at least 2016, and said a Tesla would be able to complete a demo drive across the United States without human intervention by the end of 2017.

Instead, a human driver must maintain focus and be prepared to steer or brake at any time when using Tesla’s driver assistance technologies.

When a Tesla driver is using Autopilot or FSD (or its derivatives), a visual symbol typically blinks on the vehicle’s touchscreen, prompting the driver to apply resistance to the steering wheel at regular intervals. If the driver does not grasp the steering wheel, the nag escalates to a beeping noise. If the driver still does not apply torque to the wheel at that point, the vehicle can temporarily disable the use of Autopilot for up to several weeks.

In a tweet last December, Elon Musk said he would remove the “nag” for at least some Tesla owners in January. That plan never came to fruition. In April 2023, Musk tweeted, “We are gradually reducing it, proportionate to improved safety,” in reference to the nags.

The security researcher who revealed “Elon mode,” and whose identity is known to both Tesla and CNBC, asked to remain pseudonymous, citing privacy concerns.

The Verge previously reported on “Elon mode.”

He has tested features of Tesla’s vehicles for years and owns a Tesla Model X. He has also consistently reported bugs to the company, earning tens of thousands of dollars from successful Tesla bug bounty filings, as previously reported.

The “white hat hacker” said in an interview via direct message on Tuesday that “unless you work at Tesla, or otherwise have access to relevant databases at the company,” there’s no way to know how many cars have “Elon mode” available today.

In February, Tesla issued a voluntary recall in the U.S. for 362,758 of its vehicles, warning that its Full Self-Driving Beta system could cause crashes. (It was the second such recall.) Tesla delivered an over-the-air software update to address the issues.

The FSD Beta system at that time could cause crashes, the safety recall report said, by allowing affected vehicles to: “Act unsafe around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution.”

GreentheOnly said he expects future recalls related to issues with FSD Beta and how well the system automatically stops for “traffic-control devices” like traffic lights and stop signs.

According to the most recent available data from the National Highway Traffic Safety Administration, Tesla has reported 19 incidents to the agency that resulted in at least one fatality and in which the company’s driver assistance systems were in use within 30 seconds of the collision.

In total, Tesla has reported to NHTSA 21 incidents involving vehicles equipped with its driver assistance systems that resulted in fatalities.

Tesla did not immediately respond to a request for comment.