
Tesla’s Autopilot ‘tricked’ to operate without driver

Tesla's Autopilot can be easily tricked into operating without a driver, according to a Consumer Reports test. This discovery follows a fatal crash in Texas, where no driver was believed to be present. The findings raise concerns about the safety and reliability of Tesla’s driver assistance system.


Tesla Autopilot Can Be Easily Tricked, Consumer Reports Finds

Tesla's Autopilot feature, designed to assist drivers, has been found to be easily tricked into operating without a driver, according to Consumer Reports.

Engineers from the magazine tested the Tesla Model Y on a closed track, and concluded that the system could be easily fooled into driving with no one in the driver's seat. The report follows a fatal crash in Texas, where authorities believe no one was in the driver’s seat.

Insufficient Safeguards

The Consumer Reports team stated that they were able to repeatedly trick the car into driving around the track without a driver. Jake Fisher, the auto testing director at Consumer Reports, said, “The system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver at all.” Fisher added, “It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient.”

Tesla’s Autopilot system is advertised as an advanced driver assistance system, intended to “enhance safety and convenience behind the wheel.” However, Tesla's official website states that the system requires a "fully attentive driver" and is not meant to make the car autonomous.

Tesla's Safety Requirements

To use Autopilot, Tesla requires the driver to keep their hands on the wheel, wear a seatbelt, and keep the doors closed. Despite this, Consumer Reports showed how easily these safeguards could be bypassed.

Crash Investigations

The findings come after a fatal crash in Texas, where two men were killed when their Tesla crashed into a tree and caught fire. Police believe no one was in the driver's seat. However, Elon Musk, CEO of Tesla, denied that Autopilot was enabled during the crash. He tweeted that data logs show the system was not in use and added that standard Autopilot requires lane markings to function, which the street lacked.

In light of the incident, two US Democratic Senators sent a letter to the National Highway Traffic Safety Administration (NHTSA) requesting an investigation into the crash. The NHTSA has opened investigations into 28 crashes involving Tesla vehicles, though Tesla has not yet responded to media requests for comment.

This revelation raises concerns about the safety and reliability of Tesla's Autopilot system, particularly when it comes to preventing accidents involving driverless operation.


Source: BBC

2 min read
Apr 24, 2021
By L. F.
