Tesla will recall more than two million vehicles over concerns that its Autopilot driver-assistance system could be misused by drivers, after US auto safety regulators spent two years investigating crashes involving the technology.
The recall affects certain Tesla Model 3, S, X and Y vehicles sold between 2012 and 2023, the National Highway Traffic Safety Administration (NHTSA) said. The regulator warned that Tesla’s Autopilot system may not have sufficient controls to prevent “driver misuse.”
According to the report, the risk of a crash increases when Tesla’s Autopilot is engaged and the driver does not maintain responsibility for operating the vehicle or is unprepared to intervene. The findings were based on a review of 956 crashes in which Autopilot may have been in use.
Tesla’s Autopilot is designed to let cars steer, accelerate and brake automatically within their lane, while Enhanced Autopilot can assist with lane changes on highways; neither feature makes the cars autonomous.
The regulator said in a statement: “Automated technologies promise to improve safety, but only if used responsibly. This action is an example of improving automated systems by prioritizing safety.”
It is unclear whether the recall will affect Tesla vehicles in the UK. All models are sold with Autopilot as standard, but UK legislation prohibits autonomous vehicles from being used on the road unless they pass approved tests.
Tesla said it disagreed with the regulator’s findings but will send an over-the-air software update that will “include additional controls and alerts in addition to those already present in affected vehicles to further assist drivers” whenever Autosteer is activated.
The update will also eventually prevent a driver from using Autosteer if they “repeatedly fail to demonstrate a consistent and sustained sense of driving responsibility while the feature is enabled,” Tesla added.
In October, Tesla announced that it had received a subpoena from the US Department of Justice over its Full Self-Driving (FSD) and Autopilot systems.
Bryant Walker Smith, a law professor at the University of South Carolina, told Reuters that a purely software solution would be very limited. The recall “really puts more responsibility on human drivers rather than on a system that allows for this type of abuse,” he said.
Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University who studies the safety of self-driving vehicles, called the software update a compromise that does not address the lack of night-vision cameras to monitor drivers’ eyes or Teslas’ inability to detect obstacles and stop for them. “This compromise is disappointing because it does not address the problem of older cars not having sufficient driver monitoring equipment,” he said.
Professor Koopman and Michael Brooks of the Center for Auto Safety both criticized Tesla’s software update. “It doesn’t get to the bottom of what the investigation is looking at,” Brooks said. “It doesn’t answer the question: why aren’t Teslas with Autopilot detecting and responding to emergencies?”
NHTSA said the investigation remains open as it monitors the effectiveness of Tesla’s remedies. Since mid-October, Tesla and NHTSA have held several meetings to discuss the regulator’s findings about potential driver misuse and Tesla’s proposed software fixes in response.
Additional reporting by Reuters
Source: i News
