The noose is tightening around Autopilot in the United States as the NHTSA investigation advances. A massive recall, going beyond a simple over-the-air update, could follow.
The US federal road safety agency (NHTSA, for National Highway Traffic Safety Administration) is putting pressure on Tesla. Last week, it upgraded the status of the Autopilot investigation to an "engineering analysis."
A few days earlier, the NHTSA had given Tesla a deadline to provide answers about its phantom braking incidents. But the agency also wanted to investigate additional accidents: several Teslas have struck vehicles stopped at the scene of road accidents.
Tesla: US authorities are cracking down on Autopilot
The NHTSA had explained that it wanted to "explore the degree to which Tesla's Autopilot and related systems may exacerbate human factors or behavioral safety risks by reducing the effectiveness of driver supervision." More concretely, the agency sought to determine whether the system poses an additional risk in dangerous situations.
And the first findings seem to support this hypothesis, since the NHTSA has identified a clear pattern. In the 16 accidents it studied, the federal agency reports having found that Autopilot "aborted vehicle control less than one second prior to the first impact."
A feature to clear Tesla of accidents?
On paper, this means the drivers were in control of the vehicle at the moment of the collision. But the deactivation came far too late for the people behind the wheel to actually react.
The system would need to disengage at least three or four seconds before impact for that to be the case. That would leave time for a human reflex; one second is too short to react.
This pattern is, however, consistent with Tesla's official reports after such accidents. After studying the data from the black boxes, the company consistently maintains that Autopilot was not active at the moment of impact.
But drivers' accounts very often differ, insisting that the car was operating on Autopilot. These incidents come at a time when the company is facing complaints over other matters. That was the case for a Parisian taxi driver whose Tesla allegedly stopped responding and refused to brake while on the road. He decided to take Elon Musk's brand to court, filing a complaint for endangering the lives of others.
The NHTSA investigation could thus lead to a massive recall for Tesla, more complex than its usual ones. Indeed, if it is the agency that mandates the recall, the fix must be carried out under the supervision of its staff. In that case, a remote over-the-air update would no longer be an option.