Tesla has issued a recall for 362,758 vehicles in the US equipped with its Full Self-Driving Beta, specifically for an issue with the Autosteer on City Streets feature.
“In certain rare circumstances and within the operating limitations of FSD Beta, when the feature is engaged, the feature could potentially infringe upon local traffic laws or customs while executing certain driving maneuvers in the
following conditions before some drivers may intervene,” said the Tesla recall notice published by the National Highway Traffic Safety Administration (NHTSA).
These concerns revolve around the following driving scenarios:
- travelling or turning through certain intersections during a stale yellow traffic light (i.e. when the light is already yellow and the vehicle could have safely stopped)
- the perceived duration of the vehicle’s static position at certain intersections with a stop sign, particularly when the intersection is clear of any other road users
- adjusting vehicle speed while travelling through certain variable speed zones, based on detected speed limit signage and/or the vehicle’s speed offset setting that is adjusted by the driver
- negotiating a lane change out of certain turn-only lanes to continue travelling straight
The NHTSA said it advised Tesla on January 25 it had identified potential concerns related to FSD Beta’s operation in these environments, and requested the automaker issue a formal recall.
Subsequent to this, the NHTSA and Tesla met “numerous times” to discuss the issues and proposed over-the-air improvements.
While Tesla didn’t agree with the NHTSA’s analysis, it decided to conduct a voluntary recall “out of an abundance of caution”.
Tesla has identified 18 warranty claims that may be related to the conditions described above, but isn’t aware of any resulting injuries or deaths.
The company’s Autopilot feature, also a Level 2 driver-assistance technology, has been under investigation by the NHTSA.
The agency is undertaking a probe into 830,000 Tesla vehicles to investigate an issue that could see them crash into parked emergency vehicles.
At least 14 Tesla vehicles have crashed into emergency vehicles while using Autopilot, and since 2016 the NHTSA has investigated a total of 35 crashes in which either Autopilot or FSD was in use, resulting in 19 deaths.
In another probe of 416,000 vehicles, the NHTSA is also investigating claims of phantom braking in Tesla Model 3 and Model Y vehicles.
Like Autopilot, Tesla says on its website that FSD Beta requires a “fully attentive driver who has their hands on the wheel and is prepared to take over at any moment”.
To enable Autopilot, Tesla says the driver needs to agree to keep their hands on the steering wheel at all times and to always “maintain control and responsibility” for the vehicle, though the company has a time-lapse video on its website showing the system being used without the driver having their hands on the wheel.
In addition to the NHTSA probes, which have yet to lead to any enforcement actions, the US automaker is reportedly the subject of a criminal investigation by the US Department of Justice over its Autopilot system.
Prosecutors in Washington D.C. and San Francisco are reportedly examining whether Tesla misled consumers, investors, and regulators by making unsupported claims about the capability of its driver assist technology.