Tesla is recalling a total of 1,610,105 examples of the Model 3, Model Y, Model S and Model X in China due to issues with steering software and door-locking systems.

    According to the Chinese State Administration for Market Regulation (SAMR), the recall covers not only Chinese-produced Model 3 and Model Y vehicles, but also 7538 examples of the Model S and Model X imported into the country.

    “For vehicles within the scope of this recall, when the automatic assisted steering function is turned on, the driver may misuse the level two combined assisted driving function, increasing the risk of vehicle collision and posing a safety hazard,” said the company in its Chinese recall notice.

    Separately, Tesla is also recalling the imported Model S and Model X vehicles as their doors could unlatch during a crash.

    All of the affected Model 3, Model Y, Model S and Model X vehicles will be fixed via an over-the-air (OTA) software update, even though the action is officially classified as a recall.

    At this stage no recall for similar issues has been issued in Australia.

    This particular issue in China comes less than a month after Tesla recalled two million cars in the US due to Autopilot software issues.

    A recent software update, version 2023.44.30.1, included a change called Autopilot Suspension.

    “For maximum safety and accountability, use of Autopilot will be suspended if improper usage is detected,” the release notes read.

    “Improper usage is when you, or another driver of your vehicle, receive five ‘Forced Autopilot Disengagements’.

    “A disengagement is when the Autopilot system disengages for the remainder of a trip after the driver receives several audio and visual warnings for inattentiveness.

    “Driver-initiated disengagements do not count as improper usage and are expected from the driver.”

    If a Tesla detects a driver is misusing the Autopilot system, the feature isn’t simply deactivated for the remainder of that drive.

    “Autopilot features can only be removed per this suspension method and they will be unavailable for approximately one week,” says Tesla in the release notes.
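    Taken together, the release notes describe what amounts to a strike counter with a timed lockout. The sketch below is one hypothetical reading of that policy in Python: the five-strike limit, the exemption for driver-initiated disengagements and the roughly one-week duration come from the notes, but every name and structural detail is invented for illustration and is not Tesla’s actual software.

```python
from datetime import datetime, timedelta

# Figures quoted from the release notes; names and structure are illustrative.
FORCED_DISENGAGEMENT_LIMIT = 5            # five strikes triggers a suspension
SUSPENSION_DURATION = timedelta(weeks=1)  # "approximately one week"

class AutopilotSuspensionPolicy:
    """Hypothetical model of the suspension behaviour the notes describe."""

    def __init__(self) -> None:
        self.forced_disengagements = 0
        self.suspended_until = None

    def record_disengagement(self, driver_initiated: bool, now: datetime) -> None:
        # Per the notes, driver-initiated disengagements do not count.
        if driver_initiated:
            return
        self.forced_disengagements += 1
        if self.forced_disengagements >= FORCED_DISENGAGEMENT_LIMIT:
            self.suspended_until = now + SUSPENSION_DURATION
            self.forced_disengagements = 0

    def autopilot_available(self, now: datetime) -> bool:
        return self.suspended_until is None or now >= self.suspended_until
```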

    Tesla’s Autopilot is considered a Level 2 driver-assistance technology under the levels of driving automation defined by the Society of Automotive Engineers (SAE).

    This means Tesla’s Autopilot system can steer, brake and accelerate by itself, but it still requires the driver to keep their hands on or near the steering wheel and remain alert to the driving situation.

    The driver should intervene if they believe the car is about to do something illegal or dangerous.

    To determine if the driver has proper control, Tesla measures torque applied to the vehicle’s steering wheel by the driver.

    If the car detects the driver is not holding the wheel for long periods of time, the system will prompt the driver to take over.
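    In principle, a hands-on check like this reduces to comparing measured steering-column torque against a small threshold and starting a countdown whenever no torque is detected. The rough Python sketch below illustrates the idea; the torque threshold, timeout and function are all invented for illustration and are not Tesla’s actual values or code.

```python
# Illustrative hands-on-wheel monitor; both thresholds are invented, not Tesla's.
TORQUE_THRESHOLD_NM = 0.3   # minimum steering torque treated as "hands on wheel"
HANDS_OFF_TIMEOUT_S = 15.0  # seconds without torque before prompting the driver

def update_attention_monitor(torque_nm: float, hands_off_s: float,
                             dt_s: float) -> tuple[float, bool]:
    """Return the updated hands-off timer and whether to prompt the driver."""
    if abs(torque_nm) >= TORQUE_THRESHOLD_NM:
        return 0.0, False                 # torque detected: reset the timer
    hands_off_s += dt_s                   # accumulate hands-off time
    return hands_off_s, hands_off_s >= HANDS_OFF_TIMEOUT_S
```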

    Despite this, the US National Highway Traffic Safety Administration (NHTSA) has found that “the prominence and scope of the feature’s controls may not be sufficient to prevent driver misuse of the SAE Level 2 advanced driver-assistance feature”.

    Autopilot has been the subject of an extensive, long-running investigation by the NHTSA, which has opened more than 36 investigations into Tesla crashes, 23 of which involved fatalities.

    A former Tesla employee recently claimed the US electric vehicle (EV) manufacturer’s self-driving technology is unsafe for use on public roads.

    As reported by the BBC, Lukasz Krupski said he has concerns about how AI is being used to power Tesla’s Autopilot self-driving technology.

    “I don’t think the hardware is ready and the software is ready,” said Mr Krupski.

    “It affects all of us because we are essentially experiments in public roads. So even if you don’t have a Tesla, your children still walk in the footpath.”

    Mr Krupski claims he found evidence in company data suggesting that requirements for the safe operation of vehicles with a certain level of semi-autonomous driving technology had not been followed.

    These developments also come as a US judge ruled there is “reasonable evidence” that Tesla CEO Elon Musk and other managers knew about dangerous defects with the company’s Autopilot system.

    A Florida lawsuit was brought against Tesla after a fatal 2019 crash in which the Autopilot system on a Model 3 failed to detect a truck crossing in front of the car.

    Stephen Banner was killed when his Model 3 crashed into an 18-wheeler truck that had turned onto the road ahead of him, shearing the roof off the Tesla.

    Despite this, Tesla had two victories in Californian court cases in 2023.

    Micah Lee’s Model 3 was alleged to have suddenly veered off a highway in Los Angeles at 65mph (104km/h) while Autopilot was active, striking a palm tree and bursting into flames, all within a span of a few seconds.

    Additionally, Tesla defeated a lawsuit brought by Los Angeles resident Justine Hsu, who claimed her Model S swerved into a kerb while Autopilot was active.

    In both cases, Tesla was cleared of any wrongdoing.

    MORE: New Tesla update cracks down on Autopilot misuse

    Jack Quick

    Jack Quick is an automotive journalist based in Melbourne. Jack studied journalism and photography at Deakin University in Burwood, and previously represented the university in dance nationally. In his spare time, he loves to pump Charli XCX and play a bit of Grand Theft Auto. He’s also the proud owner of a blue, manual 2020 Suzuki Jimny.
