Former Tesla employee Lukasz Krupski has claimed the US electric vehicle (EV) manufacturer’s self-driving technology is unsafe for use on public roads.

    As reported by the BBC, Mr Krupski said he has concerns about how AI is being used to power Tesla’s Autopilot self-driving technology.

    “I don’t think the hardware is ready and the software is ready,” said Mr Krupski.

    “It affects all of us because we are essentially experiments in public roads. So even if you don’t have a Tesla, your children still walk in the footpath.”

    Mr Krupski claims he found evidence in company data that suggested requirements for the safe operation of cars with a certain level of semi-autonomous driving technology hadn’t been followed.

    He also said Tesla employees had spoken to him about cars phantom braking, i.e. braking suddenly and unexpectedly despite no obstacle being ahead.

    Mr Krupski was the whistleblower who leaked more than 100GB of Tesla internal data to German newspaper Handelsblatt earlier this year.

    He claims his attempts to highlight his concerns internally were ignored, and he felt compelled to share what he had found with data protection authorities.

    Tesla’s Autopilot system is a Level 2 autonomous driving system that can assist with driving by adjusting the vehicle’s speed and its position within a lane, although a person is required in the driver’s seat with their hands on the steering wheel.

    This latest development comes as a US judge ruled there is “reasonable evidence” that Tesla CEO Elon Musk and other managers knew about dangerous defects with the company’s Autopilot system.

    The Florida lawsuit was brought against Tesla after a fatal crash in 2019, where the Autopilot system on a Model 3 failed to detect a truck crossing in front of the car.

    Stephen Banner was killed when his Model 3 crashed into an 18-wheeler truck that had turned onto the road ahead of him, shearing the roof off the Tesla.

    Despite this, Tesla had two victories in Californian court cases earlier this year.

    A jury found Tesla’s Autopilot technology didn’t cause a fatal crash.

    Micah Lee’s Model 3 was alleged to have suddenly veered off a highway in Los Angeles while travelling at 65mph (104km/h), striking a palm tree and bursting into flames, all in the span of a few seconds.

    According to Reuters, the plaintiffs alleged the Model 3’s Autopilot system was engaged at the time and caused the crash that killed Mr Lee and seriously injured two passengers, including a then eight-year-old boy who was disembowelled.

    Additionally, Tesla won a lawsuit brought by Los Angeles resident Justine Hsu, who claimed her Model S swerved into a kerb with Autopilot active.

    Jurors told Reuters they believed driver distraction was to blame, and that Tesla had clearly warned its Level 2 system was not driverless technology.

    The US National Highway Traffic Safety Administration (NHTSA) has opened more than 36 investigations into Tesla crashes, with 23 of these crashes involving fatalities.

    MORE: Tesla blames former employees for massive data leak
    MORE: US judge finds Tesla knew about dangerous Autopilot defect

    Jack Quick

    Jack Quick is an automotive journalist based in Melbourne. Jack studied journalism and photography at Deakin University in Burwood, and previously represented the university in dance nationally. In his spare time, he loves to pump Charli XCX and play a bit of Grand Theft Auto. He’s also the proud owner of a blue, manual 2020 Suzuki Jimny.
