The culpability of drivers using semi-autonomous technology is set to be tested in an Australian court for the first time after a Tesla owner who claimed the car was on Autopilot was committed to stand trial following a serious accident.
Twenty-four-year-old Sakshi Agrawal will face four charges in the Victorian County Court after a magistrate ruled on Monday that there was enough evidence to support a conviction.
The charges, which include driving in a dangerous manner and failing to assist after a collision, stem from a crash in Melbourne in March last year that left a young nurse with critical injuries.
The driver has previously claimed that the Tesla Model 3 was operating in Autopilot mode, which allows the car to steer, accelerate and brake automatically within its lane. Even with the mode engaged, the driver is still expected to actively monitor their surroundings and keep their hands on the steering wheel.
Agrawal’s car allegedly struck a nurse on Wattletree Road in Armadale as she was attempting to board a tram. The woman, acute care nurse Nicole Lagos, was thrown into the air and dragged up to 20 metres along the road, suffering life-threatening injuries.
The case is likely to hinge on whether Autopilot was engaged and, if so, how much responsibility a driver bears when operating a vehicle in semi-autonomous mode.
Major collision investigators have said this is the first case they have examined involving a Tesla.
In Australia, drivers are required to remain in control of the vehicle, even when Autopilot is activated.
While this will be the first case of its kind in Australia, there have been a number of similar cases in the US and around the world.
From July 2021 to October 2022 there were 605 reported crashes involving vehicles using advanced driver assistance systems, according to the US Department of Transportation. Of these, 474, or about 78%, involved Teslas.
In 2017 the US National Transportation Safety Board found that a driver’s “over-reliance” on Tesla’s Autopilot played a significant role in a fatal 2016 crash, concluding that the technology had allowed the driver to pay inadequate attention to the road.
In early 2020 there were two fatal crashes involving Tesla’s Autopilot on the same day, while two people died in a 2021 crash in the US in which it was reported that “no one” was driving.
Earlier this year Tesla shareholders launched a proposed class action alleging that the company’s chief executive, Elon Musk, had overstated the effectiveness and safety of the Autopilot feature. The lawsuit claims Tesla made false and misleading statements that concealed how the feature “created a serious risk of accident and injury”.
In February this year the US National Highway Traffic Safety Administration required the recall of more than 350,000 Teslas running the company’s Full Self-Driving software in beta form, finding it could behave unsafely around intersections.