Chronically bad drivers could be blocked from using Tesla’s Autosteer feature after US road safety authorities pushed the electric vehicle maker to issue a ‘recall’ that will improve it on all of the more than 2 million cars the company has sold in the US and Canada since 2012.
Announced by Tesla after a two-year investigation by the US National Highway Traffic Safety Administration (NHTSA), the voluntary ‘recall’ – which does not require most vehicles to be taken to Tesla service centres because the fix is delivered automatically over the air – addresses issues not with the technology itself, but with what Tesla has described as “driver misuse of Autosteer”.
Autosteer, a driver assistance feature that complements the traffic-aware cruise control in the Autopilot system built into every Tesla, uses in-car sensors to keep the vehicle in its lane on major highways. (In September, Tesla launched a version for city streets as part of its Full Self-Driving (FSD) package – which was the subject of its own recall earlier this year and is still in early testing in Australia.)
Autosteer requires drivers to keep their hands on the wheel, continually monitor the vehicle’s surroundings and be prepared to take over at a moment’s notice – yet readily available products and YouTube videos show how easy it is to trick the cars’ safety systems.
Such workarounds have been exploited by drivers like 25-year-old Param Sharma – who was jailed after being repeatedly caught sitting in the back seat of his Tesla Model 3 while the car drove itself along busy San Francisco-area highways – and by a couple who went viral after using Autosteer to get intimate at speed.
Others have been found passed out drunk behind the wheel of a moving Tesla, or using Full Self-Driving as a designated driver; in one Norwegian incident, the car recognised that its unconscious driver was not controlling the vehicle and safely stopped itself.
The NHTSA began investigating Autosteer in 2021 after multiple reports of collisions with stationary emergency vehicles – and ultimately concluded that “the prominence and scope of the feature’s controls may not be sufficient to prevent driver misuse” of the feature “in certain circumstances”.
“If a driver misuses the feature such that they fail to maintain continuous and sustained responsibility for vehicle operation and are unprepared to intervene, fail to recognise when the feature is cancelled or not engaged and/or fail to recognise when the feature is operating in situations where its functionality may be limited,” the recall notes, “there may be an increased risk of a collision.”
Second thoughts from first responders
It’s not the first time autonomous cars’ interactions with emergency vehicles have raised regulators’ eyebrows.
California authorities this year approved, then quickly pulled, self-driving taxis from San Francisco roads, suspending the licence of robotaxi operator Cruise after a series of incidents in which its cars interfered with emergency vehicles and dragged an injured pedestrian – details the company was accused of attempting to cover up.
These latest NHTSA concerns – the agency actively investigates safety issues such as distracted driving, flouting of seatbelt rules and cyber security risks, and recently proposed anti drink-driving measures that would mandate built-in breathalysers in all new cars – reflect lingering challenges for self-driving technology that many fear still isn’t ready for mainstream use, concerns echoed earlier this month by an ex-Tesla whistleblower.
Tesla has continuously tweaked its self-driving features: its current holiday update adds new visualisations of parking spaces, and enthusiasts are welcoming an imminent ‘Tap to Park’ feature that helps the car park itself.
Although it disputed the NHTSA’s assessment, Tesla accepted the recall in the interest of moving on and is already rolling out software update 2023.44.30 to 2012-2023 Model S, 2016-2023 Model X, 2017-2023 Model 3 and 2020-2023 Model Y vehicles.
The new version, Tesla said, will “incorporate additional controls and alerts… to further encourage the driver to adhere to their continuous supervisory responsibility whenever Autosteer is engaged.”
Additional measures will include making on-screen alerts about Autosteer more prominent; simplifying how the feature is engaged and disengaged, something already added in a recent update; and adding “additional checks” when Autosteer is engaged, when it is used away from major highways, and when approaching traffic controls.
Tesla vehicles already calculate rolling Safety Scores based on driver behaviour – and drivers the car determines have “repeatedly failed to demonstrate continuous and sustained driving responsibility” will be locked out of the Autosteer feature.