A global law firm has warned Australia that it is still too early to have fully automated vehicles on our roads, as the current regulatory environment is not yet equipped to manage the legal complexities.

In a new report, law firm Norton Rose Fulbright outlines the current regulatory environment around automated driving systems and makes recommendations on how to improve the process.

“It is probably too early to have highly or fully automated cars on the road in Australia, because it is still unclear where legal responsibility lies for many different facets of operating a vehicle,” the report states.

“Australia’s legal framework will need to further develop to address concerns as the world moves closer to the reality of cars commuting people to places while utilising no or limited human involvement.”

The current systems

Australia’s Transport Infrastructure Council has stated it aims to have safe, purpose-built, end-to-end autonomous vehicle regulation in place by 2020.

Having put into place a set of Enforcement Guidelines late last year, the Transport Infrastructure Council is beginning to address the main legal question around automated vehicles – who is in control?

“The Australian Road Rules and other driving laws are currently based on the principle that a human driver is in control of the vehicle,” the report states.

Australia’s current laws are based on the SAE International Standard J3016, which classifies vehicle automation from Level 0 to Level 5 – though to date, local rules have only contemplated vehicles at Levels 0 to 2.

Most vehicles currently fall between Level 0 and Level 1: Level 0 vehicles involve no driving automation, while Level 1 vehicles use minimal automation, such as cruise control features.

Level 2 vehicles are those in which steering and braking may be automated.

But with driverless technology advancing so rapidly, vehicles are now falling into the Level 3, 4 and 5 categories.

The report outlines the Transport Infrastructure Council’s current Enforcement Guidelines on the matter.

The guidelines mainly focus on how the requirement of “proper control” of a vehicle applies to driverless cars.

The guidelines do not intend to cover Level 4 and Level 5 automation.

“The indicators of proper control in the Enforcement Guidelines depend on the level of automation, ranging from still needing one hand on the wheel for Level 1 automation, to this requirement not applying when driving vehicles with Levels 2 or 3 automation,” states Norton Rose Fulbright.

“In Level 3 automation, the driver must not engage in activities that prevent them from responding to take over demands, are not in line with the intended use of the automated driving function, or are prohibited by law.”

Under such rules, the safety driver behind the wheel of the driverless car that struck and killed a woman in Arizona earlier this year may have been liable, as it was later revealed she had been on her phone at the time of the incident.

The report also explains that an automated driving system is just that – a system, not a human – meaning it cannot be held responsible for its actions or for any non-compliance with laws.

“In principle, a system should only be responsible for those things over which it can have control, e.g., the dynamic driving task within its operational design domain,” it explains.

“Being in control of a vehicle means being responsible for the actions of the vehicle, including for breaches of traffic laws or involvement in a crash.

“A person in the vehicle should not be responsible for contraventions of the law while the system is engaged to undertake a driving task it is designed to perform. To hold the human responsible in this case may restrict the introduction of autonomous vehicles in Australia.”

Making a call

In its closing remarks, Norton Rose Fulbright explains legislation must be modified to cater for driverless vehicles which “effectively eliminate human driver fault.”

With human error potentially being removed from motor vehicle accidents, it also cautions against pegging blame on manufacturers.

“It will not follow that the fewer accidents that occur should result in greater liability for the vehicle manufacturer.

“An automated driving system may be 'state of the art' and not malfunction, but nevertheless be simply incapable of dealing with a situation which its designers and programmers had not anticipated.

“To that extent at least, the allocation of the residual risk of loss from the use of autonomous vehicles could not be undertaken under existing Australian product liability principles.”