As Australia conducts its first on-road driverless car trials on a closed section of Adelaide’s Southern Expressway, leading academics – particularly those interested in how humans might interact with autonomous cars – are watching the tests closely.

Exact details of the trials are being closely guarded, but they will involve Volvo XC90s and “demonstrate lane keeping, adaptive cruise control and adaptive queue assist in a fully automated mode”, according to the South Australian government.

Professor Michael Regan is the chief scientist of human factors at the Australian Road Research Board (ARRB) Group, which is leading the trial under the auspices of the Australian Driverless Vehicle Initiative (ADVI).

The initiative has been “running now for about a year or so and is a collaboration between government, industry and academia”, he said.

“It’s really about trying to prepare Australia for the introduction of self-driving vehicles.”

Regan said that Swedish researchers would drive the Volvos during the field trial, which will also involve non-autonomous pace cars.

“There will be a number of vehicles involved – there will be some vehicles driven totally hands-off and pace cars working with those autonomous vehicles helping them to activate some of those autonomous systems,” Regan said.

“One of the key things will be looking at human factors.”

This essentially boils down to people’s attitudes towards self-driving vehicles, especially vehicles already on the road and those rolled out while the technology remains imperfect.

“The road to full automation is going to be a bit of a challenge because it’s not as if we’ll have completely autonomous vehicles overnight,” Regan said.

“And we can say at the moment vehicle automation technology is not 100 percent reliable in all driving scenarios.

“A human is still needed to take over control of these highly autonomous vehicles if they fail or reach their limits of competence, and that creates some interesting challenges and solutions from what we call a human factors and ergonomic perspective.”

One of the human factors is going to be driver attention and distraction.

Researchers believe that being behind the wheel of a self-driving car will be fairly unstimulating, leading drivers to become distracted or to take up “secondary activities”.

“A lot of drivers will want to do secondary activities anyway, and we can’t blame them. If you make the vehicles autonomous, that’s what we’d like to do,” Regan said.

“But if they are inattentive when they need to take over control it can have adverse effects on takeover time and quality.”

Similarly, becoming distracted could reduce an operator’s situational awareness, also impairing their reaction should something unexpected occur.

Researchers are also interested in the question of over-reliance and trust in self-driving vehicle technology.

“We want to try and have people having a moderate degree of trust in automation and being aware of the capabilities and limitations of these systems,” Regan said.

Those who put too much trust in these systems could see their driving skills quickly degrade, compromising their ability to act should a system failure occur. Conversely, those who don’t trust self-driving systems enough to use them could be sacrificing key safety capabilities.

In addition, researchers want to examine the effect of self-driving cars on the incidence of motion sickness, another potential impairment to a person’s ability to override or take control of a failing vehicle.

Should self-driving cars be obvious?

UNSW Professor Toby Walsh welcomes the advent of self-driving cars but raises one other potential human factor.

In the likely transition period when roads carry a mix of self-driving and human-operated vehicles, should it be obvious which are which?

Walsh thinks so – but he’s concerned that the requirement is being overlooked.

In the past, self-driving cars were easy to spot, often distinguished by the lidar units mounted on their roofs.

But this is changing, with some newer self-driving models barely distinguishable from regular cars.

“I think some manufacturers are deliberately trying to go down that route and I’m not sure we should be allowing that to happen,” Walsh said.

“We should perhaps be distinguishing drivers of different capabilities.

“We do that already with provisional drivers and I would argue that we should be thinking along these lines for autonomous cars.

“They should at least be carrying distinguishing markings, if not distinguishing lights.

“Your response to other vehicles on the road will depend in part on what you know about them.”

Walsh believes that human drivers are likely to act differently if they know a self-driving car is nearby.

He notes the rear-end collisions that Google regularly reports involving its self-driving cars in the US. These accidents are typically blamed on human error, but Walsh argues their incidence might be lower if the self-driving cars were more obviously identifiable.

“Perhaps in the Google accidents its car was following the rules too exactly, and the driver that did rear-end the self-driving car wasn’t aware they were trailing an autonomous car and would stop so precisely,” he posited.

If the car was more obviously self-driving, human drivers around it might “take more diligence” in their own movements, he added.