Everything is about to change, and we don’t know how or when.

That was the take-home message from a host of experts in artificial intelligence and automation who spoke at the ACS Canberra Conference, held last Tuesday.

In light of Tesla boss Elon Musk’s recent claims that AI poses a bigger threat than North Korea, conference attendees heard from a range of industry-leading experts about the threats and opportunities of AI and automation.

President of the International Federation for Information Processing, Professor Mike Hinchey, was unfazed by Musk’s predictions, and gave his support for autonomous systems.

“Let’s let them take the mundane and boring jobs, and let them create new jobs in new fields,” he said.

“We will create more jobs in other areas, and the jobs we don’t want to do will go away.”

Predictions

But just how many jobs will go away?

A 2015 Committee for Economic Development of Australia report found that 40% of Australian jobs had a greater than 70% chance of being automated by 2030. Similar studies of other economies have produced comparable results.

But according to ACS President, Anthony Wong, there is no way to truly know what the outcome will be.

“Predictions are just predictions,” he told the crowd in Canberra.

Former CEO of the National Broadband Network, and co-author of Changing Jobs: The Fair Go in the New Machine Age, Professor Michael Quigley, echoed Wong’s statement.

“Published predictions are not too consistent,” he said. “There is no crystal ball to provide reliable predictions about the future of work.”

Automation, AI and technology

While these predictions have generated public concern over mass job losses, Hinchey explained that this is as much about confusion over definitions as it is about fear.

“It is really important we differentiate between what is true artificial intelligence, and what is autonomy, and what is simply an algorithm that has been executed,” he said.

“The AI community is its own worst enemy. For over 50 years they have been claiming things that could never possibly come true.

“It is very worrying to have people like Elon Musk and Stephen Hawking, who are very well respected in their fields, very intelligent people, making wild claims that there’s no basis of fact to believe.”

Hinchey suggested that discussion should centre around the technology being used to create these advances, and to view automation and artificial intelligence as a product of this technological revolution, rather than as a threat.

Preparing for the future

And while there is no way to pinpoint the impact of artificial intelligence and automation, Quigley believes there are ways to ensure we have the social infrastructure to adapt to change.

“We don’t know whether it’s going to be 10% or 40% of jobs in Australia lost to automation and computerisation in the next few decades.

“That uncertainty doesn’t mean we should just sit on our hands and wait to see what happens.

“We must embrace technological change, but make sure it works for people and not against them. That means caring about how people are impacted by technology and ensuring we have adequate social and industrial arrangements in place to make sure people are not left behind in this new machine age.”

While social adaptation will be critical, Hinchey stressed the importance of ensuring the safety of these systems before we put our trust in them.

“There are many circumstances in which, if we allowed the systems to change and make adaptations without considering the safety aspects, we could actually have serious problems and we could actually kill people.

“If we want to have systems that are truly artificially intelligent, they need to be able to adapt, but to do that we have to consider the safety of the system.

“These systems must be built with safety and responsibility in mind and in context.”