The death of a contractor assembling a robot at a Volkswagen plant has prompted an outpouring of “scapegoating” directed at robots and automation – despite “human error” being officially blamed.
The 22-year-old worker was “grabbed” by the machine and crushed against a metal plate. The robot was designed to grab auto parts on a production line, according to news reports.
However, academics have taken issue with the accident being branded as a robot “killing” the worker – arguing that such framing gives robot technology far too much credit.
“It’s important to understand that with present technology we cannot ‘blame’ the robot,” said Dr Blay Whitby, a lecturer in computer science and artificial intelligence at the University of Sussex.
“They are not yet at a level where their decision-making allows us to treat them as blameworthy.”
Dr Ron Chrisley, director of the Centre for Cognitive Science at the same university, said positioning the accident as a robot killing a worker was “misleading, verging on irresponsible”.
“Despite what one might be encouraged to believe from fiction or recent alarmist worries, [robots] themselves have no real intentions, emotions, purposes, etc,” Chrisley said.
“They can only kill in the sense that a hurricane can kill; they cannot kill in the same sense that some animals can, let alone in the human sense of murder.”
Chrisley noted that robots were likely to attract similar blame as their numbers grew.
“As robots become more prevalent in society, more and more it will seem like they actually have their own autonomy, allowing them to form their own purposes, goals and intentions, for which they can and should be held responsible,” he said.
“Although there may eventually come a day when that appearance is matched by reality, there will be a long period of time, which has already begun, in which this appearance is false.
“Robots are not autonomous in this sense, and are not responsible for what they do.”
Chrisley has an academic interest in this early tendency to classify “human-robotic interactions in terms of ‘what humans are responsible for’ vs ‘what robots are responsible for’ – despite the latter class being empty”.
“This raises the danger of scapegoating the robot, and failing to hold the human designers, deployers and users involved fully responsible,” he said.
He added that even “if there is a ‘problem with the robot’ (be it faulty materials, a misperforming circuit board, bad programming, or poor design of installation/operation protocols) those faults, and/or not anticipating them, are, in some sense, a case of human error”.
The academics agreed that accidents involving robots would only become more common.
“Even if safety standards continue to rise, meaning that the chance of an accident happening in any given human/robotic interaction will go down, we can expect more and more incidents like this to occur in future, simply because there will be more and more cases of human/robotic interaction,” Chrisley said.
“Industrial-strength robots can be very powerful and usually have safety protocols. But of course we have human errors in operation or programming as well as breakdowns, and accidents happen,” said Noel Sharkey, Emeritus Professor of AI and Robotics at the University of Sheffield.
“We could see many more of these as the current robotics revolution progresses.”
Closer to home, robots are being scapegoated for the effect they will have on Australian jobs.
“More than five million jobs, almost 40 percent of Australian jobs that exist today, have a moderate to high likelihood of disappearing in the next 10 to 15 years due to technological advancements,” a controversial Committee for Economic Development of Australia (CEDA) report concluded.
However, as Communications Minister Malcolm Turnbull noted, this is likely to be the case only if Australians stand still and do not adapt to the changes occurring around them.
“While many jobs and in some cases entire industries are at risk of being replaced by computers, technology can be harnessed to create a net increase in employment,” Turnbull said.
“Our challenge is to ensure that enough Australians have the skills and technological imagination to take advantage of new technologies; to approach disruption as an opportunity to invent and create, and not something that we seek to prevent.”