Human lives are put in jeopardy by ‘killer computers’ whose integrity goes unquestioned by companies and regulators, a software engineer has warned, explaining how developers’ cultures of silence exacerbate technological disasters such as the UK’s Horizon scandal.

That scandal – which has seen hundreds of witnesses testifying during the ongoing Post Office Horizon IT Inquiry – ballooned into a national disgrace due to decisions that technical managers made in the late 1990s to downplay known bugs in the system, computer scientist Junade Ali writes in his newly released book, How to Protect Yourself from Killer Computers.

In tracing the history of that system – whose fundamental design failures led to the wrongful prosecution of hundreds of subpostmasters for theft, fraud, and false accounting based on Post Office Limited’s false assertions that the system was flawless – Junade Ali notes that major development projects often gather so much institutional, political, and commercial momentum that efforts to apply professional discipline are quickly quashed.

Citing the testimony of former Fujitsu software engineer David McDonnell – who in 2022 revealed that in 1999 he had pushed for Horizon’s faulty Cash Accounts module to be rewritten, only to be rebuffed and reassigned to another project – 27-year-old Ali, who was last year named the youngest-ever Fellow of the Institution of Engineering and Technology (IET), said many software engineers continue to operate in cultures that favour outputs over quality.

The Horizon team’s early adoption of Agile software development methodologies – which at the time were a revolutionary way of building and releasing software in smaller increments – was one example: with practitioners initially unskilled in Agile development, poor oversight and inadequate testing predictably left the Horizon code base a mess.

“The teachings of Agile, like reducing the amount of work in progress and reducing task switching, are hugely powerful in improving software delivery,” Ali observed.

“However, in many environments Agile is then simply used as an excuse to get developers to work faster and harder, leading to the neglect of engineering rigour and heightening the risk of potentially fatal outcomes.”

In a society where technology is trusted to manage programs with potentially life-altering consequences – and in which artificial intelligence (AI) is amplifying this trust as businesses race to deploy it – Ali said it was crucial to recognise when human failures exacerbate technological errors.

With a decade’s experience building high-reliability systems for critical applications in transport and other industries, Ali has seen all too often how institutions double down on the assumption that they are correct – as in the case of a software bug that caused unintended acceleration in Toyota cars, killing 89 people and injuring 57 more.

After years of denials by the world’s largest car maker, an independent review of the code found Toyota’s car software failed to meet industry safety standards and included poorly architected modules that simply did not work.

“As a software engineer and computer scientist, I have played a role in this reprogramming of society,” Ali writes.

“However, many are unaware of the scale of risk [that] computer software attracts: it has played a role in miscarriages of justice, helped enable criminality, and even killed.”

Human risk compounds software risk

Recent years are littered with examples of software bugs with unintended real-world consequences, such as the Australian government’s “crude and cruel” Robodebt debacle, or the 2008 emergency landing of Qantas Flight 72, in which 112 passengers and crew were injured due to a previously unknown defect in the Airbus A330’s flight control computer.

Yet many times, the immediate damage caused by poor code is exacerbated by poor human management of the situation, with time- and budget-pressured managers blaming developers and building organisational buffers to protect themselves and those around them.

This is what happened in the Horizon debacle – where UK Post Office Limited executives openly worried about the risks of being prosecuted for their aggressive pursuit of hundreds of subpostmasters.

Ali singled out Gareth Jenkins, the Horizon system’s chief architect, for particular blame, highlighting his attempts to downplay or deny Horizon’s faults.

Although Jenkins is a registered engineer with the Engineering Council UK and a member of the British Computer Society (BCS), Ali notes, neither body has censured him, despite clear evidence that he and others actively abrogated their responsibility to protect the public.

Recent testimony by former Post Office executive Angela van den Bogerd showed how the organisation engaged in an extensive subterfuge to hide management’s wrongdoing, giving false evidence about the integrity of the Horizon system.

Similar malfeasance has been brewing in the ongoing prosecution of thousands of UK residents for failing to pay TV licence fees – a situation that stems not from software bugs but from yet another snowballing human disaster, as many vulnerable residents cop criminal convictions for failing to pay an esoteric mandatory fee.

“Killer computers emerge from toxic cultures,” Ali writes.

“There is rightfully an expectation that the public have technology that can be depended on…. Businesses and engineers must ensure these needs are met.”

It was also important, he said, to ensure that software engineers operate under robust whistleblower protections.

“Sometimes you as the human need to be the safety check for a computer system that gets things wrong,” he continued.

“Proper systems of oversight and scrutiny are required to ensure that we do not have leaders who are insecure to the extent that this becomes a problem.”

“Ensuring we have healthy organisations where risk can be appropriately identified and managed is vital.”

Dr Junade Ali MSc PhD CEng FIET is a British software engineer and computer scientist, and the author of ‘How to Protect Yourself from Killer Computers’.