EXCLUSIVE: Plans to use automation and AI in NSW bail hearings have been paused after a prototype hit technical roadblocks and the state’s judicial watchdog decided it would not make judgments more accurate or restore the public’s confidence in them.

The Bail Assistant was intended to prompt the use of the correct bail criteria based on the facts of a case, such as whether the defendant has previous convictions, and eventually to predict the probable outcomes of granting the accused bail.

However, preliminary models failed to incorporate all of the Bail Act’s mandatory tests, and users found the tool just as difficult to navigate as the legislation itself, prompting the NSW Judicial Commission (JudCom) to conclude there is “no evidence of a pressing need for AI solutions.”

The project, first announced in 2021, “will not proceed while the cost of further development is likely to outweigh the benefits”, the sentencing and judiciary oversight body told Information Age.

The agency is exploring alternative ways of supporting judges and magistrates to manage the “complexity of the Bail Act,” a spokesperson said.

“There is no reason to believe that a software application is more likely to improve public confidence in bail determinations than other measures, such as law reform or simplification of the legislation itself.”

Judges ceded discretion to stricter bail rules, then machines came for it

Sydney tabloids’ criticism of judges and magistrates for not remanding alleged offenders has eroded public trust in judicial decision-making and prompted parliament to mandate criteria that make it harder to grant bail for serious offences. Those criteria have also made the laws harder to navigate in 30-minute hearings.

“I have always felt that the Bail Act was probably more complex than it needed to be,” Murali Sagi, the former JudCom deputy executive who designed the tool, said in a February 2022 email included in Freedom of Information (FOI) documents.

Shortly before exiting his dual role as JudCom president and NSW Chief Justice, Thomas Bathurst said the Bail Assistant could both make accessing bail decisions more efficient and assuage common concerns with bail decisions like “judicial bias.”

“Technology could be used to turn a decision-maker’s mind to factors they must or must not take into account”, he told attendees of the 2021 Maurice Byers Annual Lecture.

The following year, “66 per cent of young people who were refused bail at their first court bail appearance were Aboriginal,” according to NSW’s crime statistics agency BOCSAR, highlighting the seriousness of the concern.

Bathurst also acknowledged that some experts were concerned AI models used in the justice system could adopt human biases.

Regenerate response

In the speech that first revealed JudCom’s Bail Assistant was “being developed”, Bathurst described it as a two-stage project.

When first deployed, the Bail Assistant would be a pre-coded decision-making tool that presents options and considerations reflecting the case details judges and magistrates have entered.

“The Bail Assistant will be intended as a tool to support the judicial officer from start to finish to assess bail concerns efficiently, make an informed bail decision, and record the decision accurately.”
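JudCom has not released the prototype’s internals, but in outline that first stage resembles a rules engine: case facts go in, the mandatory considerations come out. The Python sketch below is purely illustrative, with hypothetical inputs and wording that loosely echo the Bail Act’s “show cause” and “unacceptable risk” tests rather than the actual tool.

```python
# Purely illustrative: JudCom has not published the Bail Assistant's code.
# A "pre-coded" decision-support tool of the kind described could, at its
# simplest, map the facts a decision-maker enters to the considerations
# they must address. Field and prompt names here are hypothetical.

from dataclasses import dataclass


@dataclass
class CaseFacts:
    show_cause_offence: bool    # offence triggers the 'show cause' requirement
    prior_convictions: bool
    prior_breach_of_bail: bool


def considerations(facts: CaseFacts) -> list[str]:
    """Return the prompts a judicial officer would need to work through."""
    prompts = []
    if facts.show_cause_offence:
        prompts.append("Accused must show cause why detention is not justified.")
    prompts.append("Assess unacceptable risk: failure to appear, further offences, "
                   "danger to the community, interference with witnesses.")
    if facts.prior_convictions or facts.prior_breach_of_bail:
        prompts.append("Weigh criminal history and any previous breaches of bail.")
    return prompts


for prompt in considerations(CaseFacts(True, True, False)):
    print("-", prompt)
```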

In its second phase, the system would use AI models to predict the likelihood of the accused abiding by their bail conditions, and draw on previous decisions to gauge how consistent a choice to release or detain them would be with past practice.

“Eventually, the Bail Assistant is designed to be a supervised machine learning system, which could use data from past bail decisions to predict probable outcomes and to bring up relevant precedent.”

Bathurst added that the assistant would, ultimately, “not make the decision” because the judge or magistrate would still be able to override its recommendation.
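The commission has not described how that second stage would have been built. As a rough illustration only, a supervised approach could look something like the toy classifier below, where the features, records and model choice are all hypothetical and the output is, at most, advisory.

```python
# Purely illustrative sketch of the 'second phase' idea: a model trained on
# past bail decisions to estimate the likelihood an accused will comply with
# bail conditions. Features, records and model choice are hypothetical and
# do not reflect JudCom's design.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical historical records: [prior_convictions, prior_breaches, age]
X = np.array([[0, 0, 34], [3, 1, 22], [1, 0, 41], [5, 2, 19], [0, 0, 27]])
y = np.array([1, 0, 1, 0, 1])  # 1 = complied with bail conditions

model = LogisticRegression().fit(X, y)

new_case = np.array([[2, 1, 30]])
prob_comply = model.predict_proba(new_case)[0, 1]
print(f"Estimated probability of compliance: {prob_comply:.2f}")
# The judge or magistrate retains the decision and can disregard the estimate.
```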

Not every problem needs an app

Lawyers with a background in the remand system who tested the first model in early 2022 raised several doubts about the tool, according to the FOI documents.

The Bail Assistant failed to exhaust all “possible bail hearing outcomes”, to correctly “follow the procedural order of steps set out in the Bail Act”, or to “reflect jurisdictional preconditions” such as the unique “bail consideration criteria” of “Commonwealth Offences”.

JudCom identified these faults and paused the program indefinitely, potentially avoiding the kind of erroneous outcomes produced by other agencies’ experiments with high-stakes automated decision-making systems, such as the buggy debt recovery programs of the Australian Taxation Office and NSW Revenue, and Services Australia’s notorious ‘RoboDebt’.