An Australian lawyer has been penalised for the first time for submitting fake AI-generated cases to court.
The Victoria-based lawyer – known only as 'Mr Dayal' – has been stripped of his ability to practise as a principal lawyer and can no longer operate his own law practice, after submitting documents to the Federal Circuit and Family Court of Australia last year containing AI-generated false citations.
He admitted he did not verify the contents.
It’s the first time an Australian lawyer has been sanctioned for the use of AI.
Generative AI tools are increasingly common in the legal sector, but their propensity to “hallucinate” information is proving to be problematic, with several recent examples of false or inaccurate citations being presented in cases.
Several other lawyers have been referred to state regulators for potential punishment over such cases.
Bogus AI cases
In October last year, a judge referred the Victorian lawyer to the Victorian Legal Services Board after discovering he had used AI to generate a number of citations that were entirely made up.
The lawyer was representing a husband in a dispute between a married couple and provided a list of prior cases that had been requested by the judge.
But when the judge checked the list, they were unable to find the cases in question. The lawyer admitted he had prepared it using AI-based legal software and had not verified the citations before submitting the document.
The lawyer gave the court an “unconditional apology” and said he would “take the lessons learned to heart”, admitting he did not fully understand how the AI tool worked.
But the judge decided to refer the case to the regulator due to the increasing usage of generative AI in the legal sector.
The Victorian Legal Services Board this week confirmed that the lawyer had his practising certificate varied in mid-August as a result of an investigation into his AI usage.
This means he is no longer able to practise as a principal lawyer, is not authorised to handle trust money, cannot operate his own law practice, and can only practise as an employee solicitor.
He will also undertake supervised legal practice for two years, and report to the regulator quarterly over this time.
“The board’s regulatory action in this matter demonstrates our commitment to ensuring legal practitioners who choose to use AI in their legal practice do so in a responsible way that is consistent with their obligations,” a spokesperson for the Victorian Legal Services Board said in a statement.
“We strongly advise legal practitioners to refer to our statement on the use of artificial intelligence in Australian legal practice, and if they intend on using AI in the course of legal practice, consider undertaking continuing professional development to improve their knowledge.”
A growing problem
Just last month, a Western Australian lawyer was referred to the state regulator after submitting fake AI-generated cases to court.
The lawyer tendered documents citing four cases that either did not exist or were referenced inaccurately, and admitted to an "overconfidence in relying on AI tools" and having "failed to adequately verify the generated results".
The judge overseeing the case said that the attraction of AI for lawyers was currently a “dangerous mirage”.
Earlier in August, a Victorian defence lawyer acting for a child accused of murder referenced non-existent case citations and inaccurate quotes from a speech in parliament and later admitted this was due to the use of AI.
And in July a Melbourne law firm was busted using AI to cite fake cases, and was ordered to pay costs in the case.
Legal authorities in New South Wales, Victoria and Western Australia have warned that lawyers "cannot safely enter" confidential or commercially sensitive information into generative AI tools, which should only be used for "lower-risk and easier to verify tasks".