Australian lawyers “cannot safely enter” confidential or commercially sensitive information into public generative AI (genAI) engines like ChatGPT, Gemini or Grok, authorities in three states have warned in issuing explicit guidelines on genAI’s use in law.
The new formal guidelines – introduced by legal regulators in NSW, Victoria, and Western Australia as the Statement on the Use of Artificial Intelligence in Australian Legal Practice – include seven points lawyers must consider before using genAI.
“Lawyers cannot safely enter confidential, sensitive or privileged client information” into public genAI engines “or any other public tools”, the guidelines note, advising lawyers to “carefully review contractual terms” to ensure security if using data in commercial AI tools.
The output of AI engines must also not be used as a substitute for legal advice, the guidelines note, warning that “LLM-based tools cannot reason, understand, or advise. Lawyers are responsible for exercising their own forensic judgement when advising clients.”
The guidelines advise limiting genAI to “lower-risk and easier to verify” tasks – such as drafting polite emails – and prohibiting its use for “higher-risk” tasks that require independent verification, such as language translation, legal analysis, or executive decision-making.
Recognising the inevitability of ‘hallucinations’ in current LLMs, the guidelines warn that lawyers using AI-generated outputs “must be able and qualified to personally verify the information they contain, and must actually ensure that their contents are accurate and not likely to mislead.”
Don't bill clients for AI-related costs
Lawyers must be transparent with clients about their use of AI, and shouldn’t bill clients for related costs – such as the additional time spent verifying or correcting its output.
They should also “continuously and actively” supervise junior and support staff members’ use of the tools.
“While enjoying the benefits of AI, it’s important for lawyers to remember that it’s their duty to provide accurate legal information,” Victorian Legal Services Board CEO and commissioner Fiona McLeay wrote, “and not the duty of the AI program they use.”
“Unlike a professionally trained lawyer, AI can’t exercise superior judgement, or provide ethical and confidential services,” she continued.
“Ultimately, it’s your expertise that clients rely on when they engage your services.”
Putting the genie back in the bottle
The guidelines come amidst growing concern that the use of genAI has become so normalised that even professionals are starting to use it in ways that may inadvertently undermine their own expertise.
A US personal injury lawyer, for example, was last year slapped down by a judge after submitting a brief that, as opposing counsel pointed out, cited cases that didn’t actually exist.
A group of Australian academics was last year forced to apologise after admitting they used Google’s Bard genAI to research information for a submission to a Parliamentary inquiry.
More recently, a Victorian child protection worker violated the privacy of a child in a custody case when their use of ChatGPT led to the inadvertent release of sensitive information about the child.
The problem has become so acute that the NSW Supreme Court recently issued a detailed Practice Note on the use of genAI in cases presented before the court.
This includes a ban on using genAI in preparing affidavits, witness statements, character references “or other material that is intended to reflect the deponent or witness’ evidence and/or opinion…. [such material] should contain and reflect a person’s own knowledge.”
GenAI must also not be used to alter, embellish, strengthen, “or otherwise [rephrase]” a witness’s evidence, the Supreme Court ordered, with lawyers required to add explicit statements to affidavits and other documents confirming that the technology was not used in preparing them.
The new court rules also impose explicit disclosure, transparency, and verification requirements around the preparation of written submissions, summaries of argument, and expert reports – including an obligation to seek explicit court leave to use genAI.
Profession welcomes guidelines
By mandating specific obligations in NSW Supreme Court cases – and imposing explicit expectations on lawyers across three Australian states – the new guidelines have been welcomed as a profession-wide victory at a time when few other industries have imposed comparable obligations.
Government efforts to develop broadly applicable AI safety guidelines led to the September release of the Voluntary AI Safety Standard, but few industries have so far gone as far as imposing guidelines with potentially severe reputational and legal consequences if they are breached.
Backed by the weight of judicial mandate, the rules “provide a clear expression of lawyers’ obligations to the Court,” Law Society of NSW president Brett McGrath said, “including a requirement for legal practitioners to be aware of [genAI’s] risks and shortcomings.”
“The obligations imposed by the Practice Note will help protect litigants, the broader community and the justice system itself, from the limitations of AI tech” – including fictitious citations and “contamination of witness statements with material not properly in evidence.”