Generative AI (genAI) should be integrated into Australia’s education system as a national priority despite its “high stakes risks”, according to a Parliamentary report advocating the creation of new genAI large language models (LLMs) aligned to the Australian curriculum.

The new report – entitled Study Buddy or Influencer and handed down by the House of Representatives Standing Committee on Employment, Education and Training – advised educators to lean into genAI technology, suggesting that the government use the Australian Curriculum to create new “high-quality genAI education products” drawing on the strengths of the technology identified in pilot programs, such as using genAI as a “study buddy” for students.

Among its 25 recommendations, which aim to help schools balance the risks and promise of genAI, is the incorporation of a “higher-quality filter” that would allow the government to restrict the data used to train new LLMs shaped to reflect the “Australian context”, including content that reflects Indigenous knowledge and gender- and disability-inclusive perspectives.

A major education push would work to improve genAI literacy not only among students but also teachers, support staff, parents, guardians, and policy makers, with a focus on helping the Australian Curriculum, Assessment and Reporting Authority (ACARA) integrate AI literacy across all subjects during the next curriculum review cycle.

“If managed correctly, genAI in the Australian education system will be a valuable study buddy and not an algorithmic influencer,” committee chair Lisa Chesters MP said in releasing the report, whose recommendations she said “forge a strong foundation to regulate the application of genAI in Australia’s education sector.”

A high-level approach to managing genAI’s risks

Reflecting the concerns expressed through a consultation process that drew more than 100 submissions to the inquiry, the report also notes the “unacceptable risks” that genAI technologies pose to the security and privacy of the people using them.

The Department of Industry, Science and Resources (DISR) AI Expert Group – which was established to manage the evaluation of risk and the creation of ‘guardrails’ to regulate AI’s use and functionality – would be engaged to evaluate the risks of the technology in the context of the “vulnerability of children”.

This would, significantly, see bans on the use of genAI to detect emotion – a feature built into increasingly interactive genAI tools like OpenAI’s GPT-4o – which the report places in an “unacceptable risk category in schools” because genAI systems’ “understanding and interpretation of human emotion can be limited and lead to incorrect suggestions from the technology.”

To head off potential issues early on, the report advises regulating developers and providers of education technology (EdTech) solutions through a “system-wide risks-based legal framework,” and recommends that the government increase support for implementation of the Australian Framework for Generative AI in Schools, which was announced late last year.

Ensuring safety requires regular testing and independent quality assurance, mandated transparency about LLM training and algorithm function, and steps to ensure that educational providers don’t select genAI tools that store users’ data offshore or sell it to third parties.

Is genAI already outlearning us?

The report’s recommendations echo the long-standing debate about regulating fast-moving AI technologies, with the Australian government adopting an increasingly interventionist stance and looking to the EU and its AI Act for inspiration as it spruiks its National AI framework.

Publication of the report is welcome but just the start, the Independent Education Union of Australia (IEU) noted as it reiterated its inquiry submission. The union warned that any changes to genAI policy must be implemented in collaboration with teachers and unions – who cannot be expected to act as “AI enforcement police”, or to add AI training to the long list of things overworked teachers already do in their unpaid spare time.

The extensive analysis and recommendations in the new report are “crucial to guiding the safe and ethical use of genAI in classrooms,” IEUA NSW/ACT branch secretary Carol Matthews said while noting the importance of addressing “inequitable access to technology and genAI opportunities for disadvantaged students [as] an urgent priority.”

Translating the report’s goals into workable genAI policy will require a broad and proactive approach to addressing threats that “need to be identified and carefully managed,” Matthews said, while warning that a failure to turn the recommendations into action could exacerbate the already wide gulf between the technology and the national regulatory framework.

“The reality,” she said, “is that genAI is outpacing efforts to provide a comprehensive regulatory response on behalf of teachers, school staff, and students.”