Applying AI to student academic data has helped the University of NSW (UNSW) identify 284 students at risk of failing and intervene early to help them – but as the program expands amidst surging dropouts, one data analyst warns that AI success doesn’t come overnight.
Years before its Academic Success Monitor (ASM) tool entered initial testing – which ran last year and involved 33 academics teaching 25 courses across UNSW’s Sydney facilities – the project team was already working to figure out where it could source large quantities of good data about student performance to benefit staff.
After evaluating a range of potential data sources, the “big fish”, UNSW AI project manager Walter Tejada Estay told Information Age, turned out to be Moodle – the learning management system (LMS) that, at UNSW as at most other universities, is a core tool for supporting student learning and engagement.
Working with Moodle’s developers, and with the direct engagement of UNSW chief data and insights officer Kate Carruthers, the UNSW Learning Analytics Intelligence team did a deep dive into the type and quality of the data the system was collecting about student interactions – which includes, among other things, click-level data that can show how quickly students are processing material and moving through learning modules.
As the team analysed, cleaned, and organised the data related to one course, it became clear that knowledge could be similarly applied across all UNSW courses to provide a broad view of student performance.
“When we did that,” Estay said, “we started to understand that we had something more powerful. We were initially just trying to provide some insights, then realised that we had enough data to actually try to predict something.”
Yet extending the system across all UNSW courses was easier said than done: standardising years’ worth of data across thousands of courses – and combining it with data about student enrolments, transcripts, and other elements – required the creation of a “generic” data model that included around 40 different elements common to the courses and excluded some of the idiosyncrasies present only in certain courses.
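The article doesn’t describe UNSW’s actual schema, but the idea of a “generic” data model can be sketched in a few lines: map each course’s raw export onto a fixed set of shared fields and discard anything course-specific. The field names below are illustrative assumptions, not UNSW’s real ~40 elements.

```python
# Illustrative sketch only: these field names are assumptions, standing in
# for the ~40 common elements the article says UNSW's generic model includes.
GENERIC_FIELDS = [
    "student_id", "course_id", "logins_per_week",
    "module_completion_rate", "avg_time_on_page_secs",
    # ... the real model reportedly has around 40 such shared elements
]

def to_generic_record(raw: dict) -> dict:
    """Map one course-specific export onto the shared schema, keeping only
    fields common to all courses and dropping course idiosyncrasies."""
    return {field: raw.get(field) for field in GENERIC_FIELDS}

# Hypothetical raw row from one course's Moodle export
raw_row = {
    "student_id": "z1234567",
    "course_id": "COMP1511",
    "logins_per_week": 4,
    "module_completion_rate": 0.62,
    "avg_time_on_page_secs": 95,
    "forum_badge_count": 3,  # idiosyncratic field: not in the generic model
}

record = to_generic_record(raw_row)
```

The point of such a mapping is that once every course emits the same record shape, years of heterogeneous data can be combined with enrolment and transcript data in a single pipeline.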
“Our assumption was that if we just put this data into a machine learning model, maybe it would be able to understand what was happening,” Estay explained.
“When we started experimenting with machine learning, that’s where our first use case emerged.”
“If we just provided that data at that level of granularity, the ML was able to identify patterns of success or risk course by course – and to categorise whether students are looking like they’re going to pass or fail.”
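The article says only that the model categorises students as likely to pass or fail; the actual system runs on Microsoft’s Azure stack and its internals aren’t public. As a minimal sketch of what binary pass/fail classification over engagement features looks like, here is a tiny hand-rolled logistic regression on made-up data – the features and numbers are entirely hypothetical.

```python
# Minimal sketch of binary pass/fail classification, NOT UNSW's actual model:
# a hand-rolled logistic regression trained on invented engagement features.
import math

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def train(rows, labels, lr=0.5, epochs=2000):
    """Stochastic gradient descent on the logistic loss."""
    w = [0.0] * len(rows[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of logistic loss w.r.t. the logit
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x) -> str:
    p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
    return "pass" if p >= 0.5 else "at risk"

# Hypothetical features: [logins per week / 10, module completion rate]
X = [[0.1, 0.2], [0.2, 0.3], [0.8, 0.9], [0.7, 0.8], [0.1, 0.1], [0.9, 0.7]]
y = [0, 0, 1, 1, 0, 1]  # 1 = passed, 0 = failed
w, b = train(X, y)
```

In practice a system like ASM would use far richer features per course and a production ML service rather than anything this simple, but the output shape is the same: a per-student probability thresholded into a pass/at-risk label that can trigger an intervention.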
Enabling intervention when it still matters
Better understanding the reasons students fail courses – or drop out of entire degrees – has become crucial as recent Department of Education figures show a slump in university completion with direct implications for skills pipelines in a range of key areas.
For example, nearly 70 per cent of IT jobs were recently said to be in shortage, and the government has created bodies like Jobs and Skills Australia to address the situation.
Just 40.9 per cent of students commencing a Bachelor’s degree in 2019 had completed it by 2022 – yet this isn’t just the fault of the pandemic, since the figure from 2016 to 2019 was only marginally higher, at 43 per cent.
ASM, which runs on Microsoft solutions including the Azure OpenAI Service, has proved remarkably accurate in practice.
After its initial testing last year showed the system’s validity, an expanded ASM pilot earlier this year analysed data about 17,000 students enrolled in 80 courses – and identified 284 students who were at risk of failing and needed additional support.
Interviews with academics confirmed the model’s reliability, with feedback indicating that most felt ASM’s predictions were accurate – and three quarters indicating that it had identified potential risks much earlier than was previously possible.
Even more importantly, 49 per cent of students who received “proactive nudges” warning of their impending problems ultimately lifted their game, showing statistically significant increases in class engagement.
Those results have been strong enough that UNSW will start rolling out ASM to all first year students and teaching staff at the beginning of 2025 – ultimately reaching all of UNSW’s more than 80,000 students and 7,000 staff.
Estay’s team is exploring additional use cases, such as a model customised to evaluate students in online courses, and is working on ways to incorporate attendance records based on requests from academics.
“We are incorporating new indicators progressively and iteratively,” Estay said, “and this is going to be a continuous improvement thing to increase our precision and accuracy.”
“And while we have binary categorisation right now – the student will either pass or fail – we would like to go to a more goal-based approach where we can provide targets to students who would like to do better.”