As universities embrace artificial intelligence, tomorrow’s school leavers will enter a sector rich with personalised study, creative opportunities and fresh pathways for lifelong learning.
When it comes to artificial intelligence in education, South Australia has been quick to act. While many jurisdictions hesitated, the state government moved early to embrace AI rather than resist it. One of the results was EdChat, a bespoke AI tool piloted in secondary schools before being rolled out more widely. According to the University of Adelaide’s Professor Edward Palmer and researcher Dr Daniel Lee, this willingness to adopt gave the state a six-to-eight-month lead over others – a head start that allowed schools and universities to experiment with how AI might reshape teaching, learning, and assessment.
But early adoption has not been without challenges, as Dr Lee’s research has revealed. “We were surprised at how little secondary school students had engaged with EdChat,” Dr Lee said. “Even when they had access, many chose to use generic tools like ChatGPT instead.”

That decision speaks volumes about how students perceive institutional versus public AI platforms. As Prof Palmer, the Director of the Digital Learning and Society Hub in the School of Education at the University of Adelaide, explained, students may prefer to work outside the gaze of teachers, or believe that commercial tools provide better interaction. Some even feared that school-provided platforms were being monitored.
This tension between control and autonomy underpins much of the debate around AI in higher education: who sets the boundaries, who decides what tools are used, and how does this affect authenticity and integrity in learning? Prof Palmer suggested that students may simply trust newer commercial models more than locked-down institutional versions, which are designed to be secure but sometimes less flexible.
“The ability to create might be better in some of the more recent releases of AI engines,” he said. “That could be influencing their choices as well.”

Assessment design is emerging as one of the critical battlegrounds. Dr Lee’s team is currently researching how different forms of assessment fare in an AI-rich environment. Some are highly vulnerable to AI-generated shortcuts; others, more resilient.
“There’s a whole range of ways to assess students, and some lend themselves to AI more than others,” he said.
Prof Palmer sees opportunity rather than crisis: “If AI helps a student pass, that’s fine, as long as it also helps them learn. I don’t mind whether students use AI in my assignments, provided they can demonstrate they’ve gained knowledge from the process.”
That shift in perspective demands more personalised tasks. Rather than abstract problem sets that AI can easily solve, Prof Palmer advocates for assignments rooted in students’ own experiences – asking them to draw on personal observations, conduct small experiments, or document learning in ways that are harder for AI to replicate.
“If the work is simply a matter of solving problems on a sheet of paper, that’s always been open to help from parents, siblings or now AI. The point is to set work where you can determine that the student is the one learning,” he said.
In Prof Palmer’s view, the larger class sizes in higher education make this even more pressing.
“In secondary school, a teacher knows their students’ capabilities. If a piece of work suddenly looks out of character, they’ll notice. At university, where a lecturer may have 300 students in a cohort, it’s much harder to detect.”
One of AI’s greatest promises lies in its potential for personalised learning. Prof Palmer highlighted how AI can accelerate differentiated approaches: students who are ahead can extend their learning independently, while those who struggle can receive targeted support.
“Teachers have always wanted to tailor learning,” he explained, “but it’s never been feasible to devise a curriculum for every individual student. AI makes that far more possible. The goal is not to abandon students to machines, but to monitor and guide them while AI provides structured pathways.”
Dr Lee pointed to experiments combining AI with virtual reality to create personalised digital tutors. Such immersive tools may help humanise the interaction and keep students engaged, while still allowing teachers to oversee progress. He described research showing avatars capable of tutoring students in real time, giving the illusion of a one-to-one mentor.
International models are pushing the boundaries further, including a Texas school where mornings are taught entirely by AI, while afternoons are devoted to passion projects with human teachers. While Prof Palmer doubted such a model would be workable for younger students, he acknowledged that in higher education – where learners are more autonomous – it could be feasible.
“The human in the loop remains essential,” he cautioned, especially for adolescents whose decision-making skills are still developing. “But at the university level, carefully designed AI integration could allow students to follow interests while still meeting curriculum goals.”
Much of the public discussion around AI in education has focused on reducing teacher workload – automating report writing, administration, and routine feedback. Prof Palmer agreed these are valuable benefits, freeing educators to spend more time teaching and mentoring. Yet both he and Dr Lee see the real innovation in pedagogy. They argue AI should not simply replicate existing tasks more efficiently; it should enable forms of teaching and assessment that were previously impossible.
One topic rarely raised in schools, but increasingly discussed in universities, is the environmental footprint of AI.
“The environmental cost of training AI models has increased significantly. The latest figures suggest training ChatGPT-3.5 emitted as much CO2 as seven Mitsubishi Outlanders over their lifetimes,” Dr Lee said.
Prof Palmer added that tech companies are even striking deals to bring nuclear power plants online to meet AI’s energy demands. “That’s a huge ethical issue in itself, but also a teaching opportunity,” he said. Universities are beginning to incorporate those discussions into coursework.
The University of Adelaide is embedding these debates directly into its curriculum. A new AI Common Core course will soon be available for all undergraduates, exploring ethics, responsible use, environmental impacts, and the boundaries of what AI should and should not do.
“Students will look at what you can do with AI, what you shouldn’t be doing with it, and how to genuinely learn with it,” Prof Palmer explained. “That’s central to our responsibility at the higher education level. Equally important is that AI is being used in industry, new jobs are being created specialising in AI and we must teach students how to use it effectively and correctly.”
Despite the sophistication of AI tools, both experts emphasised that educators remain the key safeguard of learning integrity. “Teachers and lecturers are trained to recognise student progress and capability. The challenge is ensuring that AI supports – rather than undermines – that professional judgment,” Prof Palmer said.
That may mean rethinking assessment, balancing efficiency with rigour, and openly discussing with students what responsible use looks like.
As Dr Lee put it, “The assumption that AI is helping is itself worth questioning. It’s changing education, certainly – but whether it helps depends entirely on how we design for it.”
AI in higher education is not a passing trend. From personalised learning and assessment innovation to ethical debates about energy consumption, it is reshaping the sector at every level. For secondary school leaders preparing students for university pathways, these conversations matter.
As Prof Palmer concluded, “The trick isn’t stopping students from using AI – they will. The challenge is ensuring it deepens their learning.”