Expert commentary by Professor Ritesh Chugh,
CQUniversity Professor in Information and Communications Technology
In the first week of term, a teacher sets a writing task. One student uses a generative AI tool to produce a polished response. Another uses the same tool to test ideas, get feedback on a draft, and learn how to improve it. The technology is identical. The learning is not.
As students return to classrooms, generative AI tools such as ChatGPT are already shaping how they approach writing, research and problem-solving. For schools, the question is no longer whether students will use AI, but whether they will be taught how to use it well.
The first weeks of term matter. This is when expectations are set, habits form, and misunderstandings harden into practice. If schools do not address AI use early, they risk confusion about academic integrity, uneven learning outcomes, and avoidable conflict between teachers, students and families.
A useful starting point is practical AI literacy for students. This goes beyond warning students not to cheat. Students need to understand how to ask useful questions of AI tools, how to check outputs for errors or bias, and how to take responsibility for the work they submit. Treating AI as a thinking partner, rather than an answer machine, helps students see that effort, judgement, and learning still sit with them.
Teacher confidence is equally important. Many teachers return to school knowing AI is present, but may be unsure how to address it consistently across subjects. At the start of term, teachers need clear, usable guidance and targeted, practice-focused professional learning, not abstract policy statements. This should include shared language about what AI can and cannot be used for, practical classroom examples, and support to model appropriate use confidently in front of students.
Homework is where the boundary is tested most often. When rules are unclear, students fill the gaps with their own interpretations. Some use AI as a tutor to explain concepts or improve drafts. Others use it to generate answers with little understanding. Schools that define AI as a learning assistant and explain where support ends and substitution begins could reduce academic integrity risks and student anxiety.
Parents also need clarity. Many will want to support learning at home but may be unsure what appropriate AI use looks like. Schools can help by sharing simple guidance on when AI can be used for homework, what crosses the line into doing the work for the student, and how students should show their thinking. When expectations are consistent between home and school, it becomes easier to keep AI use focused on learning rather than shortcuts.
Subject-specific approaches also matter. In ICT and digital technologies, AI can help analyse code, debug logic, or compare design alternatives, provided students still produce and explain their own solutions. In literacy and humanities subjects, AI can support planning, language refinement, or critical comparison, but it should not replace reading, argument construction, or evidence selection. Cross-curriculum consistency helps students transfer good practice rather than guess different rules for every class.
What is often missing from public debate is that blanket bans on generative AI can be difficult to enforce and often do not prevent student use. Instead, they tend to push the use underground and widen the gap between students who understand the tools and those who do not. Teaching responsible use early is more effective than relying on detection or punishment later.
Back to school is therefore a moment of opportunity. Schools that use the first weeks of term to build shared understanding of generative AI can support learning, protect academic integrity, and reduce teacher workload over time. Schools that avoid the conversation may find that students set the norms instead, and those norms may not serve learning.
Professor Ritesh Chugh is an ICT academic and researcher at CQUniversity’s School of Engineering and Technology. His work focuses on generative AI in higher education, academic integrity and AI literacy for staff and students. He also researches digital transformation and the responsible use of emerging technologies in professional practice. He regularly leads professional development and produces practical guidance to help educators adopt AI safely and effectively.