A government framework aims to make education more effective, but experts warn of a deepening learning gap.
ChatGPT is set to be officially introduced in all Australian schools this year, following the endorsement of a framework by education ministers to guide AI usage.
From adopting the technology as an educational tool to implementing complete bans and reverting to traditional pen-and-paper exams, the education sector has faced challenges in determining how to handle the chatbot since its launch in late 2022.
Now that the technology is here to stay, here’s what you need to know about schools in the era of AI.
What does the framework include?
Released in December, the framework delineates principles for the use of emerging technologies, incorporating privacy and security standards, as well as considerations of equity and accessibility.
Developed by the national AI schools taskforce, the framework underwent consultation with school sectors, education unions, experts, and First Nations Australians.
Under the stipulations, schools are required to involve students in understanding how generative AI tools function, including acknowledging their “potential limitations and biases,” with teachers designated as “subject-matter experts” in the classroom.
For student work, including assessments, the appropriate and inappropriate uses of generative AI tools must be made clear, and any use must be properly attributed.
Additionally, the framework recognizes AI’s potential to assist students with disabilities, those from diverse backgrounds, and in rural and remote communities, as long as it is accessible and equitable.
Australia’s reaction to ChatGPT
Last year, all states and territories, with the exception of South Australia, implemented temporary restrictions on ChatGPT in public schools due to mounting concerns about privacy and plagiarism. Meanwhile, some private schools incorporated it into their teaching and services.
Subsequently, the education minister, Jason Clare, initiated an inquiry into the use of generative artificial intelligence, examining the opportunities and risks it presents for students and teachers.
Clare emphasized to Guardian Australia that his primary concern is ensuring that schools do not utilize generative AI products that sell student data.
“If we successfully navigate this, generative AI has the potential to individualize education and enhance the effectiveness of learning,” he stated. “We will continuously reassess the framework to stay abreast of advancements.”
Julie Birmingham, a spokesperson for the Department of Education, informed the inquiry that while technology was advancing rapidly, Australia had been at the forefront of its response.
Early research indicates that AI could offer intelligent tutoring systems, improve personalization, provide more precise learning materials, and assist in educating at-risk students, she noted.
“The key question is how do we implement [the taskforce] and support teachers and schools in addressing the challenges,” she added.
Who leads the way?
South Australia stands out nationally by choosing not to impose a ban on the technology upon ChatGPT’s release.
Blair Boyer, the Minister for Education, Training, and Skills in South Australia, emphasized to Guardian Australia that schools would be doing a “remarkable disservice” to young people if they failed to educate them about the appropriate use of AI. He noted, “AI will be a part of our work and lives in the future.”
Since the launch of ChatGPT, South Australia’s Department of Education has created a generative AI chatbot app called EdChat, utilizing the same language model as ChatGPT but incorporating built-in safeguards to ensure student privacy and avoid inappropriate content.
Currently undergoing testing in 16 public schools, EdChat does not store or use students’ input for learning purposes, distinguishing it from ChatGPT.
The trial aims to guide the integration of AI into the state’s curriculum and has contributed to the formulation of Australia’s framework.
Are other states and territories participating?
Queensland has conducted a limited trial in state high schools, testing an AI teaching and learning tool named Cerego with 500 students.
Cerego, an adaptive learning platform employing AI to generate quiz-based learning tailored to individual student needs, is set to be introduced to all state schools later in 2024.
Victoria was among the first states and territories to remove restrictions on accessing ChatGPT, doing so in the second term of the previous year.
A spokesperson for the Department of Education in the state mentioned that any incorporation of AI into the curriculum would be determined by schools while adhering to the “overarching principles” of safe and ethical use.
Tasmania is also in the process of formulating its own policy, procedures, and materials for the 2024 school year, highlighting that the latest version of the Australian curriculum includes methods for integrating AI education.
Western Australia is contemplating AI trials conducted in other states to streamline lesson planning, grading, and assessment procedures, aiming to reduce teacher workloads. The state has emphasized AI’s potential to automate tasks such as excursion planning, meeting preparation, and general correspondence.
The Australian Capital Territory is adopting a cautious approach, acknowledging the persistent risks of algorithmic bias and unauthorized use of student data by tech companies. A spokesperson from the ACT education directorate said: “Our current priority is to establish a robust educational framework to guide teachers in how students use AI tools responsibly.”
Murat Dizdar, the secretary of the New South Wales education department, expressed the state’s commitment to national collaboration and active participation in discussions to prepare learners for a future where generative AI is part of everyday life.
What risks persist?
Leslie Loble, a UTS academic and former deputy secretary in the NSW education department, highlighted that while the initial “shock” of ChatGPT’s emergence has subsided, the issues surrounding generative AI are “by no means settled and resolved”.
“The transition from seven states banning it to where we are now indicates how much schools and systems have progressed in understanding the potential benefits and risks,” she remarked. However, Loble emphasized the need to establish clearer standards and expectations for what AI should deliver and how it should be defined and governed.
She acknowledged that the framework serves as an “excellent foundation,” but key risks endure, with a particular focus on the persistent challenge of equity and the digital divide.
“A significant number of students still lack access to basic computer resources and Wi-Fi,” she noted, pointing to disparities in resourcing between government and independent schools.
A distinct disparity also exists in the adoption of advanced generative AI tools between households and schools that have the resources and those that do not.
The potential positive impact of these technologies on learning is significant, but the existing divide is concerning and is likely to exacerbate the learning gap.
While Australia’s framework stands out globally for linking AI to teaching and learning, Loble emphasized the importance of not adopting a “set and forget” approach. She stressed that AI must be consistently integrated into teacher-led programs, which requires investment in professional learning and support.
Loble pointed to teachers’ substantial and growing workload and warned that, without better information about these tools, AI could end up adding to it as teachers become responsible for deciding how to use them.