Helping parents and carers understand AI in education
Advice for parents and carers on AI and its use in NSW public schools.
Understanding the role of AI in education
Artificial Intelligence (AI) is becoming part of how students learn and interact with technology. This advice helps parents and carers understand how AI may be used in NSW public schools in line with our ethical standards, how we're keeping students safe, and how you can support your child's learning.
Used safely and ethically, AI can:
- help students explore new ideas and improve writing
- support teachers with planning and administrative tasks
- facilitate creative and critical thinking
- assist parents in supporting their child's learning at home with educational resources.
The department’s approach to safe AI
The department has clear guidelines and processes in place to manage AI safely and ethically in schools. This includes:
- trialling our own secure and safe GenAI tool, NSWEduChat
- only approving AI tools for department use that meet our safety and privacy standards
- restricting access to tools that don’t meet these standards
- providing statewide professional learning for teachers
- supporting schools with resources and student modules on safe AI use.
Understanding risk and safe use
When using AI, it’s important to be aware of potential risks.
- Protecting data privacy: AI tools may unintentionally store or share personal data, so it’s important to remove personal details before using them.
- Alignment with our NSW context: Many AI tools are trained on global data that may not reflect local culture, values or the NSW Education Curriculum.
- Accountability and transparency: AI systems don’t always explain how they work, and users may be held responsible for how AI-generated content is used.
- Bias and misinformation: AI outputs can be inaccurate, biased, outdated or culturally inappropriate, especially if tools aren’t specifically tailored for Australian students.
- Content risks: AI-generated content may occasionally include harmful or inappropriate material, particularly if not closely monitored.
The department encourages key minimum safety practices and ethical checks for any AI use.
It's important to avoid sharing private information when using AI tools. This includes things like names, addresses, health details, financial information, or school records. Always review AI-generated content for accuracy, fairness, and appropriateness before sharing or using it, ensuring it aligns with school expectations and values.
Using AI in an ethical way means making sure it’s safe, fair and respectful. It protects privacy, reduces harm, and helps build trust in how AI is used. When talking with your child, you might explore these guiding questions together:
- Oversight: Am I paying attention to what the AI is generating, or just accepting it without thinking?
- Diversity and bias awareness: Does this use of AI reflect a range of perspectives, or is it reinforcing the same views?
- Explainability: Can I clearly explain how this content was created and what role AI played?
- Knowledge boundaries and expertise: Am I confident in using this AI tool, or should I ask someone with more knowledge to check it? Could I have done this without AI, and if not, am I comfortable taking responsibility for the result?
- Critical evaluation: Is the information accurate, useful and appropriate? Have I checked it carefully before relying on it?
- Respect for others: Am I using AI in a way that’s respectful of people’s privacy, identity and experiences?
- Community alignment: Would this use of AI feel right to my school, my family or the wider community?
FAQs and more about department AI safety standards
Is my child’s data used to train AI models?
No. We have strict policies to protect student information. Data collected by department-approved applications is not used to train AI models. Our privacy measures are designed to ensure data remains secure and confidential.
How does the department make sure AI tools are safe?
We follow rigorous safety and privacy standards to protect staff and students, including compliance with state and federal regulations. We carefully select AI tools to make sure they’re safe and appropriate for schools. Our process involves assessing AI tools against criteria drawn from the Australian Framework for GenAI in Schools and the NSW AI Assessment Framework. This includes:
- Checking for safety: All AI tools must be tested carefully to prevent bias and ensure they provide safe, helpful information.
- Keeping things clear: Students and teachers always know when they're using AI and can easily understand its decisions.
- Emphasising ethical use: We deliberately choose AI tools that are assistive and complementary, keep students emotionally safe and reinforce human-led education.
- Conducting rigorous assessments: All AI tools must meet strict fairness, quality and content-safety standards.
How can I support my child’s safe use of AI?
You don’t need to be an expert in AI to support your child. Here are some ways you can help:
- Talk about it: Ask your child if and how they’re using AI at school. Encourage curiosity, but also remind them that not everything AI says is accurate.
- Encourage balance: Help them think of AI as a tool to support them, not a shortcut or a replacement for their own thinking.
- Know what’s allowed: Remind your child to only use department-approved tools for schoolwork and in line with school assessment guidance. Free AI tools may not be safe or appropriate.
- Ask questions: If you’re unsure about anything, speak with your child’s teacher or school principal.
Does my child have access to NSWEduChat?
Only students in select trial schools currently have access to NSWEduChat, our secure and safe department-built GenAI tool. This controlled environment helps us monitor and evaluate AI applications safely and effectively before considering broader implementation.
What about AI tools my child uses at home?
The department manages the safety and security of AI tools used in public schools, but we understand that many families have questions about the AI tools their children may use at home. While we do not manage AI use on personal devices, there are some important things to be mindful of:
- Data rights: Make sure you understand what personal information these tools collect and how it is used. This is usually set out in the Terms of Use, typically found on the company’s website.
- Age restrictions: Many AI applications have age guidelines. Check these to confirm suitability and whether parental consent is required.
- Content generation: Be aware that some AI tools can produce harmful or explicit content, especially in unsupervised environments.
More information and advice
The department has released guidance to help families understand the risks associated with emerging technologies. This includes information about deepfakes, AI companions, and general AI safety tips.
The eSafety Commissioner has also released a detailed risk analysis of AI chatbots and companions, as well as security and privacy assessments of popular AI chatbots such as Talkie, Character.AI, Replika, Linky and Google Gemini.