Key considerations when building a suitable AI policy for your school
The government believes AI can support every child and young person when used safely, effectively and with the right infrastructure in place. At the same time, more than 56% of teachers are concerned it might prevent children from thinking for themselves. There will always be pros and cons to AI, so to help implement it responsibly and ethically while harnessing its full capabilities in the classroom, schools should consider creating an AI policy that sets out clear expectations for staff and students. In 2024, fewer than 10% of schools had an AI use policy in place. So why is a policy important, what should be included and how can schools get started?
Why do schools need an AI policy?
AI can benefit students, teachers and the wider school community, but its use needs to be regulated and monitored to ensure safe access and optimum results. If clear rules are not applied to AI use, schools face risks from security breaches, safeguarding issues, data leaks, plagiarism and more.
A good AI policy will outline when AI can be used, but also how, tackling the inclusion of AI in the curriculum and delving into use cases for everyone in the school. Various stakeholders will have an active interest in its implementation — for example, 82% of teachers feel that students should be taught how to engage critically with generative AI tools. A robust policy will address concerns and provide an effective path forward, acting as an important resource for leaders, teachers and support staff.
What should be included in a school AI policy?
Policies should assess the potential impact of AI and its various use cases, covering:
- Safeguarding. This should set out the steps to take for effective safeguarding and identify the designated safeguarding lead. To formulate actionable safeguarding practices, schools will first need to complete a risk assessment, then build policies to help mitigate the risks they identify.
- Approved tools. Any approved tools will need to be assessed for GDPR compliance as well as adherence to internal and external regulations. Tools that staff and/or pupils can use should be listed in the policy, along with any disclaimers or caveats.
- Data protection. Schools must be hyper-aware of their obligations to protect personal data, and it’s recommended that personal data isn’t entered into generative AI tools. Staff must also be transparent about how AI is used to process data.
- Security. Any systems used should be assessed for security, considering encryption, access control and authentication. Open, publicly accessible AI systems may pose a higher security risk, while closed systems are more likely to comply with security requirements.
- Acceptable use cases. An effective policy will outline when and how AI can be used, in the classroom as well as outside of school and in back-office functions. Over 59% of 13- to 18-year-olds are already using AI for homework, so schools must consider their stance. The policy may also set out sanctions or further actions to be taken if rules are not respected and followed.
Some inclusions will apply to all schools and are primarily needed to maintain safety and compliance. Others will be more nuanced, such as acceptable AI checker tools and their implementation, and the course of action to be taken if rules are not followed. Anything that goes into the policy should be agreed upon by relevant stakeholders and finalised by senior leadership to ensure a unified approach across the school.
Tips for producing a robust AI policy
- Understand the tools that are out there. To build an appropriate policy, schools first need to understand what tools are available, how they’re used and their risk-reward profile. Take the time to conduct thorough research into AI tools and form a view on their use in your school.
- Update the policy regularly. AI policies may need to be reviewed more regularly than other documentation given the fast-changing nature of the technology, and schools should expect to make some fundamental changes. Include a date for the next review within the policy.
- Work with the right professionals. New policies should be produced and reviewed by appropriate stakeholders such as IT and data protection officers, headteachers and the board of governors. It may be sensible to form a steering committee to review policy updates and take in requested changes from staff. You may need to seek legal input from an appropriate professional to ensure safeguards go far enough and the policy will adequately protect the school, pupils and staff.
- Build in flexibility. There will always be some grey areas and policies can’t cover every eventuality, especially as technology advances. A degree of flexibility will be needed and your policy should emphasise the importance of good judgement, placing some onus on individuals to understand their obligations and remain vigilant.
- Take a balanced approach. AI has good points as well as downsides. Your policy should reflect this, indicating its potential and outlining the drawbacks for everyone’s awareness.
- Produce and circulate a policy as soon as possible. It will likely require future revisions, but when it comes to AI in schools, often the biggest risk is doing nothing. Make it a priority to formulate the policy so everyone in the school has a point of reference.
Training staff on school AI policy
An AI policy is just the starting point; staff must fully comprehend their responsibilities and be able to act on them in their day-to-day jobs. Conduct training to make sure staff know how they can use AI, and what they must be aware of if they do. While tools will be vetted for their compliance during risk assessments, staff must still be mindful of data use — for example, teachers may need to be educated on when and how they input identifying information, and when this needs to be avoided altogether.
Staff must also be acutely aware of their responsibilities around content they generate using AI, as they will be directly responsible for it. The content has to be suitable for its audience, and must not use any copyrighted material. If staff have any doubts, they should be encouraged to err on the side of caution and choose not to use AI tools. They should be prepared to be transparent about when they have used AI, and to justify their reasons for taking this approach.
Nail down responsible AI use in your school
Understand how AI can work for schools and what protections are needed so you can start shaping your bespoke policy. Explore AI-powered solutions for schools, learn about the role of AI in education or contact an expert to find out more.