How Schools Can Create a Simple AI Policy

A step-by-step guide for headteachers and school leaders to create a practical, proportionate AI policy — covering staff use, pupil use, and academic integrity — in plain English.

Policy · Beginner · 10 min read · Updated 10 April 2026

Why Schools Need an AI Policy Now

AI tools are already in schools — whether officially or not. Pupils are using ChatGPT and similar tools to help with homework and coursework. Teachers are using AI to save time on lesson planning, report writing, and resource creation. Support staff are using it for communications and admin. An AI policy isn't about deciding whether to allow AI: it's about making explicit what's appropriate, what isn't, and why.

Without a policy, schools face several risks: pupils submitting AI-generated work without understanding the academic integrity implications; staff entering pupil data into tools without appropriate data protection safeguards; and inconsistent practice across departments creating confusion and unfairness. A well-constructed policy addresses these risks and gives the whole school community clarity.

Ofsted has indicated that inspectors will look at how schools are responding to AI, and the Department for Education has published guidance on generative AI in education. Governors also have an interest in AI governance as part of their data protection and safeguarding oversight responsibilities. Creating a policy is both practically useful and demonstrates responsible leadership.

What a School AI Policy Should Cover

A school AI policy doesn't need to be lengthy — a clear two- to four-page document is more useful than an exhaustive treatise that nobody reads. The key areas to cover are:

  • Scope: Which staff, pupils, and contexts does the policy apply to? Does it cover personal devices used for school work?
  • Approved tools: Which AI tools are permitted for staff? Which, if any, are permitted for pupils and under what conditions?
  • Data protection: What pupil or staff data can be entered into AI tools? How does this align with your school's GDPR responsibilities and privacy notice?
  • Academic integrity: What constitutes appropriate versus inappropriate use of AI by pupils in assessed work? How does this vary by key stage and assessment type?
  • Staff guidance: What standards apply to AI-assisted content produced by staff — lesson plans, reports, parent communications?
  • Review: When will the policy be reviewed? AI is changing quickly; a six-monthly review cycle is more appropriate than an annual one.

Handling Pupil Use and Academic Integrity

This is the most sensitive area for most schools, and the one that requires the most nuanced thinking. The blanket answer of "pupils must not use AI" is increasingly difficult to enforce and may not even be desirable — AI literacy is itself a skill pupils will need. A more considered approach distinguishes between different contexts.

For formal assessments and coursework that contribute to qualifications, the position is clearer. The Joint Council for Qualifications (JCQ), whose members include AQA, OCR, and Pearson Edexcel, has issued guidance making clear that AI-generated content submitted as a pupil's own work constitutes malpractice. Schools should ensure pupils and parents understand this clearly, and that teachers know how to detect and respond to suspected AI use in assessed work.

For everyday learning activities — research, drafting, brainstorming — the picture is more nuanced. Some teachers are finding that controlled AI use can support learning, particularly for pupils with SEND or those developing English as an additional language. The policy can acknowledge this flexibility while establishing clear principles: AI should support learning, not replace the thinking and effort that develops understanding.

Whatever your school's position, the most important thing is that it is clearly communicated to pupils, parents, and staff — and that pupils receive some education about what AI is, how it works, and why academic honesty matters.

Data Protection Considerations for Schools

Schools handle a large amount of sensitive personal data — pupil records, SEND information, safeguarding notes, family details. UK GDPR applies in full, and the ICO has published specific guidance on AI and data protection that is directly relevant to schools. The key principle is that personal data about pupils or staff should not be entered into AI tools unless appropriate data processing agreements are in place and the privacy implications have been considered.

In practice, this means staff should not type a pupil's name, year group, and behavioural notes into ChatGPT to get advice. They should not ask an AI to help write a report for a named child using details about that child's family circumstances or learning difficulties. The risk isn't hypothetical — consumer AI tools may store these inputs in ways your school cannot control.

Schools that are part of a trust or local authority may have access to enterprise tools with better data protection guarantees. Otherwise, the safe approach is to ensure staff are trained to anonymise or de-identify information before using AI for any task that involves real individuals. This is a practical habit that can be established through a brief training session and reinforced in the policy itself.
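For schools with some technical support, the anonymise-before-use habit can even be partially automated. The sketch below is purely illustrative — the function name, the pupil names, and the placeholder scheme are all made-up examples, not part of any official tool — but it shows the principle: real names are swapped for neutral labels before any text is pasted into an AI tool.

```python
import re

def pseudonymise(text: str, names: list[str]) -> str:
    """Replace each named individual with a neutral placeholder
    (Pupil A, Pupil B, ...) before the text goes anywhere near an AI tool."""
    out = text
    for i, name in enumerate(names):
        placeholder = f"Pupil {chr(ord('A') + i)}"
        out = re.sub(re.escape(name), placeholder, out)
    return out

# Hypothetical example: a behaviour note about (fictional) pupils
note = "Sam Patel (Year 8) was involved in an incident with Jo Green."
print(pseudonymise(note, ["Sam Patel", "Jo Green"]))
# -> Pupil A (Year 8) was involved in an incident with Pupil B.
```

A simple find-and-replace like this only catches names you list explicitly — staff still need to check for indirect identifiers (form group, address, unusual circumstances) before sharing anything, so the training session remains essential.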

Writing the Policy: A Practical Process

The most effective AI policies are written collaboratively, not handed down. A working group that includes senior leaders, a classroom teacher, a SENCO, the designated safeguarding lead, and a governor will produce a more balanced and workable document than one written in isolation by the headteacher. Pupil voice — through the school council or a pupil survey — is also worth incorporating, particularly in secondary schools.

Use our AI Policy Generator tool to produce a draft tailored to your school's phase and context, then bring it to the working group for discussion and refinement. Present the final policy to the full governing body for approval — it's both a data protection matter and a safeguarding-adjacent one, both of which are areas of governor oversight.

Once adopted, communicate the policy clearly to all staff at a briefing or INSET session, to parents via your newsletter or parent portal, and to pupils through tutor time or assemblies. Include it in the staff handbook and the school's online policy library. And set a review date: the AI landscape is changing fast, and a policy written today may need significant updating within six to twelve months.

Frequently Asked Questions