Staff factsheet: AI in schools

Get your head around the basics of generative AI, including ChatGPT and Google Gemini, and what it means for your school. Share this information with your staff and governors or trustees, to get everyone on the same page.

Updated on 9 May 2024
School types: All | School phases: All | Ref: 46559
Contents
  1. What is AI?
  2. How to respond to AI
  3. Ofsted will judge your use of AI
  4. Share this information with your staff and governors
  5. Try out The Key's new AI-powered assistant

What is AI?

AI isn't new

Artificial intelligence (AI) is the use of computer systems to solve problems and make decisions. It’s already a part of everyday life – you’ve probably come across it in the form of personalised suggestions on social media, shopping sites or route-planning apps.

However, the technology is developing rapidly and throwing up many new applications and challenges for schools.

What is generative AI?

Generative AI takes a written prompt and uses a machine learning model to generate new, ‘natural’-seeming content – i.e. content that appears to have been created by a person, not a computer. Tools include:

  • Chatbots such as ChatGPT, Google Gemini and GrammarlyGO, which generate text
  • Text-to-image programs like DALL-E and Midjourney, which create images

How to respond to AI

Review your homework and exams policies

Consider how you will approach homework and whether you need to revise your homework policy to take into account pupils’ access to generative AI tools. The DfE suggests this in its policy paper on generative artificial intelligence in education (under the heading 'The limitations of generative AI tools').

Use and adapt our sample text to add to your own homework policy.

The Joint Council for Qualifications classifies AI misuse – where a pupil submits AI-written work as their own – as malpractice. Check whether your policies that cover exams, assessment, coursework and plagiarism reflect this, and revise them if needed.

Explain your rules on AI use to pupils 

Make sure pupils know that using AI without crediting it is not allowed in exams, coursework or any work that’s internally assessed to count towards a qualification. Remind pupils of this when they have exams and coursework coming up.

If you’re extending this policy to homework and other independent study, communicate this clearly to all pupils.

Have an open dialogue with pupils about how and when AI tools can be used to support learning, and when they shouldn’t be. Support pupils to find age-suitable tools and resources and use them appropriately, without relying on them too much (DfE policy paper, page 5). For example:

  • Use a PSHE or computing lesson to teach pupils how and when to use an appropriate tool
  • Discuss the issue if a pupil brings it up in class or submits AI-generated work

Never enter sensitive information into an AI tool 

Continue to follow your data protection principles and rules and be aware that any text entered into an AI tool is potentially being made public. If you’re using AI for any reason, don’t enter any personal or sensitive data.

You may also be targeted by fraudulent emails, such as phishing attacks, that are AI-generated and highly convincing. Use our data protection training resources to keep your staff vigilant to cyber threats.

AI could save you time …

You can use AI to cut down on some of your workload. For example, it could help you:

  • Create a comment bank to use when report writing
  • Come up with ideas for charity fundraising activities
  • Write quiz questions to check pupils’ knowledge

… but it’s not always reliable …

AI tools are only as accurate as the information they’re trained on. They may generate responses that are incorrect, biased, or inappropriate.

Many tools are trained on a fixed set of information, so they won’t be able to give you accurate answers about anything that has changed since their training data was collected – for example, new statutory policy requirements or current events.

… so it's important to check AI-generated results carefully 

You can use AI tools as a starting point, but you should always check and adapt the results so that they:

  • Take the best interests of staff, pupils and the school into account
  • Are in line with your school/trust policies, procedures and guidelines that cover generative AI

Ofsted will judge your use of AI

Inspectors will consider how schools' use of AI contributes to the criteria set out in the inspection framework, such as the quality of education and safeguarding. Ofsted won't be directly inspecting the quality of AI tools, and you won't be penalised for not using AI.

If you do use AI, Ofsted expects you to:

  • Make sure the AI solutions are safe and secure, and protect users' data
  • Be transparent about your school's/schools' use of AI and make sure you understand the suggestions it makes
  • Use AI only when it's ethically appropriate to do so
  • Closely monitor the AI you use for bias
  • Identify and correct any bias or problems, where appropriate
  • Give staff clear roles in monitoring, evaluating, maintaining and using AI tools
  • Make sure that staff are empowered to correct and overrule suggestions made by AI
  • Respond appropriately to any complaints about errors made by AI

Ofsted itself will also be using AI – for example, to help decide whether a school that was judged 'good' at its last inspection should receive a graded or ungraded inspection.

See Ofsted's policy paper on AI for details.

Share this information with your staff and governors

You can download this article as a document that you can adapt to reflect your school’s context and approach to AI.

Share it with your staff and governors or trustees, to make sure everyone’s confident on the basics of AI.

KeyDoc: AI in school – staff factsheet (DOC, 200.5 KB)

Try out The Key's new AI-powered assistant

Visit our AI Playground (beta). It's designed specifically for school leaders, and it's a safe space for you to experiment with generative AI.

Article Updates

9 May 2024

We updated this article to include Ofsted's approach to AI, set out in its policy paper.
