Artificial intelligence (AI): dos and don'ts for data protection

Take a look at our tips to make sure you stay compliant with data protection around the use of artificial intelligence (AI) tools. Share them with your team to make sure everyone is on the same page.

Updated on 2 July 2025
School types: All | School phases: All | Ref: 46617
Contents
  1. Do understand how AI uses the data you input
  2. Don't enter personal information into an 'open' AI tool
  3. Do only use trusted 'closed' AI tools to process personal data
  4. Do think about the impact of new AI tools
  5. Do review your policies
  6. Do teach pupils about using AI safely
  7. Do make sure staff training is up to date
  8. Don't prepare specific material on AI data protection for Ofsted
  9. Try out KeyGPT, our AI-powered assistant

Do understand how AI uses the data you input

The DfE's guidance on AI and data protection explains the difference between 2 types of generative AI tools:

Open generative AI tools use the data you input to learn and improve their outputs. This is known as 'training'. Because of this, they might share any information you give them with other users, including personal data.

Closed generative AI tools don't use your data for training, so they're generally more secure, as nobody else can access the data you input.

It's not always clear whether an AI tool is open or closed. If you're unsure, you can:

  • Ask your data protection officer (DPO) or IT lead
  • Read the privacy policy for that product

Don't enter personal information into an 'open' AI tool

For example, don't:

  • Use ChatGPT to write anything that contains pupils' names
  • Enter sensitive