Your guide to using generative artificial intelligence in education

3rd Jul 2025 | Commercial Law | Data Protection | Digital & Technology | Education
The rapid growth of generative Artificial Intelligence (AI) is creating more opportunities to streamline processes and increase productivity. However, using generative AI is not without risk, especially in the education sector.

In this article, Alex Craig, partner in our commercial team, and Ryan Douglas, paralegal in our education team, look at the DfE’s latest guidance on using AI, which was updated in June 2025. You can find the guidance here.

Background

The Department for Education (DfE) has admitted that evidence on the benefits of generative AI and how it is used in schools is still limited. DfE continues to work with those in the sector to gather more evidence to ensure that generative AI can be used safely and effectively in schools.

In particular, content generated by AI can be:

  • inaccurate;
  • inappropriate or unsafe;
  • biased;
  • taken out of context;
  • taken without permission (intellectual property infringement);
  • out of date or unreliable;
  • low quality; and
  • entirely fabricated (known as “hallucinations” in the AI sector).

Therefore, while schools are permitted to make their own decisions regarding using generative AI in classrooms, it is essential that teachers and school leaders continue to comply with their legal obligations surrounding safeguarding, data protection and intellectual property law.

The Online Safety Act 2023 has created new legal requirements for online platforms to protect their users, especially children.

In addition to following the DfE’s guidance and complying with the relevant legislation identified in this article, schools should also check that any AI software providers they work with comply with the requirements of the Online Safety Act.

Safety and effectiveness

DfE is clear that safety needs to be the top priority when considering whether to use generative AI in your school.

There should be clear benefits to using AI that outweigh the risks involved and safety must not be compromised.

Risk assessments are encouraged and should include protocols for where generative AI is used in an unauthorised or unintended way.

If your school is considering allowing pupils to use any AI at school, your school will need to consider:

  • the age of the pupils;
  • the subjects AI can be used in;
  • the ability to supervise the use of AI; and
  • the ability to filter and monitor the features of the AI software.

Schools are also encouraged to review homework policies and other types of unsupervised study to account for the possibility that pupils may have access to AI. Schools may wish to develop their own guidance on when AI can and cannot be used.

Using AI responsibly

Data protection

Schools must handle personal data in accordance with data protection legislation. This includes the Data Protection Act 2018, the UK General Data Protection Regulation, the Data (Use and Access) Act 2025 (which came into force on 19 June this year), and any relevant guidance issued by the Information Commissioner’s Office (ICO) (now known as the Information Commission following the passing of the Data (Use and Access) Act).

The ICO has guidance dedicated to children’s personal data, which you can find here.

DfE’s recommendation is that schools should not use personal data in any AI tools.

If it is necessary to use personal data in AI tools, schools must ensure they remain compliant with data protection law. This includes:

  • ensuring data subjects (pupils, parents or guardians) understand and, where relevant, consent to their personal data being processed using AI;
  • complying with your own internal policies on using AI; and
  • updating privacy policies to ensure that the use of AI is included.

Schools should also keep in mind the potential sensitivity of the data they are processing, especially regarding children.

Intellectual property

Copyright law is distinct from data protection law. Schools must ensure compliance with both when using AI.

Material that is protected by copyright can usually only be used to train generative AI if there is permission from the copyright holder.

The threshold for material to qualify for copyright protection is generally low. Examples could include:

  • written academic work created by pupils; and
  • lessons planned by teachers.

Schools must therefore not allow any copyright material to be used to train AI without permission, unless a statutory exception applies.

Statutory exceptions to copyright law are generally limited; however, some of them could apply to schools, depending on how material is used. These could include fair dealing or educational use.

Schools should continue to exercise caution when using copyright material and seek professional advice if in doubt about their legal obligations regarding intellectual property.

Schools are also at risk of secondary infringement if they repurpose material created by generative AI without permission (for example, by posting AI-generated images on their website).

Assessments

The Joint Council for Qualifications has published separate guidance on AI use in assessments.

Schools are reminded that they must take reasonable steps to prevent malpractice in examinations and assessments involving the use of AI.

For any questions regarding your school’s legal obligations when using AI, please contact Alex Craig on 0191 211 7911 or [email protected]

Frequently Asked Questions
What is Artificial Intelligence (AI)?

Artificial Intelligence (AI) refers to algorithmic systems that perform tasks normally requiring human involvement and human thinking.
