
Estyn to review the use of AI within schools in Wales

04 February 2025
Trish D'Souza and Claire Archibald

Whilst Artificial Intelligence (AI) technology itself has seen unprecedented growth, governments are still attempting to keep pace by striking a delicate balance between promoting innovation and ensuring accountability. The same can be said for the Welsh government, specifically within the education sector.

AI is already available to us in the technology we use on a daily basis, such as text-to-speech and translation tools. The huge rise in access to AI tools, and generative AI in particular, has led to concerns about how these could be used appropriately within the education sector and the impact they have on learners.

To further understand the scope of generative AI use within schools, Estyn, the education and training inspectorate for Wales, has been tasked by the Welsh government with conducting a review.

Cabinet Secretary for Education Lynne Neagle has said: “Artificial Intelligence presents a huge potential for schools; the technology is evolving quickly, and it is vital that schools are supported to navigate change.”

Are schools well-equipped to embrace AI?

The review is primarily motivated by the Welsh government's commitment to ensure that schools and public spaces are well-equipped to embrace new technologies like AI. It aims to prepare schools to address, understand and harness the opportunities the technology provides, whilst also developing strategies to mitigate any potential risks.

Estyn’s role will predominantly focus on the current use of generative AI, exploring the potential benefits to schools whilst also considering the challenges it poses. It will assist schools and further education colleges by developing proactive and practical guidance on understanding AI.

“Safe, ethical and responsible”

As schools and other settings are now able to use AI tools to support learning, it is essential to ensure all such use is safe, ethical and responsible. Estyn will also ensure all practitioners develop the skills and knowledge to use this technology in supporting learners to thrive.

The first phase will include a survey for schools and pupil referral units, asking for their views and experiences, followed by more in-depth engagement with teachers. The findings are expected to be published in summer 2025.

Owen Evans, His Majesty’s Chief Inspector at Estyn, has said: “…Having a clearer understanding of the integration of AI in schools at a national level will enable government to better support and guide the education community in the use of this powerful technology.”

Whilst the focus is on generative AI, it is hoped that the review will be extended to include other types of AI, including predictive models which also pose significant risks.

The use of AI within education

AI is already used in various ways across the public and education sectors, including biometric fingerprint and facial recognition, pupil attainment prediction, monitoring pupil IT activity for safeguarding concerns, and supporting activities such as pupil report writing, marking work and lesson planning.

The Welsh government’s recently published guidance on Generative artificial intelligence in education (10 January 2025) discusses how generative AI can offer further opportunities for the education sector when used correctly.

Existing education technology is also being upgraded to include AI functionality, allowing users to experiment without necessarily considering whether it fits into their organisational strategy and risk tolerance.

AI tools to support learners

The use of AI tools, whether generative or other forms (such as predictive), in learning should be purposeful, as with all tools in the classroom. Their use should ultimately support learners to become ambitious, capable learners; enterprising, creative contributors; ethical, responsible citizens; and healthy, confident individuals.

When used responsibly, safely and purposefully, AI tools provided to schools and further education providers have the potential to reduce workload and to support a variety of tasks, such as:

  • Supporting the development of content for use in learning.
  • Assisting with some routine administrative functions.
  • Helping provide more personalised learning experiences.
  • Supporting lesson planning, marking, feedback and reporting.
  • Providing contexts to support development of critical thinking skills.
  • Supporting school-level curriculum and assessment design.

Challenges to consider 

Although there are many reasons why AI could aid learning and support the sector, there are also risks if AI is not properly understood or is not used in the correct way.

  • Data protection: AI systems often require access to large datasets, which may include personal data. Models can be trained on this personal data during the building phase, meaning that people may be unaware of, and unable to exercise, their rights in relation to this processing. Furthermore, there can be data protection risks when using personal data in the implementation of AI tools, including a lack of transparency, risks in relation to automated processing and the potential for inputs to further train models.
  • Security and safeguarding: AI systems can be vulnerable to hacking and other cyber threats. Malicious actors could manipulate AI systems to behave unpredictably, to generate harmful outputs, to reveal information that was used to create the model itself, or to provide content that is inappropriate for the emotional maturity or health of the user.
  • Bias and discrimination: AI tools may reflect and amplify the biases and stereotypes that exist in the data upon which they are trained. As a result, some generative AI systems may produce content that can be offensive and harmful. Schools may wish to focus on learners’ development of critical thinking and digital literacy skills to support them to engage critically with the outputs of these tools. It is important that learners and practitioners pro-actively question and challenge the assumptions and possible implications of the content generated by AI systems.
  • Ethical concerns: AI poses ethical questions, particularly where its use affects vulnerable groups such as students. Many AI systems offer little transparency, which can make it difficult to understand how educational recommendations or decisions are made, raising questions about accountability. Engaging in open conversations about the possibilities and limitations of AI technologies should go hand-in-hand with discussing the importance of ethical and responsible use.
  • Other regulatory challenges: The rapid development of AI technologies poses challenges for regulators trying to ensure that AI is used safely and ethically. Intellectual property rights are a particular concern when entering pupil work into AI marking or assessment tools, as these rights are at risk if the work is used to train the model. It is almost inevitable that laws and regulations will fail to keep pace with technological advancements, which is something Estyn will need to consider in its review.

The Future of AI

Integrating AI tools into education presents many opportunities. However, their use must prioritise safety, responsibility, ethics, trust and inclusivity. Before integrating AI, schools should perform a thorough risk analysis, similar to the process used for other digital tools, software or services.

This is essential to evaluate the suitability of AI, identify potential risks and ensure a safe and secure learning environment. Schools should continue to prioritise learners’ safety, security and well-being when considering how best to respond and adapt to emerging technologies in education.

Learners and teachers should know when AI is used in creating content or assisting with tasks, ensuring everyone understands its role. This helps build trust and ensures that learners are aware of the strengths and limitations of AI, allowing them to use it responsibly and effectively.

You can find out more about Estyn’s Review of Generative AI here, and we will provide an update on Estyn’s findings following completion of the review in summer 2025.

Expert advice and support

We advise schools, trusts, colleges and universities on the legal and practical concerns of using AI within the education environment; please get in touch to find out more. We also offer a support pack to equip schools with governance resources to implement AI safely.

Key contacts

Trish D'Souza

Legal Director

trish.d'souza@brownejacobson.com

+44 (0)330 045 2193


Claire Archibald

Legal Director

claire.archibald@brownejacobson.com

+44 (0)330 045 1165

