Half of school leaders are using dedicated artificial intelligence tools in educational settings, with one in five (20%) using them on a regular basis, according to a new survey by Browne Jacobson.
Some 41% of respondents to the UK and Ireland law firm’s latest School Leaders Survey reported a positive experience with AI, saying it helped to ease workload, speed up everyday tasks and personalise learning. Only 5% provided negative feedback.
Yet the research also finds that three-quarters (75%) of leaders feel there is insufficient AI expertise in their organisation, while concerns linger around issues such as plagiarism, safeguarding and legal compliance.
Scrutiny of schools using emerging technologies is increasing, with the Department for Education strengthening requirements for the safe use of generative AI in new guidance published last month.
Browne Jacobson’s survey, which was conducted during the first term of this academic year, captures the views of more than 200 leaders – including CEOs, executive headteachers, trustees and governors – representing about 1,650 schools that are collectively responsible for nearly one million pupils across England.
Claire Archibald, Legal Director specialising in data protection within the education team at Browne Jacobson, said: “While a sizeable proportion of leaders have not yet engaged with AI tools, many others are optimistic about its potential and are already using AI for a variety of functions, including assessment, pupil report writing and even recruitment processes.
“Leaders also tell us they are concerned about data privacy and security, bias and fairness, safeguarding, and quality control. This reveals a contradiction – their actions, such as trying out new technologies without fully understanding the implications, don’t always reflect these concerns.
“The approach to adopting technology must be planned with a clear AI governance strategy, which will be a key part of operational success and safe deployment for schools and academies.”
Key findings in the School Leaders Survey include:
- The most common use cases of AI in schools include creating or enhancing resources (used by 44% of respondents); managing and reducing workload (36%); summarising content such as emails and reports (36%); personalised learning (19%); and website and social media content generation (16%).
- Among further areas where AI could be of most benefit to schools and trusts, a third (34%) cited assistive technologies supporting children with special educational needs, 26% suggested improving assessment and feedback, and 13% said governance support and policy management.
- ChatGPT was the most common AI tool cited by respondents, with other software including Gemini, Microsoft Copilot, Claude and TeachMateAI.
- 54% of school leaders believe their organisation is prepared to effectively implement AI – although only 9% have an agreed AI strategy, with 31% saying a strategy is in development.
- Malpractice and plagiarism is the biggest area of concern regarding the use of AI in education (cited by 65% of respondents), followed by adequate training (62%), quality control (58%), data privacy and security (57%), bias and fairness (54%), safeguarding (44%), and intellectual property protection (26%).
- Almost half (47%) of leaders are at least slightly confident that concerns can be managed effectively with appropriate mitigation, although 22% are not confident at all.
Bethany Paliga, Senior Associate specialising in data protection within the education team at Browne Jacobson, added: “Although the adoption of AI isn’t yet seen as a top operational priority, bodies such as the DfE, Ofsted, Information Commissioner’s Office (ICO) and the Joint Council for Qualifications expect schools to be engaging with the risks of AI, which may already be part of the way in which individuals and students work.
“The safe use of AI is among numerous competing priorities for education leaders. That’s why we’re urging schools and academy trusts to consider carefully which AI tools they use and to address the associated compliance risks, so that AI can be deployed safely and effectively.
“Embracing AI in education is not just about staying ahead technologically; it’s about understanding the unique complexities and challenges that come with adopting new technology in a school environment.”
A live session, titled 'AI and safeguarding: Identifying risks and embracing opportunities', will be held during Browne Jacobson’s EdCon 2025 virtual conference, which runs from 3 to 28 March and is free to attend.
Key contact

Kara Shadbolt
Senior PR & Communication Manager
kara.shadbolt@brownejacobson.com
+44 (0)330 045 1111