Best Practice

Biometric technology in schools: Lessons learned from ICO regulation

New research shows that half of schools regularly use artificial intelligence but that many are putting pupil data and biometric information at risk. Data protection expert Claire Archibald describes a dedicated process for the safe adoption of AI-driven technologies
Are you compliant? Last year, the ICO issued a public reprimand to a school for failing to comply with the law when installing a facial recognition system in its canteen (image: Adobe Stock)

While the opportunities that artificial intelligence presents for education are exciting, schools must consider potential issues when adopting new technologies.

In Browne Jacobson’s recent School Leaders Survey – which captured the views of leaders representing 1,650 schools – half of the respondents said they are using AI tools “regularly” or “often”.

However, at the same time, only 9% said they had an agreed AI strategy, while three-quarters felt there was insufficient AI expertise in their organisation.

These numbers are concerning given the potential risks that AI presents to schools, particularly as recent developments from the Information Commissioner’s Office (ICO) put education in the data regulator’s crosshairs.

During the ICO’s annual conference in October, the Information Commissioner, John Edwards, highlighted that the focus for 2025 would be safeguarding children’s personal data and ensuring AI and biometric technologies undergo thorough risk assessments (see ICO, 2025).

The following week, he urged all organisations to enhance their proactive and reactive measures against personal data breaches. Notably, the education sector accounted for 14% of all data breaches reported to the ICO in 2023 – positioning it just behind the health sector.

The spotlight is particularly intense on biometric technologies such as facial recognition systems, which many may not even realise are frontrunner applications of AI.

Facial recognition is commonly used in education for tasks such as streamlining school meal payments. It could also be used to automate attendance registration for students and staff.

The ICO has made it clear that schools must carefully consider the implementation of such technologies to avoid regulatory repercussions.

 

ICO reprimands over facial recognition

Facial recognition technology is growing in popularity within busy schools. The ability to rapidly identify individuals to pay for school catering offers huge efficiencies in queue management.

But it also has the potential to exhibit racial bias, particularly against darker skin tones, and its intrusive nature makes it a contentious choice for schools.

The Department for Education has issued guidance (DfE, 2022) advising that the use of facial recognition should be carefully evaluated to determine its necessity and proportionality. Its warning that it is “often not appropriate in schools and colleges if other options are available” should not go unheeded – and schools should be aware of this when procuring or upgrading biometric technology.

The ICO has intervened in the use of facial recognition in schools on two notable occasions.

In January 2023, it issued a public letter to North Ayrshire Council to raise schools’ awareness of the data protection implications of the procurement of facial recognition technology (ICO, 2023).

Then in July 2024, the ICO issued a public reprimand to a school in Essex for failing to comply with the law when installing a facial recognition system in its canteen. The school didn’t consult its data protection officer or carry out a data protection impact assessment before installing the technology (ICO, 2024).

With this renewed focus on children’s personal data and biometric processing, schools should now be on alert for harsher regulatory action if they fail to safeguard pupil biometric data.

AI and biometric technology suppliers should also ensure they support education clients on data protection issues. Notably, the ICO named the edtech vendor in its reprimand of the Essex school.

 

Maintaining data protection compliance

The lessons from unchecked roll-outs of facial recognition technology should be heeded for any new technology. AI poses risks of bias and discrimination, which may cause harm. To navigate the complexities of data protection law, schools should follow a dedicated process whenever they want to roll out or update technology.

 

1. Project plan

Know what you are doing from the outset and make sure any change happens deliberately and thoughtfully. Don’t carry out any project, or allow vendors to rush you into adoption, without proper due diligence and a risk assessment first.

This rush can happen with biometrics: when fingerprint scanners stop working effectively, the vendor may offer the opportunity to “upgrade” to facial recognition technology.

Busy staff may not pause to consider how having cameras to identify an individual fundamentally changes the way in which they collect and process personal data, leaving them open to new regulatory and ethical risks.

 

2. Include the data protection officer from the beginning

Schools should involve their DPO from the outset of a procurement exercise, or even of a change to an existing process. The DPO can bring a focus on privacy issues, the extent of the data processing, the associated risks, and mitigation strategies.

Involving a DPO early can also help to frame a school or trust’s approach to the market, ensuring potential providers are required to set out how they will comply with their data protection obligations and how they can help with the school’s own compliance needs.

If privacy and processing issues are treated as an afterthought, once contracts have been signed and timelines agreed, the DPO could face pressure not to delay the project.

Instead, allow the DPO to be an enabler of safe innovation. They can engage with vendors and internal stakeholders, such as the IT lead, in completing a risk assessment, and will be able to identify risks and mitigations before problems surface.

For example, issues such as data retention and deletion can easily be overlooked unless they are considered at the outset.

 

3. Carry out a thorough risk assessment

A data protection impact assessment (DPIA) acts like a health and safety risk assessment for processing personal data. Schools wouldn’t dream of taking students on a trip without carrying out such a risk assessment, and the use of personal data should be treated in the same way.

A DPIA involves identifying the potential risks to personal data, evaluating the necessity and proportionality of the processing activities, and implementing appropriate measures to address those risks.

This is a legal requirement under the UK General Data Protection Regulation (UK GDPR) for activities that are likely to result in a high risk to people’s data protection rights and freedoms.

Additionally, when processing biometric data, under the Protection of Freedoms Act 2012 the school must obtain the consent of at least one parent, carer or legal guardian of each child whose data it intends to process. If the other parent, or the pupil in question, objects, the data cannot be used.
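
Because the consent test has several moving parts, a short illustration may help. Below is a minimal sketch, in Python, of the rule as described above: processing is permitted only where at least one parent consents and neither a parent nor the pupil objects. The class and field names are purely illustrative and are not drawn from any real system.

    # Minimal sketch of the Protection of Freedoms Act 2012 consent test for
    # biometric data in schools, as described above. Illustrative only: the
    # class and field names are hypothetical, not from any real system.

    from dataclasses import dataclass

    @dataclass
    class BiometricConsent:
        parents_consenting: int = 0    # parents/carers who have given consent
        parent_objected: bool = False  # a parent/carer has objected
        pupil_objected: bool = False   # the pupil themselves has objected

        def may_process(self) -> bool:
            # Processing is allowed only if at least one parent consents
            # and nobody, parent or pupil, has objected
            return (
                self.parents_consenting >= 1
                and not self.parent_objected
                and not self.pupil_objected
            )

    # Example: one parent consents but the pupil objects, so processing must stop
    consent = BiometricConsent(parents_consenting=1, pupil_objected=True)
    print(consent.may_process())  # False

In practice, this means consent records need to capture objections as well as approvals, and should be rechecked whenever circumstances change.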

 

4. Ensure your DPO is sufficiently supported

Having a DPO is a legal requirement for schools, but schools should also be willing to invest in the position through training and development.

Enrolling your DPO in CPD programmes will ensure they stay up to speed with the constantly changing environment of data and technology.

Schools should also be aware of the dangers of having one person as the single source of truth and knowledge within their organisation. The DPO should work as part of a team that is also equipped with this knowledge so others can step in to support where needed, or even replace the DPO should they leave the organisation.

 

5. Determine appropriate levels of access

Schools may need to consider their IT data tenancy arrangements and how these might be affected by AI integration. Data tenancy refers to the ability to isolate data within a piece of software or technology and to grant access only to authorised users.

Microsoft Copilot – one of the AI tools cited in the School Leaders Survey – features a sliding scale of generative AI sophistication, from a “freemium” model comprising an Edge chatbot plug-in to Microsoft 365 Copilot, which integrates AI fully into the organisational ecosystem for enhanced productivity.

It is this latter example where tenancy rights may need managing in order to balance access controls with process improvements.
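
To make the tenancy idea concrete, here is a minimal sketch, in Python, of tenancy-style access control, under the assumption that records are partitioned by tenant and readable only with an explicit role grant. All names are hypothetical and do not reflect how Copilot or any other product implements tenancy.

    # Illustrative sketch of data tenancy: records sit in per-tenant
    # partitions, and a user needs both the right tenant and an explicit
    # role grant to read them. Hypothetical names throughout.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class User:
        name: str
        tenant: str          # which data partition the user belongs to
        roles: frozenset     # e.g. {"dpo", "it_lead", "teacher"}

    @dataclass(frozen=True)
    class Record:
        tenant: str          # the partition this record lives in
        required_role: str   # role needed to read it
        content: str

    def can_read(user: User, record: Record) -> bool:
        # Both checks must pass: tenant isolation and role-based access
        return user.tenant == record.tenant and record.required_role in user.roles

    records = [
        Record("school_a", "dpo", "DPIA for canteen facial recognition"),
        Record("school_b", "teacher", "Year 7 seating plan"),
    ]
    dpo = User("Sam", "school_a", frozenset({"dpo"}))
    print([r.content for r in records if can_read(dpo, r)])
    # ['DPIA for canteen facial recognition']

The risk with deep AI integration is that an assistant indexing data across partitions can bypass exactly these checks, surfacing records users were never meant to see – hence the need to review tenancy rights before roll-out.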

 

6. Build a purpose-led culture

Creating a cultural awareness about the importance of purpose will help a school to identify what it wants to achieve and how it will get there within any project, whether technology-related or otherwise.

This will also prevent it from being reactive to commercial opportunities offered by vendors. If a school decides what it wants from a new piece of technology and then goes to market for the best solution to fit its needs, the project is more likely to be successful.

 

Final thoughts

While it may seem counter-intuitive, slowing down can effectively help schools to go faster by innovating responsibly, ensuring legal and ethical compliance from the outset, and avoiding hidden issues at a later date, when they can be difficult and costly to resolve.

  • Claire Archibald is a legal director in the education team at UK and Ireland law firm Browne Jacobson, specialising in data protection, information governance and freedom of information matters.

 

Further information & resources