
ICO takes action on facial recognition in schools

Posted on 2 February 2023

The Information Commissioner's Office (ICO) has recently written to a Scottish council which had introduced facial recognition technology (FRT) to facilitate cashless payments in canteens in nine schools under its control. Media coverage of this issue first emerged in 2021 and, in the face of the concerns raised, North Ayrshire Council (NAC) had in fact already stopped using the technology. Nonetheless, the ICO took the view that it was important to "draw out key learnings" from the issue and make them public.

When used to identify or single out individuals, FRT inevitably involves the processing of their personal data and, in so doing, engages the provisions of the UK GDPR. Furthermore, this is processing of "biometric data", which is subject to the restrictions applying to such special categories of data.

What the ICO found

The ICO found that, although FRT might be capable of being deployed lawfully in schools, NAC had failed to do so. Notably, NAC had not been able to identify a lawful basis for the processing under Article 6 of the UK GDPR. It initially claimed that the processing was necessary for the performance of a task carried out in the public interest, but then altered its position to say it relied on children's consent. Neither of these bases was, in the circumstances, available to NAC. The ICO said that children's consent could in principle have provided the basis, but NAC had not taken adequate steps to gather it, nor to inform children about the processing.

On the issue of consent, the ICO noted that children had not been given a "genuine choice": they were simply informed that the FRT was being introduced and would be used for authentication of "all pupils". Added to the clear power imbalance between the schools and the children, this meant that any purported consent was not valid.

For the same reason, there was no condition available to overcome the Article 9 restrictions on processing the special category biometric data.

As a consequence of the lack of a legal basis for the processing, it was unlawful, unfair and not transparent, and therefore an infringement of (among other provisions) Article 5(1)(a) of the UK GDPR.

Although NAC had undertaken a Data Protection Impact Assessment (DPIA), the ICO found that this was deficient because it did not contain enough detail, for instance on the assessment of risks and possible mitigation measures.

Children's data protection consent in schools

It should be noted that the issue of children's consent, particularly in a schools context, is an area where Scottish (and Northern Ireland) law differs slightly from that of England and Wales. In Scotland, under specific provisions of the Data Protection Act 2018, children aged 12 or over are presumed able to provide their own consent. If parental consent for the processing of such children's data is also sought - as it had been by NAC for children aged between 12 and 14 - it can only be sought where it has been determined that the child lacks the competence to take its own decision, and only on a case-by-case basis.

By contrast, in England and Wales, no such age for deemed consent by children applies (except in the limited circumstances of access to information society services). Rather, the Protection of Freedoms Act 2012 deals specifically with the use of biometric processing, and provides that such processing can only take place with parental consent, but that children may still object, and a child's objection overrides the parental decision. The same Act also requires that, where a child withholds consent to the processing, the child must be offered a reasonable alternative means of accessing the service.

What schools should consider in relation to FRT

Although the legal provisions discussed above apply primarily to state schools and academies, any school considering the introduction of FRT should undertake a DPIA. In the event of a challenge by parents or by children themselves, failure to do so is likely to result in investigation and potential enforcement action by the ICO.

As part of the DPIA process, schools will need to assess carefully whether the use of FRT is justified, or whether it might be held to be a disproportionate means of delivering the service or achieving the desired outcome. The DPIA should also carefully consider what risks the technology might present to children, and whether there are measures which could be introduced to remove or mitigate those risks.

To be robust (and effective), DPIAs should be undertaken before any decision is made to purchase and deploy FRT.
