The use of Facial Recognition Technology (FRT) is likely to be one of the key areas where new technology clashes with data protection rights. On the one hand, FRT has a wide range of potential uses and users, from law enforcement agencies and civil enforcement bodies to – according to a recent Financial Times report – even schools providing lunches to pupils. However, unless they want to risk legal challenges and regulatory investigations, schools considering the use of such technology (and of other systems that involve "biometric" processing) will need to take a number of things into account.
The processing of children's biometric data (defined as "personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data") is covered by data protection law. For schools in England and Wales, it is also covered by the Protection of Freedoms Act 2012 (it is worth noting that the schools referred to in the Financial Times article were primarily in Scotland). Combined, those laws require that such processing can only be undertaken in schools (including academies) if "each parent" has been notified in advance and at least one parent consents to the processing.
But notably (and no doubt with a nod to the United Nations Convention on the Rights of the Child), if the child refuses to participate in, or continue to participate in, anything that involves the processing of their biometric information, or otherwise objects to the processing of that information, then their wishes override those of the parent. Furthermore, in those circumstances where children refuse to participate in the processing, reasonable alternative arrangements must be put in place.
The above is just the starting point – under data protection law, any processing must also be lawful, transparent and fair and, under Article 35 of the UK GDPR (when read alongside regulatory guidance), schools must undertake a "Data Protection Impact Assessment". This means that a school would need to consider – in a structured analysis – whether, for instance, the use of such technology is a proportionate means of achieving its aims, or whether the interference with a child's rights is of a level which renders the use of the technology unacceptable.
Such analyses can be complex, but failure to carry them out runs the risk of regulatory enforcement action – including the potential for fines – and legal claims by or on behalf of children. This will particularly be the case where "live" FRT is deployed (the schools in the Financial Times article do not appear to be using this). Live FRT involves real-time monitoring of anyone who comes into a camera's range; it is likely to be more intrusive, and more problematic, than standard FRT, and any school would be well advised to consider even more carefully whether such technology should be adopted in or around its premises. In June 2021 the Information Commissioner issued a formal "Opinion" on the use of live FRT in "public places", which defined the latter extremely broadly as "any physical space outside a domestic setting, whether publicly or privately owned". This would certainly encompass schools (as well as pretty much anywhere else).
It is notable that the ICO's Opinion also emphasised that those using live FRT must "pay close consideration to transparency and the necessity and proportionality of the processing…particularly…when children…make a significant group covered by the system". Given that the UK GDPR stresses that children merit specific protection when it comes to their data, it is highly likely that the ICO would take the same position in relation to all uses of FRT – live or not.