The Information Commissioner’s Office (ICO) has issued a statutory reprimand under the UK GDPR to an academy school in Essex, in relation to how it introduced and operated facial recognition technology (FRT) to take cashless canteen payments from students.
There are some key takeaways from the ICO's action, some of which extend beyond schools and into a wider business and retail context.
FRT systems uniquely identify individuals by capturing an image of their face in real time and matching it against a pre-existing database of images. This constitutes “processing” of the individual’s personal data, but, because of how FRT works, it also qualifies as “biometric processing”, and extra measures must therefore be taken under data protection law to ensure it is done fairly and lawfully.
Before carrying out such biometric processing - and therefore before implementing FRT - a data controller must undertake a Data Protection Impact Assessment (DPIA): a type of risk assessment which must take into account the necessity and proportionality of the processing, and the risks to the rights and freedoms of the data subjects. In the case of the Essex school, a DPIA had only been undertaken after the FRT system had been introduced.
If FRT systems are used in schools, or in the workplace (for instance for access control to certain spaces, services or items), the data controller must have a lawful justification for doing so. It may well be unlikely that there are sufficiently compelling reasons to impose the system without asking the students, or the workers, for their explicit consent. In the Essex school example, the ICO found that consent had merely been inferred from the absence of a parental opt-out. The ICO pointed out that this was insufficient: "the law does not deem ‘opt out’ a valid form of consent and requires explicit permission". Furthermore, the ICO noted that most of the students were of an age and competence to take their own decision on whether to partake in the scheme, regardless of their parents' wishes. The ICO did not refer, as it could have done, to the fact that under the Protection of Freedoms Act 2012 students in schools have their own express rights: even if a parent has said that it is, or is not, acceptable for the school to use biometric processing, the student still has the right to override that parental decision.
When employers look to introduce FRT in the workplace for access control, they should also bear in mind that data protection law recognises that where there is an “imbalance of power” - for instance between a worker and an employer - it can be difficult to rely on the worker’s consent to the processing. At the very least, the employer will need to offer the option not to be exposed to the FRT, which may well mean providing an alternative means of access - for instance through the use of codes or PINs.
It is also important that those purchasing and deploying FRT systems and software exercise due diligence. Suppliers might seek to provide advice and reassurance, but they are not the ones who will be liable for any data protection infringements, or who will face investigation by the ICO.
Although the ICO chose not to impose a fine on the school, perhaps because it has a policy of rarely, if ever, fining public bodies, it is not impossible that a more punitive sanction could be imposed on a private sector organisation that failed to undertake a DPIA before introducing FRT, or failed properly to take into account the legal issues around consent. Anyone proposing to use FRT should consider the legal issues carefully, and take advice where appropriate.