In February, the Information Commissioner's Office (ICO) ordered two Serco entities and seven associated community leisure trusts (together, Serco) to stop using facial recognition technology (FRT) and fingerprint scanning to track employee attendance.
The law
Employers monitoring their employees must comply with data protection law:
- Personal data ("information relating to an identified or identifiable natural person" e.g. employees' contact or payroll details) - Data must be processed "lawfully, fairly and in a transparent manner" (Article 5(1)(a) UK GDPR) and there must be a lawful basis for processing (Article 6 UK GDPR).
- Biometric data ("personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person" e.g. a scan of an employee's face or fingerprint) - Employers must also identify a special category processing condition (Article 9 UK GDPR).
The contravention
The ICO found that Serco had unlawfully processed the biometric data of more than 2,000 employees at 38 leisure facilities to check their attendance and pay them accordingly.
The ICO rejected Serco's suggestion that the processing of biometric data was "necessary" (whether under Article 6(1)(b) (contractual necessity), Article 6(1)(f) (legitimate interests) or Article 9(2)(b) (necessity for the purposes of carrying out obligations / rights under employment law)). It explained that, while "necessity" does not mean that the processing must be "absolutely essential", it must be more than just "useful" and must be a targeted and proportionate way of achieving the purpose. In this case, the processing could not be considered "necessary" when less intrusive means – such as ID cards or fobs – could be used to achieve the same result.
Although Serco argued that alternative methods were open to abuse, the ICO found that Serco had neither provided evidence of widespread abuse nor explained why other methods – such as disciplinary action against employees found to be abusing the system – had not been considered appropriate.
The ICO also criticised Serco's attempts to carry out the balancing exercise required by Article 6(1)(f) (i.e., weighing Serco's "legitimate interests" against the interests or fundamental rights and freedoms of its employees). It noted that Serco had given insufficient weight to the fact that:
- biometric data is "inherently sensitive" due to its uniqueness to the person to whom it relates. Data breaches of biometric data can lead to an indefinite loss of control (data subjects could change a compromised password, but cannot alter their biometric data) and may allow access to further sensitive data such as bank accounts;
- Serco's employees were not given clear information about how they could object to the processing, or about any alternative methods of monitoring attendance; and
- there is an inherent imbalance of power between Serco and its employees. Even if they had been informed that they could object to the processing, they may not have felt able to do so.
Of particular note is the fact that, although Serco appears to have undertaken a Data Protection Impact Assessment (DPIA), which is mandatory where biometric processing is involved, the ICO found it to be deficient in terms of how it dealt with the Article 9 UK GDPR issue.
The ICO therefore issued enforcement notices against Serco, concluding that, in contravention of Articles 5(1)(a), 6 and 9 UK GDPR, Serco had failed to establish a lawful basis and special category personal data processing condition for the processing of biometric data.
As a result, the ICO instructed Serco to stop all processing of biometric data for monitoring employees' attendance at work, as well as to destroy all biometric data that they are not legally obliged to retain. The ICO imposed a deadline of three months from the enforcement notices being issued; failure to comply with the notices may lead to a fine of up to £17.5 million or 4% of Serco's total annual worldwide turnover, whichever is higher.
Comment
Notably, the ICO's investigation was prompted by one of the ICO's own employees, who noticed the use of FRT at one of Serco's facilities. This indicates that the ICO may proactively investigate any employer using FRT or other biometric processing, and that an investigation does not necessarily require a complaint from an affected employee. This decision therefore serves as a warning to employers, many of whom are adapting to new ways of working and keen to embrace new technologies, that biometric systems "cannot be deployed lightly". Among other obligations, those processing biometric data must adopt a "data protection by design" approach when putting in place biometric recognition systems; they must first adequately complete a DPIA, and may need to consult employees when doing so; and they must assess what impact the system will have on the people whose information it will process (and, in some cases, on wider society).
The ICO has issued further guidance on processing biometric data and has made clear that it will "closely scrutinise organisations and act decisively if we believe biometric data is being used unlawfully… We will intervene and demand accountability, and evidence that [biometric recognition systems] are proportional to the problem organisations are seeking to solve."
At Mishcon de Reya, our specialists in employment law, data protection, privacy and reputation protection regularly advise both employers and employees. We understand the importance of getting these sensitive decisions right: to ensure legal and regulatory compliance; to protect employees and maintain trust; and to safeguard businesses' reputations.