The Information Commissioner’s Office (ICO) has recently issued data protection guidance to organisations procuring, or looking to procure, Artificial Intelligence (AI) tools for the purposes of recruitment.
The guidance follows a number of audits by the ICO of developers and providers of AI-powered sourcing, screening, and selection tools used in recruitment. Although the audits resulted in almost 300 recommendations for those providers and developers, the ICO has distilled the findings into six key recommendations for those procuring and deploying such tools.
The recommendations are closely tied to the core data protection principles and obligations in the UK GDPR.
Conducting a DPIA
Article 35 of the UK GDPR, in conjunction with the ICO’s own rules, requires that those using innovative technologies, such as AI, for processing personal data must conduct a Data Protection Impact Assessment (DPIA) prior to beginning the processing. DPIAs help identify and minimise the data protection risks of a project or processing activity. We have prepared a helpful checklist to guide organisations through the steps of conducting a DPIA, ensuring that you not only comply with the UK GDPR, but also integrate best practices into your data processing activities.
What is your lawful basis?
Article 6(1) of the UK GDPR requires that all processing be underpinned by at least one lawful basis. If the use of AI for recruitment purposes also involves “special categories” of personal data (such as racial or ethnic origin, or health information), then an appropriate condition from Article 9 must also be identified.
Controllers and processors
The ICO points out that where an organisation uses a provider of an AI tool, it is essential to establish whether that provider is a “processor” and, if so, to set explicit and comprehensive written instructions for the provider to follow, documented in an appropriate contract that provides for audit and compliance metrics.
Has the provider mitigated bias?
The ICO audits revealed some areas of concern around fairness and discrimination, with some tools having the ability to filter candidates on the basis of characteristics protected under equalities law. The ICO recommends seeking “clear assurances” from AI providers that tools have been, and can continue to be, monitored for “potential or actual fairness, accuracy or bias issues”. Although the ICO does not specifically recommend this, organisations may well want to put contractual terms in place requiring this.
Transparency
The guidance stresses that recruiting organisations must inform candidates how an AI tool will process their personal data. This is a legal obligation under Articles 13 and 14 of the UK GDPR and is normally discharged by way of a “privacy notice”. Organisations should ensure that such a notice is fit for purpose, updated where necessary to cover any use of AI, and provided to candidates at the time their data is collected. The ICO says that this should include information on why the tool is being used, and the logic involved in making predictions or producing outputs that may affect people. Candidates should also be informed of how they can challenge any automated decisions made by the tool.
Keeping processing to a minimum
One of the key principles of data protection is “data minimisation” - collecting as little data as possible and limiting what is done with it only to what is necessary. The ICO audits “revealed that some AI tools collected far more personal information than necessary and retained it indefinitely to build large databases of potential candidates without their knowledge” - this is contrary to the data minimisation principle and would put recruiters at risk of potential regulatory action and legal claims.
Perhaps unsurprisingly, the new ICO guidance does not go into great detail - this is a complex and rapidly developing area. Limiting guidance to a statement of key principles and basic recommendations is sensible. However, recruiters who need expert support can call on our AI and Data lawyers and our range of services and information sources.