AI in Recruitment Screening: Legal Risks for UK Employers
Using AI to screen CVs and shortlist candidates carries real legal risks. Understand your obligations under the Equality Act 2010 and UK GDPR before automating recruitment.
AI recruitment screening tools can reduce the time you spend reviewing applications, but using them without understanding your legal obligations creates significant risk of discrimination claims and ICO enforcement action.
What AI Recruitment Screening Involves
AI screening tools range from simple keyword-matching systems that filter CVs by specified criteria, to sophisticated machine learning tools that score candidates based on patterns in historical hiring data. Common uses include:
- Parsing CVs to extract qualifications and experience
- Scoring candidates against a job description
- Automated shortlisting or ranking of applicants
- Video interview analysis (facial expression, tone of voice)
- Psychometric or cognitive assessment tools
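To make the simplest class of tool above concrete, here is a minimal sketch of a keyword-matching CV screener. The keyword sets and scoring weights are illustrative assumptions, not criteria for any real role:

```python
import re

# Hypothetical criteria for illustration only - in practice these would
# come from the job description, and each is a "provision, criterion or
# practice" for Equality Act 2010 purposes.
REQUIRED_KEYWORDS = {"python", "sql"}
DESIRABLE_KEYWORDS = {"aws", "terraform"}

def score_cv(cv_text: str) -> dict:
    """Score a CV by keyword presence (case-insensitive word match)."""
    words = set(re.findall(r"[a-z0-9+#]+", cv_text.lower()))
    required_hits = REQUIRED_KEYWORDS & words
    desirable_hits = DESIRABLE_KEYWORDS & words
    return {
        "meets_essential": required_hits == REQUIRED_KEYWORDS,
        # Assumed weighting: essential criteria count double
        "score": len(required_hits) * 2 + len(desirable_hits),
    }
```

Even this trivial filter applies a criterion to every applicant, so the Equality Act analysis below applies to it just as it does to machine-learning tools.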
The legal risks vary depending on how much weight the AI output carries in the final decision, and what data the system was trained on.
The Equality Act 2010 Risk
The primary legal risk with AI recruitment screening is indirect discrimination. The Equality Act 2010 makes it unlawful to apply a provision, criterion, or practice that puts people with a protected characteristic at a particular disadvantage compared to others, unless you can justify it as a proportionate means of achieving a legitimate aim.
AI screening tools trained on historical hiring data inherit the biases present in that data. If your past hires were predominantly male, a tool trained on their profiles may disadvantage female applicants. If candidates who attended certain universities were historically successful, the tool may screen out applicants from lower-ranking institutions - which correlates with socioeconomic background and can amount to indirect discrimination.
The Equality Act 2010 protects nine characteristics: age, disability, gender reassignment, marriage and civil partnership, pregnancy and maternity, race, religion or belief, sex, and sexual orientation. Each carries risk with AI screening.
Specific risks by characteristic:
- Age: Tools that filter by years of experience or graduation year can discriminate against older or younger candidates
- Disability: CV gaps or career breaks used as negative signals may disadvantage disabled candidates
- Race: Name-based matching or postcode filtering can produce racially biased outcomes
- Sex: Role-specific language in job descriptions can affect how AI tools score candidates of different genders
You do not need to intend to discriminate for a claim to succeed. If the outcome is discriminatory and you cannot justify it, you are liable.
UK GDPR Requirements
Using AI to process candidate application data triggers obligations under UK GDPR and the Data Protection Act 2018.
Lawful basis: Processing candidate data during recruitment typically relies on UK GDPR Article 6(1)(b) (processing necessary to take steps, at the candidate's request, prior to entering a contract) or Article 6(1)(f) (legitimate interests). You must identify and document your chosen basis.
Transparency: Your recruitment privacy notice must inform candidates that their data will be processed by an automated system, what data is used, how decisions are made, and what the consequences are. This must be provided before you collect their data.
Article 22 - Automated decision-making: If the AI makes or significantly influences a decision that produces a legal or similarly significant effect on candidates, Article 22 applies. Shortlisting is considered a significant effect. You must:
- Not make the decision solely by automated means unless an Article 22 exception applies and appropriate safeguards are in place
- Implement the right for candidates to request human review
- Allow candidates to contest the decision
- Be able to explain the logic behind the AI's output
Data minimisation: Only feed the AI the data necessary for the specific screening purpose. Collecting additional data "in case it's useful" is not compliant.
Retention: Delete candidate data as soon as you no longer need it. The ICO recommends documenting your retention period in your privacy notice.
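A retention policy like the one described above can be sketched as a simple expiry check. The 180-day period here is a placeholder assumption; the actual period is a policy decision you must set and document in your privacy notice:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical documented retention period - set your own and record it
# in your privacy notice.
RETENTION = timedelta(days=180)

def due_for_deletion(records, now=None):
    """Return candidate records whose retention period has expired.

    records: iterable of dicts with a timezone-aware "received" datetime.
    """
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["received"] > RETENTION]
```

A scheduled job running a check like this (and actually deleting or anonymising the expired records) is one way to evidence that retention limits are enforced, not just stated.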
ICO Guidance on AI in Recruitment
The ICO has published guidance on AI and automated decision-making that applies directly to recruitment. Key points:
- You must be able to explain how your AI tool reaches its outputs in terms candidates can understand
- You cannot use AI outputs as a black box without understanding the factors involved
- You must carry out a Data Protection Impact Assessment (DPIA) before deploying AI tools that process candidate data at scale
- You must conduct ongoing monitoring to ensure the tool is not producing discriminatory outcomes
The ICO can issue fines of up to £17.5 million or 4% of global annual turnover, whichever is higher, for serious breaches.
What You Must Do Before Using AI Screening Tools
Before deploying any AI recruitment screening tool, complete these steps:
- Review the vendor's documentation: Ask how the tool was trained, what data it uses, whether it has been audited for bias, and where data is processed (particularly if the vendor is US-based)
- Carry out a DPIA: Required under UK GDPR where processing is likely to result in high risk to individuals. AI-powered recruitment screening almost certainly meets this threshold
- Update your privacy notice: Include specific information about automated processing in recruitment
- Test for bias before going live: Run a sample of historical applications through the tool and check shortlist outcomes across protected characteristics
- Establish a human review process: Document how candidates can request human review and who handles those requests
- Set up ongoing auditing: Monitor shortlist outcomes by protected characteristic on an ongoing basis
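The bias test and ongoing audit steps above can be sketched as a comparison of shortlisting rates across groups sharing a protected characteristic. The 0.8 threshold below is the US "four-fifths" heuristic, used here only as a monitoring signal; it is not a UK legal test, and a disparity that passes it may still need investigating and justifying:

```python
from collections import defaultdict

def shortlist_rates(outcomes):
    """Shortlisting rate per group.

    outcomes: iterable of (group_label, was_shortlisted) pairs, e.g. from
    running historical applications through the tool before go-live.
    """
    totals, shortlisted = defaultdict(int), defaultdict(int)
    for group, ok in outcomes:
        totals[group] += 1
        shortlisted[group] += int(ok)
    return {g: shortlisted[g] / totals[g] for g in totals}

def disparity_flags(rates, threshold=0.8):
    """Flag groups shortlisted at under threshold x the highest group's rate."""
    best = max(rates.values())
    return {g: rate / best < threshold for g, rate in rates.items()}
```

Run this over each protected characteristic separately (sex, age band, ethnicity, and so on) and keep the results: a flagged disparity is evidence you must act on, and the audit trail itself is part of demonstrating compliance.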
Using AI Screening Legally in Practice
AI recruitment tools are not prohibited. You can use them lawfully if you:
- Use them as one input into a human-led process, not as the sole decision-maker
- Choose vendors who can demonstrate bias testing and explainability
- Monitor outcomes and act on evidence of discriminatory patterns
- Inform candidates and honour their right to human review
- Document every stage so you can demonstrate compliance if challenged
Avoid tools that analyse facial expressions or voice tone in video interviews - the ICO and Equality and Human Rights Commission have both raised concerns about these, and the risk of producing discriminatory outcomes is high.
What Candidates Can Do If They Believe They Were Discriminated Against
Candidates who believe an AI tool produced a discriminatory shortlisting outcome can:
- Make a subject access request to obtain the data used about them
- Request human review of any automated decision
- Bring an employment tribunal claim for discrimination (even before employment begins - discrimination in recruitment is unlawful)
- Report concerns to the ICO
Tribunal claims must generally be brought within three months (less one day) of the discriminatory act, though a tribunal can extend this where it considers it just and equitable. Pre-employment discrimination claims are taken seriously by tribunals.
This is guidance, not legal advice. Employment law and data protection law are complex, and the application to AI recruitment tools is an evolving area. If you are uncertain whether your use of AI in recruitment is compliant, take advice from an employment solicitor or data protection specialist before proceeding.
Related answers
AI Bias and Discrimination in Employment: Employer Liability
Algorithmic bias can produce unlawful discrimination under the Equality Act 2010. UK employers are liable for discriminatory AI outcomes, even when using third-party tools.
AI and HR Data: UK GDPR Compliance Guide
Using AI tools that process employee data triggers specific UK GDPR obligations. Understand data controller duties, DPIAs, employee transparency requirements, and international transfer risks.
Equality Act 2010: Employer's Guide
Understanding the Equality Act for employers. Protected characteristics, types of discrimination, reasonable adjustments, and avoiding claims.
Frequently Asked Questions
- Is it legal to use AI to screen CVs in the UK?
- Yes, but only if you comply with the Equality Act 2010 and UK GDPR. You must ensure the AI tool does not produce discriminatory outcomes across protected characteristics, and candidates must be informed their application may be subject to automated processing. You must also be able to provide a human review if requested.
- Do I have to tell candidates I'm using AI to screen their applications?
- Yes. Under UK GDPR, you must include this information in your privacy notice before or at the point of collecting application data. If your AI makes or significantly influences the shortlisting decision, candidates have the right to request human review of that decision under Article 22.
- Who is liable if an AI recruitment tool discriminates against candidates?
- You are, as the employer. Using a third-party AI tool does not transfer your legal liability under the Equality Act 2010. If the tool produces discriminatory shortlists, you are responsible for the outcome. Always carry out due diligence on AI vendors and audit results for bias.