AI in Sickness Absence Management: Legal Considerations
AI tools that flag absence patterns and predict absence carry disability discrimination and data protection risks. Understand what's proportionate before automating absence management.
AI absence management tools promise to reduce absence costs and ensure consistent application of absence policies. They can deliver on both promises, but only within a legal framework that requires human judgment at every consequential decision point.
What AI Absence Management Tools Do
AI tools in absence management fall into several categories:
Absence tracking and flagging: Systems that monitor absence records against trigger points (Bradford Factor scores, number of absences in a rolling period, total days lost) and flag employees who have met or exceeded thresholds.
Pattern analysis: Tools that identify patterns in absence data - particular days of the week, proximity to weekends or bank holidays, frequency and duration patterns - and highlight these for management attention.
Predictive absence tools: More sophisticated tools that attempt to predict future absence likelihood based on historical patterns, combining individual data with workforce-wide trends and sometimes external data.
Return-to-work support: Tools that generate return-to-work interview prompts, track completion of return-to-work conversations, and maintain records of the outcomes.
Occupational health integration: Systems that track referrals to occupational health, recommendations received, and whether adjustments recommended by occupational health have been implemented.
Absence Data Is Health Data
The foundational legal issue with AI absence management is that absence data is health data for the purposes of UK GDPR. Absence records show when employees were unwell, for how long, and - where reasons are provided - why. This is special category data under UK GDPR Article 9.
Processing special category data requires:
- An Article 6 lawful basis (as with all personal data)
- An additional Article 9 condition (typically Article 9(2)(b) - processing necessary for employment law obligations, or Article 9(2)(h) - processing necessary for occupational medicine)
- Documentation of both the basis and the condition
Before deploying an AI absence management tool, document your legal basis for processing absence data through that tool. This must be included in your employee privacy notice.
The Disability Discrimination Problem
The central legal risk with AI absence management is disability discrimination under the Equality Act 2010.
A disability is a physical or mental impairment that has a substantial and long-term adverse effect on a person's ability to carry out normal day-to-day activities. The Equality Act imposes obligations on employers once they know (or should know) that an employee has a disability:
- A duty to make reasonable adjustments
- A prohibition on discrimination arising from disability
- A prohibition on indirect discrimination
How AI creates disability discrimination risk:
AI absence management systems flag employees based on absence frequency or patterns without any knowledge of whether those absences are disability-related. An employee with a condition that causes unpredictable flare-ups (multiple sclerosis, fibromyalgia, Crohn's disease, mental health conditions) may have an absence pattern that an AI system flags as high risk.
If the system triggers formal absence management procedures without any human review of whether disability is a factor, and the resulting procedure fails to make reasonable adjustments, you are exposed to a claim of discrimination arising from disability under Section 15 of the Equality Act 2010. The Section 15 defence (that the unfavourable treatment was a proportionate means of achieving a legitimate aim) is very difficult to make out where no adjustments were even considered.
The fact that an algorithm triggered the process, not a human decision, is not a defence.
Reasonable adjustments before absence management: Before initiating any formal absence management procedure, you must assess whether the absences are related to a disability and, if so, what reasonable adjustments could reduce future absence. This assessment must be human-led and may require occupational health input.
The Bradford Factor and disability: The Bradford Factor (which weights frequent short absences more heavily than long periods of absence) is widely used in absence trigger systems and has been incorporated into many AI tools. It has the potential to disadvantage employees whose disability causes unpredictable short absences more than employees with long periods of certified illness. If challenged, you must be able to justify its use and demonstrate that disability-related absences were appropriately excluded from or adjusted within the calculation.
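To make the weighting concrete: the Bradford Factor is conventionally calculated as B = S² × D, where S is the number of separate absence spells and D is the total days absent in the rolling period. The sketch below shows the calculation and one possible way to exclude spells a human reviewer has recorded as disability-related; the exclusion mechanism is an illustrative policy choice, not part of the formula itself.

```python
from dataclasses import dataclass

@dataclass
class AbsenceSpell:
    days: int
    disability_related: bool  # set following human / occupational health review


def bradford_factor(spells: list[AbsenceSpell],
                    exclude_disability_related: bool = True) -> int:
    """B = S^2 * D over the rolling period, optionally excluding spells
    that a human reviewer has recorded as disability-related."""
    counted = [s for s in spells
               if not (exclude_disability_related and s.disability_related)]
    s = len(counted)                    # S: number of separate spells
    d = sum(sp.days for sp in counted)  # D: total days absent
    return s * s * d


# Ten one-day absences score far higher than a single ten-day absence:
frequent = [AbsenceSpell(1, False) for _ in range(10)]  # B = 10^2 * 10 = 1000
single = [AbsenceSpell(10, False)]                      # B = 1^2 * 10 = 10
```

The squaring of S is why the formula penalises the unpredictable short absences typical of fluctuating conditions, which is precisely the source of the discrimination risk described above.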
Predictive Absence: A Specific High-Risk Category
Tools that attempt to predict future absence likelihood based on historical data present particular risks:
Reliability: Absence prediction is complex. Tools that claim to predict which employees are likely to be absent are producing probabilistic outputs based on correlations, not certainties. Acting on such predictions - for example, by denying overtime, excluding employees from projects, or initiating informal conversations - creates risk of treating employees differently based on uncertain data.
Pre-emptive discrimination: Acting against an employee based on predicted future absence before any actual absence occurs is difficult to justify legally. You are treating the employee less favourably based on an inference about their future behaviour rather than on actual conduct or performance.
DPIA required: The ICO guidance on automated decision-making identifies predictive profiling of individuals as a high-risk processing activity requiring a DPIA before deployment. If you are using or considering a predictive absence tool, a DPIA is mandatory.
Disability inference risk: A predictive tool trained on absence patterns may effectively predict which employees are likely to have a disability (because past absences correlated with disability are in the training data). This creates a risk of discriminatory treatment based on a protected characteristic.
What Proportionate Use of AI in Absence Management Looks Like
Used proportionately and within a sound legal framework, AI tools can assist with absence management without creating legal risk:
Automated tracking and flagging: Permissible. An AI system that tracks absence and flags employees who have met defined trigger points is a legitimate administrative tool, provided the triggers are disclosed in your absence management policy and the flag triggers a human review rather than automatic disciplinary action.
Pattern visibility: Permissible. Providing managers with visibility of absence patterns (without automated conclusions about what the patterns mean) helps managers have informed conversations. The manager must make the assessment, not the AI.
Return-to-work prompts: Permissible. AI tools that remind managers to hold return-to-work conversations and generate structured prompts for those conversations are useful administrative tools.
Occupational health tracking: Permissible. Tracking whether occupational health referrals have been made and whether recommended adjustments have been implemented is legitimate record-keeping.
Predictive absence management: Not recommended. The legal and practical risks outweigh the benefits for most employers.
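The permissible flag-triggers-human-review pattern described above can be sketched as follows. The trigger values are placeholders for illustration; real values must mirror the thresholds disclosed in your published absence policy, and the output is only ever a prompt for a manager, never a sanction.

```python
def flag_for_review(bradford_score: int, absences_12m: int,
                    bradford_trigger: int = 200, count_trigger: int = 4) -> dict:
    """Flag an employee for *human review* when a disclosed trigger is met.
    Trigger defaults here are hypothetical examples, not recommended values.
    The flag initiates a conversation, not disciplinary action."""
    triggered = (bradford_score >= bradford_trigger
                 or absences_12m >= count_trigger)
    return {
        "review_required": triggered,
        "next_step": ("manager review: assess disability, reasonable "
                      "adjustments, occupational health input"
                      if triggered else "no action"),
    }
```

Keeping the system's output to "review required / no action", rather than any recommended sanction, is what preserves the human decision point the legal framework requires.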
The Human Review Requirement
Every consequential decision in an absence management process must involve genuine human review:
- When a trigger threshold is met, to determine whether disability is a relevant factor
- Before initiating formal absence management procedures
- Before any disciplinary action is taken
- Before any dismissal for capability (absence-related) is considered
The human reviewer must consider: Is this employee disabled? If so, is the absence related to their disability? Have reasonable adjustments been made or considered? Has occupational health been involved where appropriate? Is a formal procedure proportionate in all the circumstances?
These questions cannot be answered by an AI system. They require a manager who has knowledge of the individual employee and who can take responsibility for the decision.
Building a Compliant AI Absence Management Framework
Step 1: Carry out a DPIA for any AI absence management tool before deployment.
Step 2: Update your absence management policy to disclose that AI tools are used to track and flag absence, what data is used, and how it affects the process.
Step 3: Update your employee privacy notice to reflect absence data processing through AI tools.
Step 4: Configure the AI tool's triggers to reflect your absence management policy's trigger points, not just the tool's defaults.
Step 5: Train managers on the human review obligations: every flag must trigger a human conversation that includes assessment of disability and reasonable adjustments before any formal step.
Step 6: Document decisions made following AI-flagged absences, including the human review process and the basis for the decision.
Step 7: Audit the tool's outputs regularly for discriminatory patterns - check whether particular groups are being flagged disproportionately.
This is guidance, not legal advice. Disability discrimination claims in the context of absence management are among the most common employment tribunal claims. If you are using AI tools in absence management or are facing a challenge from an employee about absence management procedures, take advice from an employment solicitor.
Related answers
AI and HR Data: UK GDPR Compliance Guide
Using AI tools that process employee data triggers specific UK GDPR obligations. Understand data controller duties, DPIAs, employee transparency requirements, and international transfer risks.
AI in Performance Management: Employer's Legal Guide
Using AI for performance tracking and automated scoring in the UK triggers UK GDPR Article 22 rights. Understand your obligations before automating performance management.
Equality Act 2010: Employer's Guide
Understanding the Equality Act for employers. Protected characteristics, types of discrimination, reasonable adjustments, and avoiding claims.
Frequently Asked Questions
- Can I use AI to automatically flag employees with high absence rates?
- Yes, but with important caveats. Automated absence flagging is permissible if it is disclosed in your absence management policy, is based on objective criteria, and is used to trigger a human-led review rather than automatic disciplinary action. If an employee's absences are related to a disability, you must make reasonable adjustments before initiating absence management procedures - an automated flag does not discharge this obligation.
- Is using AI to predict which employees are likely to be absent legal in the UK?
- Predictive absence tools that attempt to forecast future absence based on patterns are legally risky. They process health-related data (absence history is considered health data), they can produce discriminatory predictions based on characteristics that correlate with disability, and acting on predicted absence before it occurs risks discrimination and unfair treatment. The ICO would classify such processing as high risk, requiring a DPIA before deployment.
- What if an employee's frequent absences are triggered by a disability?
- If an employee's absence is related to a disability, the Equality Act 2010 requires you to make reasonable adjustments before taking formal absence management action. An AI system that flags the employee for a formal warning based on absence triggers without any assessment of whether disability is involved risks discrimination arising from disability under Section 15 (because the absence pattern is a consequence of the disability), and potentially indirect discrimination, since trigger criteria can put disabled employees at a particular disadvantage. You must carry out a human review that includes an assessment of whether disability is a factor.