AI Tools

AI for UK HR teams in 2026: BambooHR AI, Workday AI, Personio, and the things to avoid

AI in UK HR is more cautious than other categories — for good legal and ethical reasons. The right deployment is augmentation in specific workflows; the wrong deployment is automated decisions about people.

By James Walker · 3 min read

UK HR teams in 2026 should adopt AI more cautiously than marketing or sales teams, and the reason isn't conservatism. It's that employment law subjects automated decision-making about employees to specific scrutiny. UK GDPR Article 22 restricts decisions based solely on automated processing that have legal or similarly significant effects. The Equality Act 2010 catches indirect discrimination. The ICO's 2024-25 guidance on AI in recruitment contains explicit warnings.

The honest finding: the right AI deployment in HR is augmentation in specific workflows; the wrong deployment is automated decisions about people. AI assists; humans decide; the human decision is documented. That's the defensible posture, and it's where every HR team should start.

What's safe to use AI for

Job description writing: use Claude Pro or ChatGPT Plus to draft, refine, and improve job descriptions. This is plain prose that humans edit before posting. Legal risk is minimal because no automated decisions are being made.

Policy document drafting: employee handbooks, policy updates, internal comms, always reviewed by qualified HR or legal staff before publication.

Candidate communication: drafting rejection letters, interview confirmations, offer letters. Templates with AI personalisation save real time without crossing legal lines.

HRIS-native AI features: the AI summary and assistant features built into BambooHR, Workday, and Personio. These sit inside platforms HR teams already use, and are generally appropriate for in-platform productivity use cases.

Survey analysis: summarising free-text responses from employee surveys. Genuinely useful for "what are people saying?" without making decisions about specific people.

What's risky to use AI for

CV screening / sifting. The legal exposure is real. Even if "the AI is just a filter," demonstrating non-discrimination is hard if the AI was trained on historical hiring data (which embeds existing biases). If you must use AI for CV screening, the ICO's recommended approach is: AI generates a shortlist; a human reviews and makes the actual progress decisions; the human decision and reasoning are documented.
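The "human decides, human documents" pattern described above can be made concrete with a small record-keeping sketch. This is a hypothetical illustration, not an ICO-prescribed format: the field names, file path, and `ScreeningDecision` type are all invented here to show what "the human decision and reasoning are documented" might look like in practice.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class ScreeningDecision:
    """One human-reviewed screening outcome. All field names are illustrative."""
    candidate_ref: str      # internal reference, not the CV itself
    ai_recommendation: str  # what the tool suggested, e.g. "shortlist" / "reject"
    human_decision: str     # the decision that actually counts
    reviewer: str           # who made it
    reasoning: str          # why the human agreed with, or overrode, the AI
    decided_at: str         # ISO-8601 timestamp

def log_decision(d: ScreeningDecision, path: str = "screening_log.jsonl") -> None:
    """Append the decision as one JSON line for later audit."""
    with open(path, "a") as f:
        f.write(json.dumps(asdict(d)) + "\n")

record = ScreeningDecision(
    candidate_ref="CAND-0042",
    ai_recommendation="shortlist",
    human_decision="shortlist",
    reviewer="hr.manager@example.com",
    reasoning="Relevant experience confirmed against the role spec.",
    decided_at=datetime.now(timezone.utc).isoformat(),
)
log_decision(record)
```

An append-only log like this is deliberately boring: if a decision is ever challenged, what matters is that a named human made it, at a known time, for a stated reason.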

Performance review generation. Managers must own performance reviews. AI-generated reviews, even with manager edits, risk creating performative documents that don't reflect actual performance.

Salary or promotion recommendations. Automated decisions about pay or promotion are a high-risk category under employment law. AI-recommended ranges are fine; AI-decided final salaries are not.

Predictive attrition analysis. UK GDPR Article 22 specifically applies. Predicting which employees will leave and using that to make decisions about them is the kind of automated decision-making the law restricts.

UK GDPR Article 22, the Equality Act 2010, and emerging AI guidance together create a layered risk:

  • Article 22: individuals have a right not to be subject to decisions based solely on automated processing that have legal or similarly significant effect
  • Equality Act: indirect discrimination if an AI system disadvantages protected characteristics (age, race, gender, disability)
  • Information Commissioner guidance: explicit warnings about AI in recruitment and HR

The risk pattern: an AI screens 1,000 CVs and recommends a top 50, and all 50 happen to be young men. An employment tribunal sees this. Defending the AI's decision becomes difficult, and "the algorithm did it" is not a recognised defence.

Six practical steps for HR teams using AI

  1. Document your AI use — what tools, what data, what decisions
  2. Keep humans in the loop for any decision affecting employees
  3. Audit for bias in AI outputs, particularly demographic patterns
  4. Train HR team on UK GDPR Article 22 implications
  5. Consult employment law before deploying any AI that touches recruitment, performance, or compensation
  6. Update privacy notices if you're using AI on employee data
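Step 3, auditing for demographic patterns, can be sketched as a simple disparity check on selection rates. The 80% threshold below is the "four-fifths rule", a heuristic borrowed from US practice; UK law sets no fixed numeric test, so treat this purely as a first screen that flags groups for closer review. The data here is invented for illustration.

```python
from collections import Counter

def selection_rates(applicants, shortlisted, group_of):
    """Selection rate per demographic group: shortlisted / applied."""
    applied = Counter(group_of(a) for a in applicants)
    chosen = Counter(group_of(a) for a in shortlisted)
    return {g: chosen.get(g, 0) / n for g, n in applied.items()}

def impact_ratios(rates):
    """Each group's selection rate relative to the most-selected group."""
    best = max(rates.values())
    return {g: (r / best if best else 0.0) for g, r in rates.items()}

# Hypothetical screening run: each record is (gender, was_shortlisted).
applicants = [("m", i < 300) for i in range(600)] + \
             [("f", i < 50) for i in range(400)]
shortlisted = [a for a in applicants if a[1]]

rates = selection_rates(applicants, shortlisted, lambda a: a[0])
ratios = impact_ratios(rates)
flagged = [g for g, r in ratios.items() if r < 0.8]  # four-fifths heuristic
print(rates, ratios, flagged)
```

In this invented sample, men are shortlisted at 50% and women at 12.5%, so the women's impact ratio is 0.25 and the group is flagged. A flag is not proof of discrimination; it is the trigger for the human review and legal consultation described in steps 2 and 5.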

How I'd actually advise picking

For HR teams: Claude Pro for general writing assistance. HRIS native AI features for in-platform use. Avoid dedicated AI hiring or HR tools that promise automated decisions.

For HR leaders: read the ICO's AI guidance and the Employment Rights Bill / proposed AI legislation updates. The regulatory picture is evolving fast through 2026-27.

What I'd swerve: AI tools that promise to "screen 10,000 CVs in minutes" or "predict employee performance." Even where these tools work technically, the legal and ethical exposure is real, and the cost of getting this wrong vastly outweighs the time saved by faster sifting.


This article is general information for UK HR practitioners, not employment-law advice. UK employment law is complex; consult an employment-law specialist for material decisions about AI deployment in HR.

Affiliate disclosure: Morningfold has affiliate partnerships with several HR platforms. See editorial standards.

Filed under: AI Tools · Productivity & Work
James Walker

Editor of Morningfold. Spent over a decade in product and operations roles before turning years of "what tool should we use" questions into a public newsletter. Tests every product for at least a week before recommending. Replies to reader emails personally.
