Can generative AI really solve hiring bias?

ALEX LAMONT • 08 Jul 2025

We recently published a guide, AI vs IA: what's best for recruitment?, and it's got me thinking about the reality of generative AI and the claims that it could reduce the spectre of bias that hangs over every job description.

We're now firmly in the age where AI is part of whatever software you use day-to-day, including applicant tracking systems (our speciality!). This means it's essential to recognise that such tools aren't a magic wand that waves away bias. In some cases, we've seen AI amplify it!

Using generative AI to write job adverts

Modern ATS systems increasingly offer AI-powered capabilities to draft job adverts, screen resumes, and tailor outreach.

How it works:

  • Integrates seamlessly into ATS dashboards to suggest or auto-generate job descriptions and outreach messages.

  • Tools can create inclusive, persona-targeted adverts, claiming to boost reach and response rates.

  • Efficiency gains are evident: recruiters save time and speed up hiring, with some tools completing tasks in days rather than months.

But there’s a problem: bias amplification.

A study of 1,439 GPT‑4 job ads found they were 29.3% more biased overall than human-written equivalents, showing systemic issues across age, gender, disability and more. AI tends to encode subtle stereotypes into supposedly “neutral” copy, which could deter underrepresented groups.

So, AI can help, but it must be used judiciously.

Best practice is to use AI to draft first versions, then apply human oversight - especially with inclusive language tools. Data-backed platforms like Datapeople combine outcome analytics with AI to improve both clarity and fairness, rather than relying solely on generative AI.

Creating AI-generated visuals for job adverts: is it wise?

Recruitment teams are also exploring AI image generation to produce visuals for job openings, such as team photos or event images. But visual bias is just as risky as textual bias:

  • AI image tools disproportionately generate white, male, professional stereotypes, even when prompts are generic (e.g. "CEO").

  • OpenAI’s Sora, for instance, consistently depicted male CEOs and female support staff, reflecting systemic biases.

  • Research shows that even "neutral" prompts for people in occupations embed amplified stereotypes in AI visuals.

Unless AI image models are trained and audited specifically for inclusive representation, their outputs risk reinforcing narrow stereotypes. It’s essential for recruiters to:

  1. Use diverse prompts, intentionally seeking varied imagery.

  2. Vet all visuals for representation accuracy.

  3. Consider alternative sources for inclusive stock imagery where AI isn’t yet robust.
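One practical way to act on point 1 is to expand a single generic image prompt into several deliberately varied versions before generating anything, then sample across them. Here is a minimal sketch in Python; the helper name and the attribute lists are illustrative assumptions, not part of any particular image tool's API:

```python
import itertools

def diversify_prompts(base_prompt, attributes):
    """Return one prompt per combination of the given attribute values.

    `attributes` maps a dimension (e.g. "age") to a list of phrasings.
    The lists below are tiny examples; a real workflow would cover far
    more dimensions (ethnicity, disability, setting, and so on).
    """
    keys = list(attributes)
    prompts = []
    for combo in itertools.product(*(attributes[k] for k in keys)):
        details = ", ".join(combo)
        prompts.append(f"{base_prompt}, {details}")
    return prompts

variants = diversify_prompts(
    "professional photo of a CEO presenting to their team",
    {
        "age": ["in their 30s", "in their 50s"],
        "gender": ["a woman", "a man"],
    },
)
print(len(variants))  # 4 distinct prompts to sample from
```

Generating from the full set of variants, rather than repeating one "neutral" prompt, makes it harder for a model's default stereotype (point 2 in the Sora example above) to dominate your imagery.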

Can generative AI truly be unbiased in recruitment?

Absolutely not. Bias is baked in by humans at multiple levels:

  • Training data reflects historical representation gaps.

  • Generative algorithms replicate and often amplify these biases.

  • Prompts and instructions are written by humans with their own blind spots.

In recruitment, these biases can influence which candidates are attracted or even selected, potentially violating discrimination laws.

So what guardrails can we implement?

Here’s what hiring teams should implement:

  1. Human-in-the-loop authoring
    Always review and refine AI-drafted job adverts. Use inclusive-language checkers to flag biased wording, especially around gender, age and ability.

  2. Bias testing and auditing
    Run AI outputs through tools that measure inclusivity and identify skew. Partner with vendors that verify performance via third-party audits or red‑teaming.

  3. Train recruiters on AI outputs
    Provide guidance on responsible use: for instance, how to evaluate AI visuals and text, recognise bias, and tailor communications.

  4. Maintain prompt governance
    Document and version-control prompt templates. Regularly audit their performance and diversify training examples to reduce stereotyped outputs.

  5. Respect data laws
    Ensure prompts don’t expose personal or sensitive data, especially during candidate screening. Be ready to defend decisions with transparency.
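To make guardrail 1 concrete, here's a minimal sketch of the kind of check an inclusive-language tool performs: scanning an advert for words that research has found to be gender-coded. The word lists below are tiny and purely illustrative; real checkers rely on much larger, validated lexicons and context-aware rules.

```python
import re

# Illustrative word lists only -- real inclusive-language tools use
# far larger, research-validated lexicons of gender-coded terms.
MASCULINE_CODED = {"ninja", "rockstar", "dominant", "competitive", "fearless"}
FEMININE_CODED = {"nurturing", "supportive", "collaborative"}

def flag_coded_language(ad_text):
    """Return words in the advert that appear on either coded list."""
    words = re.findall(r"[a-z]+", ad_text.lower())
    hits = {"masculine": [], "feminine": []}
    for w in words:
        if w in MASCULINE_CODED:
            hits["masculine"].append(w)
        elif w in FEMININE_CODED:
            hits["feminine"].append(w)
    return hits

ad = "We need a competitive rockstar to join our supportive team."
print(flag_coded_language(ad))
# {'masculine': ['competitive', 'rockstar'], 'feminine': ['supportive']}
```

A flag isn't a verdict: the point is to surface candidate wording for the human reviewer in guardrail 1 to accept, reword, or balance.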

In 2025, 40% of UK businesses are expected to use AI-powered tools for hiring and recruitment processes.

Quick guide: should recruiters use generative AI?

Use case | Benefit | Risk | Best practice
--- | --- | --- | ---
Drafting job descriptions | Saves time, boosts output volume | Biased language reducing pool diversity | Use as a draft, refine, and check for bias
Writing candidate outreach emails | Personalised at scale | Feels generic, may misrepresent | Customise the tone and align with employer brand
Generating advert visuals | Covers specialty imagery needs | Reinforces stereotypes | Vet imagery, push for diversity in prompts
Resume screening / sorting | Faster shortlisting | Legal risk from biased selection | Combine AI screening with human oversight

Generative AI can add value to your hiring process. It can streamline copywriting and outreach. Yet, without proper controls, it can embed bias into every stage of the candidate experience.

To use these tools responsibly in recruitment:

  • Establish clear governance around prompt templates and versioning.

  • Mandate bias audits and human review of outputs.

  • Provide team training to recognise AI limitations.

With these guardrails, recruiters can leverage AI’s speed and scale, while ensuring fairness, inclusivity, and legal compliance.

Do you know the difference between Artificial Intelligence and Intelligent Automation?

Artificial Intelligence (AI) and Intelligent Automation (IA) – often known as Robotic Process Automation (RPA) – are changing the way organisations attract, assess and hire people. But when it comes to recruitment, how do you know which one is right for your needs and your situation?

In our practical guide, we break down the differences between AI and IA in recruitment, highlight the best use cases for each and explore how these tools can help you reduce time to hire, improve the candidate experience and increase efficiency – without over-complicating your processes.

Artificial Intelligence vs Intelligent Automation
Download the full guide