Hold your yawns! We need to talk about data privacy.
In 2025, data privacy has taken centre stage in recruitment. New legislation, increasing reliance on artificial intelligence, and shifting candidate expectations have reshaped how organisations must handle personal data. For recruiters and HR professionals, these changes aren’t just legal considerations – they’re central to building trust, protecting reputation and remaining competitive in the labour market.
So, what’s changed – and what should organisations be doing to stay on the front foot?
New legislation: The Data (Use and Access) Bill
A major development in the UK is the introduction of the Data (Use and Access) Bill, designed to modernise data protection post-Brexit. Although it aims to simplify compliance and support innovation, it also introduces responsibilities that directly affect recruitment.
Key points include:
- Refined lawful basis for processing: The bill clarifies what counts as ‘legitimate interests’. For recruiters, this means greater certainty when processing applicant data without consent – for example, during shortlisting or CV screening. However, it also demands that these interests are balanced carefully against the rights of the candidate.
- Changes to automated decision-making: The bill tightens controls around solely automated recruitment decisions (such as using AI to reject candidates). There must now be human oversight of decisions that have legal or similarly significant effects – such as hiring or rejection. A rough sketch of what this kind of routing can look like follows this list.
- Candidate rights: Enhanced transparency is a recurring theme. Candidates must be informed clearly and simply about how their data will be used, who it will be shared with, and how long it will be kept.
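To make the human-oversight point more concrete, here is a minimal sketch of how an ATS integration could route automated screening outcomes so that no rejection is finalised without a person in the loop. The field names, threshold and queue are illustrative assumptions, not a description of any particular product.

```python
from dataclasses import dataclass

@dataclass
class ScreeningResult:
    candidate_id: str
    recommendation: str   # e.g. "advance" or "reject", produced by an AI screening tool
    confidence: float     # model confidence between 0 and 1

def route_decision(result: ScreeningResult) -> str:
    """Decide whether a screening outcome can proceed automatically or needs human review.

    Anything with a legal or similarly significant effect (such as a rejection)
    is never finalised by the system alone - it goes to a recruiter to confirm
    or overturn.
    """
    if result.recommendation == "reject" or result.confidence < 0.8:
        return "human_review_queue"
    return "auto_progress"

# Example: an automated rejection is held for a recruiter to check.
print(route_decision(ScreeningResult("cand-001", "reject", 0.95)))  # human_review_queue
```

The exact rule matters less than the principle: the system recommends, a person decides.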
What you should do:
- Review your privacy notice for candidates to ensure it's clear, concise and reflects the latest legal changes.
- Audit your applicant tracking system (ATS) or recruitment tools for any automated processes. Add human review where needed to remain compliant.
AI and fairness in recruitment
The rise of AI-driven recruitment tools has made hiring more efficient – but it has also raised concerns about fairness, accuracy and accountability.
In 2025, many organisations are reassessing their AI usage following a wave of regulatory scrutiny and ethical concerns.
The EU’s AI Act (which, while not UK law, influences best practice) classifies recruitment AI as “high risk”, requiring extra transparency and human involvement.
Common concerns include:
- Algorithms amplifying bias due to flawed training data
- Candidates being filtered out unfairly by automated CV screening tools
- Lack of clarity on how AI makes decisions
What you should do:
- Conduct regular impact assessments on any recruitment AI you use. These assessments should test for bias, explainability and reliability – one simple bias check is sketched after this list.
- Train your HR teams to understand how AI decisions are made – and how to challenge them when needed.
- Choose vendors and systems (including your ATS) that offer explainable AI features and allow you to maintain human control.
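As an example of what one bias test inside an impact assessment might look like, the sketch below compares shortlisting rates between two groups using the widely cited 'four-fifths' rule of thumb. The groups and outcome data are invented for illustration; a real assessment would run on your own outcomes and include more than a single metric.

```python
from collections import defaultdict

# Hypothetical screening outcomes: (group, was_shortlisted)
outcomes = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

totals, shortlisted = defaultdict(int), defaultdict(int)
for group, passed in outcomes:
    totals[group] += 1
    shortlisted[group] += int(passed)

rates = {group: shortlisted[group] / totals[group] for group in totals}
best_rate = max(rates.values())

for group, rate in rates.items():
    ratio = rate / best_rate
    flag = "REVIEW" if ratio < 0.8 else "ok"   # four-fifths rule of thumb
    print(f"{group}: shortlist rate {rate:.0%}, ratio vs best {ratio:.2f} -> {flag}")
```

A ratio below 0.8 doesn't prove bias on its own, but it is a clear signal that the tool's outcomes deserve a closer look.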
Data subject access requests (DSARs) on the rise
In 2025, jobseekers are far more aware of their data rights. There has been a sharp increase in DSARs – where candidates request access to all personal data an organisation holds on them.
Responding to these requests can be time-consuming and complex, particularly where multiple recruitment tools or agencies are involved.
What you should do:
- Map where candidate data sits – your ATS, inboxes, spreadsheets and any agencies you work with – so you can locate it quickly when a request arrives.
- Put a documented process in place for logging DSARs and responding within the statutory one-month deadline.
- Hold only the candidate data you genuinely need; the less you store, the simpler every request becomes.
Cross-border recruitment and international data transfers
Remote and hybrid work has created global hiring opportunities – but with that comes greater scrutiny of how personal data moves across borders.
The UK's data adequacy agreement with the EU is under review. If the UK diverges too far from EU standards, it may lose this status – which would significantly complicate EU-UK data flows.
What you should do:
- For now, continue to treat EU candidate data with the same high level of protection.
- Use appropriate safeguards – such as Standard Contractual Clauses (SCCs), or the UK’s International Data Transfer Agreement (IDTA) for transfers from the UK – when sending candidate data outside the UK or EEA.
- Monitor the status of the UK’s adequacy agreement and be ready to adapt quickly if the legal landscape changes.

Privacy-enhancing technologies (PETs)
In response to stricter privacy standards, privacy-enhancing technologies (PETs) are gaining traction in recruitment. These are tools and techniques that allow data to be used and analysed while reducing privacy risks.
Examples include:
- Data anonymisation: Ideal for recruitment analytics, this strips identifying details from candidate data so trends can be studied without exposing individuals. We offer anonymous shortlisting as part of our ATS. Worth a thought!
- Federated learning: Allows AI models to train on decentralised data (e.g. across different branches of an organisation) without centralising all the raw data.
- Differential privacy: Introduces small amounts of statistical 'noise' into datasets to protect individual entries while preserving overall accuracy – a short sketch follows below.
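As a rough illustration of the differential-privacy idea, the snippet below adds calibrated Laplace noise to a simple applicant count before it is reported. The query, epsilon value and sensitivity are assumptions chosen for the example; real deployments need careful parameter and mechanism choices.

```python
import numpy as np

def dp_count(true_count: int, epsilon: float = 1.0, sensitivity: float = 1.0) -> float:
    """Report a count with Laplace noise added - the classic differential-privacy mechanism.

    sensitivity is how much one individual can change the count (1 for a simple count);
    a smaller epsilon means more noise and stronger privacy.
    """
    scale = sensitivity / epsilon
    return true_count + np.random.laplace(loc=0.0, scale=scale)

# Example: report how many applicants came from a given source, without letting
# any single applicant's presence be pinned down exactly.
print(round(dp_count(true_count=42, epsilon=0.5), 1))
```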
What you should do:
- Ask your ATS provider or technology vendors what PETs they support.
- Introduce anonymisation or pseudonymisation when sharing candidate data internally, especially for reporting or decision-making not directly linked to hiring – a simple pseudonymisation sketch follows below.
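As a small illustration of the pseudonymisation point above, the sketch below replaces direct identifiers with a keyed hash before a record is shared for reporting, keeping only the fields the report actually needs. The field names and key handling are assumptions for the example; in practice the key should live in a secrets manager and access to any re-identification mapping should be tightly controlled.

```python
import hashlib
import hmac

SECRET_KEY = b"keep-this-out-of-the-report"  # illustrative; store securely in practice

def pseudonymise(candidate: dict) -> dict:
    """Swap direct identifiers for a stable, keyed pseudonym and drop everything else."""
    token = hmac.new(SECRET_KEY, candidate["email"].encode(), hashlib.sha256).hexdigest()[:12]
    return {
        "candidate_token": token,                          # not reversible without the key
        "role_applied_for": candidate["role_applied_for"],
        "stage_reached": candidate["stage_reached"],
    }

record = {"name": "Jane Doe", "email": "jane@example.com",
          "role_applied_for": "Data Analyst", "stage_reached": "Interview"}
print(pseudonymise(record))
```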
Building a privacy-first recruitment culture
Privacy isn’t just about tools and legal compliance – it’s a cultural shift. Candidates increasingly choose to apply for roles with organisations they trust to handle their data ethically.
Steps to take:
- Train your teams: Regular data protection training should include real recruitment examples – like what to do when a manager saves CVs on a personal device.
- Champion transparency: Clearly explain to candidates how and why their data is being used. Offer opt-ins where possible.
- Limit retention: Avoid keeping candidate data “just in case”. Set clear deletion policies and automate them where you can – see the sketch after this list.
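To show what automating a deletion policy might look like, here is a minimal sketch of a clean-up job that keeps only candidate records still inside a chosen retention period. The six-month period, the data structure and the field names are assumptions for the example – the policy itself, and where your candidate data actually lives, will be specific to your organisation.

```python
from datetime import datetime, timedelta, timezone

RETENTION_PERIOD = timedelta(days=180)  # example policy: six months after last activity

def purge_expired(candidates: list[dict], now: datetime) -> list[dict]:
    """Return only the records still within the retention period; the rest are due for deletion."""
    return [c for c in candidates if now - c["last_activity"] <= RETENTION_PERIOD]

candidates = [
    {"id": "cand-001", "last_activity": datetime(2025, 1, 10, tzinfo=timezone.utc)},
    {"id": "cand-002", "last_activity": datetime(2024, 3, 2, tzinfo=timezone.utc)},
]
print([c["id"] for c in purge_expired(candidates, now=datetime(2025, 6, 1, tzinfo=timezone.utc))])
# ['cand-001'] - the older record falls outside the policy and would be deleted
```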
In 2025, data privacy in recruitment is no longer a background consideration. It’s central to how employers attract, engage and retain top talent – and to how candidates decide where to apply.
From new legal frameworks to smarter technologies and rising expectations, the landscape is evolving quickly. Our applicant tracking system (ATS) is already built with these privacy-first principles in mind – helping organisations stay compliant, efficient and candidate-friendly.
For HR teams and recruiters, the message is clear: get ahead now, and you won’t just be ticking boxes – you’ll be building better, fairer hiring experiences that stand the test of time.