The Digital Complication

The Screening Machine

AI Hiring Bias Simulator

Every day, thousands of CVs are fed into AI screening systems that score, rank, and filter candidates before a human ever sees them. The companies that build these systems say they're objective. They say the algorithm doesn't see age, race, gender, or disability. They're wrong.

The algorithm doesn't need to see your date of birth. It can infer your age from your graduation year, the technologies you list, the number of roles on your CV. It doesn't need to see your ethnicity. It can infer it from your name, your postcode, your university. It doesn't need to know you have a disability. It can see the gap in your employment history. And it penalises all of it.
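That inference step is trivial to write, which is part of the problem. The sketch below is hypothetical: the field names and the graduation-year heuristic are assumptions invented for illustration, not any vendor's code. It shows how ordinary CV fields leak protected characteristics without ever naming them:

```python
from datetime import date

def infer_proxies(cv: dict) -> dict:
    """Illustrative sketch: none of these inputs states age, ethnicity,
    or disability, yet each one leaks a protected characteristic."""
    proxies = {}
    # Graduation year is a near-perfect age proxy (assume ~22 at graduation).
    if "graduation_year" in cv:
        proxies["inferred_age"] = date.today().year - cv["graduation_year"] + 22
    # An employment gap often encodes illness, disability, or caring duties.
    gaps = [end - start for start, end in cv.get("employment_gaps", [])]
    proxies["has_career_gap"] = any(g >= 1 for g in gaps)
    # First names and postcodes correlate strongly with ethnicity and class.
    proxies["name_token"] = cv["name"].split()[0].lower() if cv.get("name") else None
    return proxies
```

A model trained on biased outcomes does not need these features spelled out; it finds the correlations on its own. The sketch just makes the leakage explicit.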

In 2018, Amazon scrapped its AI hiring tool after discovering it had taught itself to penalise CVs containing the word "women's." It was trained on a decade of hiring data that was predominantly male. The machine learned that male was the default. Amazon caught it. Most companies don't check.

This simulator lets you build a candidate profile and watch the score change in real time. The penalties shown are illustrative, but they reflect patterns documented in academic research and employment tribunals. The Equality Act 2010 protects against discrimination on the basis of nine characteristics. AI doesn't care about the Equality Act. It cares about patterns. And the patterns are biased.
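The mechanics of a penalty-driven screen fit in a few lines. Everything below is hypothetical: the penalty values, flag names, and thresholds are invented to mirror the simulator's illustrative numbers, not the output of any real ATS.

```python
# Hypothetical penalty table; values are illustrative, like the simulator's.
PENALTIES = {
    "inferred_age_over_50": -12,
    "career_gap": -8,
    "non_anglo_name": -10,
}

def screen(base_score: int, flags: set[str]) -> tuple[int, str]:
    """Apply penalties for inferred traits, then bucket into an outcome."""
    score = base_score + sum(PENALTIES.get(f, 0) for f in flags)
    if score >= 90:
        verdict = "Interview"
    elif score >= 75:
        verdict = "Shortlist"
    elif score >= 50:
        verdict = "Review"
    else:
        verdict = "Reject"
    return score, verdict
```

Note what the function never receives: a date of birth, an ethnicity, a disability status. It only ever sees the proxies, which is exactly why the discrimination is hard to prove.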

Candidate Profile
Experience
Education
Skills Match

EQUALITY ACT 2010 — THESE SHOULD NOT AFFECT SCORING
Age
Gender
Ethnicity
Name Style
Disability
NO
Career Gap
NO
Trans / GRC
NO
Religion

WHO IS HOLDING THE DIAL
Government
DEI Policy
Sector

HOW THE ALGORITHM ACTUALLY READS BETWEEN THE LINES
Postcode
School Type
Photo on CV
NO
Screening Result
ATS Compatibility Score
Ref: --
--
Awaiting scan
Reject · Review · Shortlist · Interview
--
Submit a candidate profile to begin screening.
EQUALITY ACT 2010 — PROTECTED CHARACTERISTICS

Age. Disability. Gender reassignment. Marriage and civil partnership. Pregnancy and maternity. Race. Religion or belief. Sex. Sexual orientation.

These nine characteristics are protected by law. Discrimination on the basis of any of them is illegal in the UK. An AI system that uses proxies for these characteristics to filter candidates is, by definition, discriminating. The fact that it does so through pattern matching rather than explicit rules does not make it legal. It makes it harder to prove.

Disclaimer

This is a satirical art piece. No real CV screening takes place on this page. No personal data is collected, processed, or scored. The numbers, penalties, and policy modifiers shown are illustrative and designed to make invisible patterns visible. They are informed by published research and documented cases but do not represent the output of any real ATS or recruitment system. If you are building or deploying AI screening tools, audit them. If you are being screened by one, know your rights under the Equality Act 2010 and UK GDPR Article 22.