
What Lesson Can We Learn from Amazon's Sexist Algorithm?

Three years ago, Amazon experimented with automating resume screening. The algorithm systematically downgraded women's resumes. What lessons can HR learn from this failed experiment?


Table of Contents:

  1. BREAKING NEWS
  2. #LongStoryShort... what is this really about?
  3. An algorithm isn't born "Sexist"... it becomes one.
  4. Should we abandon algorithms in recruitment altogether?
  5. You can automate your pre-screening without the "Amazon effect"
  6. The results of predictive recruitment

If you work in HR in any capacity (and especially in recruitment), you've surely seen the news...

BREAKING NEWS

Image: LinkedIn Screenshot


Dozens upon dozens of posts like this have been published since last weekend (news sites, LinkedIn...) to spread the word...

#LongStoryShort... what is this really about?

Three years ago, Amazon conducted an experiment to automate resume-based pre-screening.

The idea: create an algorithm capable of analyzing 100 resumes, rating them on a 5-star scale, and selecting the top 5 profiles with the highest probability of success at the company.

And then... incredible!

It turned out it didn't work.

Even worse, the algorithm was systematically downgrading women's resumes.

#NotFair #ExposeYourAlgorithm #AiIsTheNewPig

As HR professionals, what lesson should we take from this "failed experiment"?

An algorithm isn't born "Sexist"... it becomes one.

An algorithm is never biased, sexist, racist, or discriminatory "by nature."

It is merely the formalization of a process configured... by humans.

In Amazon's case, the AI was trained on data from resumes of people already employed at the company.

It therefore reproduced the decisions made by HR teams over several years.

1 — First, the resume question

By training an AI on resumes, you're only providing it with:

– academic backgrounds (degrees, schools)

– professional backgrounds (positions, duration, sectors)

In other words, nothing about who the person actually is.

2 — The nature of the sample

At the time, Amazon's workforce was over 60% male.

An AI trained on a predominantly male sample logically learns... male-biased rules (see the sketch at the end of this section).

3 — The human biases of the HR team

Humans are fallible and subject to numerous biases.

The algorithm simply modeled this system, and therefore reproduced these biases.

Important:

If you applied the same AI to resumes from other companies, you would likely get algorithms that discriminate against:

– women

– people over 45

– people with disabilities

– people with foreign-sounding names

... and others.
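To make this mechanism concrete, here is a minimal, purely illustrative Python sketch (synthetic data, invented feature names, made-up coefficients; this is not Amazon's system): a toy screening model is trained on historical "hired" labels that penalized a gender-correlated resume keyword, and it learns to penalize that keyword in turn.

```python
# Purely illustrative sketch (not Amazon's actual system): a toy screening
# model trained on synthetic "historical hiring" decisions from a
# male-dominated sample. Feature names and numbers are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

years_experience = rng.normal(5, 2, n)     # a legitimate signal
womens_keyword = rng.binomial(1, 0.3, n)   # gender-correlated resume term,
                                           # e.g. "women's chess club captain"

# Past "hired" labels: skill mattered, but historical decisions also
# penalized the gender-correlated keyword. That is the human bias.
logit = 0.8 * (years_experience - 5) - 1.5 * womens_keyword
hired = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([years_experience, womens_keyword])
model = LogisticRegression().fit(X, hired)

# The keyword's learned coefficient comes out strongly negative: the model
# has faithfully reproduced the bias baked into its training labels.
print(dict(zip(["years_experience", "womens_keyword"], model.coef_[0])))
```

Nothing in the code itself is "sexist": the bias lives entirely in the training labels it was handed.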

Should we abandon algorithms in recruitment altogether?

Missing out on 50% of talent is a massive problem.

But concluding that algorithms are "dangerous" or "not ready" would be a mistake comparable to saying we should ban cars because a drunk driver had an accident...

Isn't the real problem... resume-based screening itself?

Everyone agrees that resumes are not a good predictor of on-the-job success.

Yet virtually every company continues to use them as the primary filter.

Image: "What the f*** is wrong with you people?"


By using weak data (resumes), from a male-dominated sample, and running them through an AI... it's hard to expect anything other than biased results.

Moreover, even in CAC40 companies with Data Scientists, HR often remains the "poor relation": plenty of resources for marketing, business... but rarely for HR.

Yet today, any company can do genuinely better pre-screening.

Let's compare:

On one side:

the resume

(academic and professional background)

On the other:

who the person really is

(intellectual agility, learning ability, motivations, personality, workplace behaviors)

Which of these two categories truly predicts performance, integration, and cultural fit?

You can automate your pre-screening... without reproducing the "Amazon effect"

Using recruitment tests alone isn't enough.

They need to be paired with a predictive model, which means defining:

– the expected criteria

– and their link to on-the-job performance

How do you create a predictive model?

Step 1:

Assess current employees using questionnaires covering:

– personality

– motivations

– reasoning

Step 2:

Evaluate their performance level on the criteria you're trying to predict.
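As a rough sketch of these two steps (invented column names and synthetic numbers, not a production model), you could link employees' assessment scores to their rated performance with a simple regression and then use it to score new candidates:

```python
# Illustrative sketch of Steps 1 and 2 with synthetic data. A real model
# would rely on validated assessments and proper validation.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)

# Step 1: assessment scores for 200 current employees.
# Columns = personality, motivations, reasoning, each on a 0-100 scale.
employee_scores = rng.uniform(0, 100, size=(200, 3))

# Step 2: the performance rating you want to predict (here simulated as a
# noisy combination of the three scores, purely for illustration).
performance = employee_scores @ np.array([0.5, 0.2, 0.3]) + rng.normal(0, 5, 200)

# The predictive model: link the assessment criteria to on-the-job performance.
model = LinearRegression().fit(employee_scores, performance)

# A new candidate's assessment results become a predicted performance score,
# which can be used to rank the applicant pool.
candidate = np.array([[72, 60, 85]])
print(model.predict(candidate))
```

The specific model matters less than the principle: without the performance data from Step 2, there is nothing to predict.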

"We don't know, we don't have the data!"

If a company can't evaluate performance, no tool will be able to improve its hiring quality.

It may improve the candidate experience, generate statistics... but it will never predict success.

If an "expert" claims they can help you without asking for this data, run.

"What if my company has historically hired mostly men?"

No risk of reproducing an "Amazon-style" bias.

Psychological and behavioral criteria are distributed fairly evenly across the population, regardless of gender, age, or background.

The results of predictive recruitment

On average, companies using predictive recruitment:

– select employees who perform 20% better

– accelerate their processes by 30%

– reduce their costs by approximately 20%

– decrease turnover by 50% in sensitive positions

Is this magic?

Not at all.

They simply started recruiting by focusing on who people really are, rather than their resume alone.

And guess what?

When you genuinely focus on people... you get better results.

Wild, right?
