In an age when artificial intelligence is rapidly reshaping how companies hire, one major tech firm is finding itself in hot water.
Workday, a major provider of human resources software whose AI-driven recruitment tools are used by thousands of employers, is now facing serious legal action over claims that those tools are biased, particularly against older job applicants.
A Job Seeker’s Frustration Turns Into a Nationwide Lawsuit
It all started with Derek Mobley, a Black man in his 40s who also lives with anxiety and depression.
After getting rejected from more than 100 jobs over the course of seven years—with little to no feedback—Mobley began to suspect something more than bad luck was at play.
He decided to take legal action against Workday in 2024, claiming the company’s AI-powered recruitment technology played a role in repeatedly screening him out.
Since Mobley filed his lawsuit, four more plaintiffs over the age of 40 have joined the case, all alleging similar experiences of age discrimination.
The Judge Says It Can Move Forward
Earlier this month, U.S. District Judge Rita Lin gave the green light for the lawsuit to proceed as a collective action—akin to a class action lawsuit.
That means it could potentially represent a broad group of people who believe they’ve faced similar treatment.
According to the lawsuit, these job seekers were rejected quickly and automatically, sometimes in under an hour, leading them to believe no human ever reviewed their applications.
Court documents claim that Workday’s algorithm “disproportionately disqualifies individuals over the age of 40 from securing gainful employment.”
Workday Pushes Back on the Allegations
Workday has strongly denied all of the claims. The company, which supplies its recruitment tools to around 11,000 businesses worldwide, insists that it doesn’t actually make hiring decisions.
Instead, Workday says its AI tools, like “HiredScore AI,” simply help employers streamline the hiring process by ranking applicants and filtering resumes more efficiently.
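To give a sense of what "ranking applicants and filtering resumes" can mean in practice, the sketch below shows a generic, purely hypothetical screening routine: it scores each resume against a fixed requirements list and auto-rejects anything under a cutoff, with no human in the loop. The skill list, weights, and threshold are invented for illustration and say nothing about how Workday's or HiredScore's products actually work.

```python
# Purely hypothetical sketch of an automated resume screener.
# Nothing here reflects Workday's or HiredScore's actual logic.
from dataclasses import dataclass

@dataclass
class Applicant:
    name: str
    years_experience: float
    skills: set

# Invented job requirements and cutoff, for illustration only.
REQUIRED_SKILLS = {"python", "sql", "communication"}
MIN_SCORE = 0.6

def score(applicant: Applicant) -> float:
    """Blend skill overlap with (capped) years of experience into one number."""
    skill_match = len(applicant.skills & REQUIRED_SKILLS) / len(REQUIRED_SKILLS)
    experience = min(applicant.years_experience / 10.0, 1.0)
    return 0.7 * skill_match + 0.3 * experience

def screen(applicants: list) -> tuple:
    """Rank applicants by score, then auto-advance or auto-reject against the cutoff."""
    ranked = sorted(applicants, key=score, reverse=True)
    advanced = [a for a in ranked if score(a) >= MIN_SCORE]
    rejected = [a for a in ranked if score(a) < MIN_SCORE]
    return advanced, rejected

if __name__ == "__main__":
    pool = [
        Applicant("A", 12, {"python", "sql", "communication"}),
        Applicant("B", 3, {"java"}),
    ]
    advanced, rejected = screen(pool)
    print([a.name for a in advanced], [a.name for a in rejected])
```

A fully automated pipeline of this shape would also explain how a rejection can arrive within an hour of applying, at any time of day.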
A company spokesperson told DailyMail.com that the current ruling is merely procedural, not a judgment on the case’s merits.
“We continue to believe this case is without merit,” they said.
“This is a preliminary ruling, and we’re confident that once the facts are fully presented, the claims will be dismissed.”
AI Bias: A Growing Concern in the Hiring World
While Workday denies any wrongdoing, this case highlights a larger and more troubling question: can AI-driven hiring tools really be unbiased?
Experts and civil rights organizations, like the ACLU, have long warned that algorithmic hiring systems risk replicating and amplifying existing discrimination in the workforce.
They say that even if the bias isn’t intentional, it can still have damaging effects.
In fact, Amazon reportedly scrapped an experimental recruiting AI tool in 2018 after it was found to favor male candidates over female ones.
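How would anyone tell whether such a system "disproportionately disqualifies" a protected group, as the lawsuit alleges? One widely used yardstick in U.S. employment-discrimination analysis is the "four-fifths rule" from the EEOC's Uniform Guidelines, which flags possible adverse impact when one group's selection rate falls below 80 percent of another's. The numbers below are invented purely to demonstrate the arithmetic and have no connection to the evidence in the Workday case.

```python
# Illustrative adverse-impact check using the EEOC "four-fifths rule".
# All counts below are made up for demonstration; they are not case data.

def selection_rate(selected: int, applied: int) -> float:
    """Share of applicants in a group who pass the screen."""
    return selected / applied

# Hypothetical screening outcomes by age group.
rate_over_40 = selection_rate(selected=30, applied=400)    # 7.5%
rate_under_40 = selection_rate(selected=90, applied=600)   # 15.0%

impact_ratio = rate_over_40 / rate_under_40                # 0.50
flagged = impact_ratio < 0.8                               # four-fifths threshold

print(f"Impact ratio: {impact_ratio:.2f}; adverse impact flagged: {flagged}")
```

An impact ratio that low would typically prompt closer statistical and legal scrutiny, though it would not by itself prove discrimination.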
More Job Seekers Share Their Frustrations
One of the co-plaintiffs, Jill Hughes, said she too experienced strange rejection patterns.
Her applications were often turned down within hours, sometimes at odd times of day that suggested no human had ever seen them.
In one case, she was told she didn’t meet the “minimum requirements” for a job she was fully qualified for—another red flag that led her to suspect automation was to blame.
Mobley, for his part, recalled submitting a job application at 12:55 a.m., only to be rejected less than an hour later—by 1:50 a.m.
And this wasn’t a one-off; it happened dozens of times, despite his strong academic record and years of professional experience.
The Stakes Could Be Huge
Now that Judge Lin has approved the case to proceed as a nationwide collective action, Mobley’s legal team can begin reaching out to others who may have had similar experiences with Workday’s recruitment software.
The plaintiffs are seeking monetary damages and a court order that would require the company to change how its technology is used.
While the legal process could still take months—or even years—this case may set a major precedent for how courts view algorithmic hiring.
It could also put more pressure on companies to ensure their AI tools are not just efficient, but fair and inclusive.