
It’s Not Workday’s Fault. It’s Your Hiring Data.

Updated: Aug 4


Why the Workday Discrimination Lawsuit Should Be a Wake-Up Call for HR Leaders.


When news broke of a class-action lawsuit accusing Workday of enabling discrimination through its AI hiring tools, it sent shockwaves through the HR and tech community. The case is positioned to set a major precedent for AI in Human Capital Management.

As someone who has spent over twenty-five years in talent strategy, technology, and inclusion, I had two immediate reactions:

  1. This was inevitable.

  2. Workday isn’t the problem. Your data is.


AI Doesn’t Create Bias. It Inherits It.

Let’s be clear: AI has no propensity for bias of its own. Humans do, and human cognitive bias is a well-established concept in psychology and cognitive science.

Algorithms, especially those in talent platforms, are only as good—or as flawed—as the data we feed them.

In this case, what most people don’t realize is that tools like Workday rely heavily on inputs such as:

  • Job descriptions

  • Historical hiring decisions

  • Resumes of past “successful” candidates

  • Company-defined filters and preferences


So, when bias shows up in the outcomes, we shouldn’t just blame the system—we should ask: What patterns and priorities have we hard-coded into our hiring culture without even realizing it? AI is NOT the problem. And once you understand what IS the problem, AI can provide several solutions.

 

The Real Culprit: Bad, Biased Job Descriptions

In my 25 years in the Human Capital Management outsourcing industry, I’ve worked with well over a hundred customers of all sizes and across every major industry. One theme is constant: job descriptions are broken. I know because I’ve seen them.

They’re:

  • Vague and bloated with outdated requirements

  • Coded with gendered or exclusionary language

  • Modeled on the subjective profiles of people who are already in the job

If you’ve ever written a job description that says “must have a degree from a top university” or “minimum 10 years of experience in a startup” … congratulations, you’ve just unintentionally biased the algorithm.
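To make this concrete, here is a minimal sketch of the kind of check an AI (or even a simple script) can run over a job description before it ever reaches a matching algorithm. The phrase list is illustrative, not exhaustive, and the function name is my own invention, not part of any vendor's product:

```python
# Minimal sketch: flag potentially exclusionary phrases in a job description.
# The phrase list below is illustrative only; a real audit would use a
# validated, much larger lexicon.
EXCLUSIONARY_PHRASES = [
    "top university",
    "rockstar",
    "ninja",
    "aggressive",
    "native english speaker",
]

def flag_phrases(job_description: str) -> list[str]:
    """Return the flagged phrases found in the job description text."""
    text = job_description.lower()
    return [p for p in EXCLUSIONARY_PHRASES if p in text]

jd = "Seeking a rockstar engineer with a degree from a top university."
print(flag_phrases(jd))  # ['top university', 'rockstar']
```

Even a crude screen like this surfaces the pedigree-coded language that quietly teaches an algorithm to prefer certain backgrounds over actual capability.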

I know a little something about outsourcing to companies like Workday – but outsourcing a process does not mean you can outsource responsibility for it.

Blaming Workday or any HCM tech vendor for algorithmic bias is like blaming a spreadsheet for a bad budget. These tools amplify what’s already in your system—they don’t invent it.


Shift the Focus: From Blame to Better Design

Here’s how companies can respond constructively:

Audit Your Hiring Data

  • Analyze past hiring patterns: who gets through, who doesn’t—and why.

  • Clean up job descriptions to reflect real, inclusive skill requirements (use AI to help).

  • Job description cleanup not viable right now? Use AI to measure your bias, then instruct the model to minimize or eliminate it.
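The first of those audit steps can be started with very little tooling. Here is a minimal sketch of a selection-rate comparison using the "four-fifths rule" heuristic from US EEOC guidance on adverse impact; the applicant counts and group names are hypothetical, and a real audit would use your own pipeline data and legal review:

```python
# Minimal sketch: compare selection rates across applicant groups using the
# "four-fifths rule" heuristic from US EEOC guidance.
# All counts and group labels below are hypothetical.

def selection_rate(hired: int, applied: int) -> float:
    """Fraction of applicants in a group who were hired."""
    return hired / applied

def adverse_impact_ratio(group_rate: float, highest_rate: float) -> float:
    """A ratio below 0.8 is a common flag for potential adverse impact."""
    return group_rate / highest_rate

# Hypothetical audit data: {group: (hired, applied)}
pipeline = {"group_a": (48, 120), "group_b": (21, 90)}
rates = {g: selection_rate(h, a) for g, (h, a) in pipeline.items()}
highest = max(rates.values())

for group, rate in rates.items():
    ratio = adverse_impact_ratio(rate, highest)
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.2f}, impact ratio {ratio:.2f} -> {flag}")
```

A ratio under 0.8 doesn’t prove discrimination, but it tells you exactly where to look before an algorithm—or a lawsuit—does it for you.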

Stop Using "Success Profiles" Based on Yesterday’s Leaders

  • If your current leadership team is 90% white men of a certain age, do you really want your AI optimizing for that?

Use Validated, Skills-Based Assessments in Hiring

  • Tools like PowerSkillsAssessment™ (used in platforms like Ignis AI) provide structured, bias-resistant ways to evaluate what really matters: capability, not pedigree.

Train Your Teams, Not Just Your Tools

  • Hiring managers and recruiters still make final calls. Let’s make sure they understand where bias hides—and how to fight it.

  • Give your hiring managers and recruiters the tools and insights to make science-backed, data-driven hiring decisions—not just rely on how well algorithms match resumes to job descriptions (this is what Ignis AI is building to address the future of the workforce).


The Workday Lawsuit Is a Wake-Up Call—Not a Warning Label against AI

This moment should push us not to fear AI, but to understand it better and to get serious about the quality of what we feed into it.

These technologies provide so many solutions and they keep getting better. But until you understand how to work with your own data—hiring histories, job specs, implicit definitions of “fit”—you’ll keep getting the same results from different tools.

It’s not Workday’s fault. It’s your hiring data. Fix that—and the tools will work just fine.


Final Thought

Of course, I must caveat that we do not know all the details of the Workday lawsuit yet; more information could come to light. But that does not change anything said here. Having been in this business as long as I have, I’ve seen fads come and go, I’ve seen other companies propose solutions, and I’ve seen how poor, outdated, and inaccurate data causes many of the issues and headaches in this industry.


Here at Ignis AI we are poised to deliver real solutions to existing workforce problems, including this one. If you’re looking for help in building a people-first, future-ready workforce, let’s talk.


Unlock potential. Transform careers. Elevate the future of work.

 
 
 