Workday Age Discrimination Lawsuit: How AI Screening May Have Rejected Millions of 40+ Applicants
By EmployArmor Team
Published: March 6, 2026 (updated March 20, 2026)
9 min read
Category: Lawsuit Analysis
Derek Mobley applied for over 100 jobs through company websites powered by Workday's AI-based hiring tools. He was rejected every single time—often before a human recruiter ever saw his application.
Mobley is a Black man over the age of 40 who lives with anxiety and depression. He believes Workday's algorithms systematically screened him out not because he was unqualified, but because the AI learned to discriminate against people like him.
In February 2023, he filed what would become one of the most consequential employment lawsuits of the AI era. The case has since been conditionally certified as a nationwide collective action and, as of March 2026, has survived a major motion to dismiss, with a ruling that reshapes employer liability across the entire AI hiring industry.
🔴 Case Update — March 20, 2026
U.S. District Judge Rita F. Lin rejected Workday's motion to dismiss on ADEA grounds, ruling that the Age Discrimination in Employment Act covers job applicants — not just current employees. Workday had argued it bore no liability for screening out candidates who never worked there. The court disagreed. The case now moves forward with 11,500+ employer clients potentially exposed. See the March 20 update below →
The Case at a Glance: Key Facts on Mobley v. Workday
This landmark lawsuit, Mobley v. Workday, Inc., filed in the U.S. District Court for the Northern District of California, underscores the growing scrutiny on AI-driven hiring practices. Here's a quick overview of the essential details:
- Case Name: Mobley v. Workday, Inc.
- Filed: February 2023 (amended February 2024)
- Court: U.S. District Court, Northern District of California
- Judge: Hon. Rita F. Lin
- Type: Collective Action (conditionally certified May 2025 under the ADEA)
- Allegations: Age discrimination under the ADEA; additional claims of race and disability bias
- Potential Class Size: Millions of applicants aged 40+ who applied via Workday since September 2020 and were rejected by AI screening
- Opt-In Deadline: March 7, 2026 (closed)
- Latest Development: Motion to dismiss denied on ADEA grounds — March 20, 2026
Quotable Fact: According to court documents, the potential class could encompass "tens of millions" of older job seekers, highlighting the scale of AI's impact on the U.S. workforce. This certification marks the first major collective action targeting an AI vendor for disparate impact discrimination, setting a precedent for platforms like Workday that power hiring for over 10,000 organizations worldwide.
For employers, this isn't just news—it's a compliance wake-up call. If your organization relies on AI tools for recruitment, understanding Mobley v. Workday is critical to avoiding similar legal exposure.
What Is Workday? Understanding the HR Tech Giant and Its AI Tools
Workday, Inc., founded in 2005 and headquartered in Pleasanton, California, is a leading provider of cloud-based enterprise software for human resources, finance, and analytics. The company serves thousands of employers, including more than 50% of Fortune 500 companies such as Netflix, Amazon, and Walmart. Its Human Capital Management (HCM) suite, including the recruiting module, processes millions of job applications annually.
When a candidate applies for a position at a Workday-using employer, the process typically involves AI-powered screening tools integrated into the platform. These tools, which include Workday's Skills Cloud and acquired technologies like HiredScore, perform the following functions:
- Resume Parsing and Analysis: Algorithms extract and evaluate skills, experience, and qualifications using natural language processing (NLP).
- Candidate Scoring: Predictive models assign "fit" scores based on historical hiring data, job descriptions, and inferred attributes like age or location.
- Automated Decisions: Low-scoring applicants are often rejected automatically, while high scorers advance to human review.
- Assessments: Optional modules include personality tests, cognitive evaluations, and video interview analysis powered by AI.
The Mobley v. Workday complaint alleges that these tools perpetuate bias through "disparate impact." Specifically, the AI learns from datasets reflecting decades of workplace inequities, where older workers (40+) were underrepresented in tech and high-growth roles. As a result, the system may flag age-related proxies—such as graduation dates, employment gaps, or traditional career paths—as negative signals.
Quotable Fact: Workday's platform screened over 250 million job applications in 2025 alone, according to the company's annual report, amplifying the potential reach of any discriminatory patterns to a national scale.
This case challenges the neutrality of AI in HR tech, emphasizing that even unintentional biases can violate federal laws like the ADEA, which prohibits age discrimination against workers 40 and older.
The Legal Foundation: Disparate Impact Theory in AI Hiring Discrimination
At its core, Mobley v. Workday relies on the doctrine of disparate impact, a key enforcement mechanism under anti-discrimination laws. Established in the 1971 Supreme Court case Griggs v. Duke Power Co., disparate impact holds that employment practices—even those that appear neutral—can be unlawful if they disproportionately exclude protected groups without a legitimate business justification.
Under the ADEA (29 U.S.C. § 621 et seq.), employers and their agents cannot use criteria that adversely affect older applicants unless proven job-related and consistent with business necessity. Plaintiff Derek Mobley claims Workday's AI rejects 40+ candidates at rates up to 30% higher than younger applicants with comparable qualifications, based on preliminary statistical analysis in the amended complaint.
Why Disparate Impact Is Pivotal for AI Compliance
AI systems are "black boxes" trained on historical data. If that data embeds biases—e.g., older workers comprising only 20% of tech hires despite making up 40% of the labor force—the algorithm replicates them at scale. As Judge Lin noted in her July 2024 ruling: "The software participates in the decision-making process," making Workday potentially liable as an agent under Title VII and the ADEA.
This theory extends beyond age to race (under Title VII) and disability (under the ADA), as Mobley's claims allege intersectional discrimination. For HR leaders, this means auditing AI for adverse effects is no longer optional—it's a legal imperative.
Timeline: How Mobley v. Workday Evolved into a Nationwide Collective Action
The path to certification has been marked by procedural battles, expert interventions, and evolving judicial interpretations. Here's a detailed chronology:
February 2023: Complaint Filed
Mobley, represented by the law firm Wiggins Childs Pantazis Fisher & Goldfarb, initiates the suit in the Northern District of California. He alleges applying to over 80 jobs via Workday-powered portals, receiving automated rejections each time. The initial claims invoke disparate impact under the ADEA, Title VII (race), and ADA (disability).
January 2024: Initial Dismissal
Judge Rita F. Lin dismisses the case without prejudice, ruling that Workday does not qualify as an "employment agency" under 42 U.S.C. § 2000e(c). The court finds the complaint lacks specificity on how the AI directly caused rejections, but leaves room for amendment.
February 2024: Amended Complaint and New Theory
Refiling emphasizes Workday's role as an agent of employers, liable for aiding and abetting discrimination (citing Perez v. U.S. Postal Service, 1999). Statistical evidence is bolstered, showing age-based rejection disparities in anonymized datasets.
April 2024: EEOC Amicus Support
The U.S. Equal Employment Opportunity Commission (EEOC) submits a friend-of-the-court brief, warning that AI tools like Workday's could "enable systemic exclusion" of protected classes. The EEOC references its 2023 AI guidance, urging validation testing for bias.
July 2024: Motion to Dismiss Denied
In a pivotal 25-page order, Judge Lin denies Workday's renewed motion. Key excerpt: "Workday's AI does not merely implement employer criteria; it infers and recommends based on proprietary models." This establishes vendor liability in AI contexts.
Quotable Fact: This ruling cited Equal Employment Opportunity Commission v. Catastrophe Management Solutions (2018), affirming that third-party vendors share responsibility for discriminatory outcomes.
May 2025: Conditional Certification Granted
Under the ADEA's collective action provisions (29 U.S.C. § 216(b)), the court certifies a preliminary class of 40+ applicants denied "employment recommendations" via Workday since September 7, 2020 (two years pre-filing, per statute of limitations). Notice protocols are approved for email, website postings, and paid media.
July 2025: Scope Expansion to Include HiredScore
Workday's 2024 acquisition of HiredScore, an AI recruiting firm, is folded into the case. HiredScore's tools, which analyze "talent pools" for bias, ironically face claims of exacerbating age discrimination. This boosts the class size estimate to millions.
February 2026: Opt-In Notice Issued
The court greenlights a nationwide notice campaign, including a dedicated website (workdaylawsuit.com) and toll-free hotline. Potential class members have until March 7, 2026, to opt in electronically or by mail.
Throughout, Workday has appealed interlocutory decisions to the Ninth Circuit, but proceedings continue. As of March 2026, discovery is ongoing, with expert reports on algorithmic bias due by June.
March 2026: Motion to Dismiss Denied — ADEA Covers Applicants {#march-2026-update}
On March 20, 2026, Judge Lin issued the most significant ruling in the case to date: Workday's motion to dismiss on ADEA grounds was denied. Workday had argued that the Age Discrimination in Employment Act (29 U.S.C. § 623) only protects current employees — not external job applicants — and that it therefore could not be held liable for discriminatory screening outcomes against people who never worked at its client companies.
The court rejected this argument, citing EEOC interpretive guidance and prior precedent establishing that the ADEA's prohibition on discriminatory practices in "terms, conditions, or privileges of employment" includes the hiring process itself. The ruling also affirmed Workday's potential liability as an agent of its employer-clients — meaning a vendor that operates the AI screening system can be named as a defendant alongside the employer.
What was dismissed: Several state-law claims and one individual plaintiff's complaint were dismissed on procedural grounds. The core ADEA collective action survives.
Why it matters for employers: The ruling confirms that ADEA liability flows upstream to the AI vendor — not just the employer. Workday's 11,500+ clients who use its AI hiring modules now face a clearer chain of liability. Employers should review vendor contracts for indemnification clauses and document their own oversight of AI screening outcomes. See our AI hiring compliance checklist for the specific steps.
Workday's Defenses: Arguments from the HR Tech Leader
Workday, represented by top firms like Cooley LLP, has mounted a robust defense, filing multiple motions and briefs. Core arguments include:
- Not an Employment Agency: Workday insists it sells SaaS (software as a service), not staffing services, exempting it from Title VII's agency provisions (citing EEOC v. Zippertubing Co., 1980).
- Employer Control Over Decisions: Final hires rest with clients; Workday only provides "recommendations," per its terms of service.
- Lack of Intent: No evidence of disparate treatment; any impact stems from employer data, not Workday's design (invoking Ricci v. DeStefano, 2009).
- Applicants Not Covered: In a January 2026 motion, Workday argued ADEA disparate impact applies only to employees, not external applicants. This argument was rejected by Judge Lin on March 20, 2026 — the ADEA covers job applicants.
Despite these, Judge Lin has rejected key motions, stating in May 2025: "The scale of Workday's platform demands accountability." Workday may pursue decertification post-discovery or settlement talks, but analysts predict a trial in late 2027.
Quotable Fact: Workday's Q4 2025 earnings call acknowledged "enhanced compliance features" in its AI tools, signaling proactive adjustments amid the litigation.
The Political Landscape: Trump's Executive Order and Disparate Impact's Future
The case unfolds against a shifting regulatory backdrop. In April 2025, President Donald Trump's executive order "Restoring Equality of Opportunity and Meritocracy" (EO 14281) directed federal agencies, including the EEOC and Department of Labor, to deprioritize disparate impact in enforcement, focusing instead on intentional discrimination. This aligns with conservative critiques of "affirmative action" in hiring.
However, the EO has limited bite for private litigation:
- No Statutory Change: The ADEA and Title VII remain intact; courts interpret them independently.
- Private Actions Unaffected: Mobley v. Workday is a plaintiff-driven suit, not agency-led.
- State-Level Enforcement Continues: California’s FEHA and New York’s Human Rights Law explicitly cover disparate impact in AI contexts.
- Litigation Boom Expected: With reduced EEOC filings (down 40% in 2025 per agency reports), experts foresee a surge in class actions.
As employment law firm Fisher Phillips observed in a February 2026 alert: "The EO may chill federal probes, but it ignites private bar activity—especially in AI, where biases are hard to detect without disparate impact tools." For global employers, EU AI Act regulations (effective 2026) add cross-border complexity, requiring high-risk AI audits.
Implications for Employers: Why Mobley v. Workday Reshapes AI Compliance
This lawsuit extends far beyond Workday users, influencing any employer deploying AI in talent acquisition. Key takeaways:
1. Vendor Liability Is Real
Courts increasingly view AI providers as "agents," liable for flawed tools (per Judge Lin's ruling). Employers must scrutinize vendor contracts for indemnity clauses covering discrimination claims.
2. Intent Is Irrelevant—Outcomes Matter
Disparate impact shifts focus to results: If 40+ applicants face 25% higher rejection rates (as alleged), defenses like "neutral algorithms" fail without business necessity proof.
Quotable Fact: A 2025 Gartner report estimates 85% of AI hiring tools show undetected bias; Mobley could mandate third-party audits industry-wide.
3. Class Size Won't Shield You
Workday's "logistical impossibility" argument against a multi-million-member class was rejected: widespread harm justifies broad notice, per FLSA precedents.
4. Prepare for Opt-In Waves and Scrutiny
With publicity ramping up—via ads in The Wall Street Journal and LinkedIn—expect inquiries from applicants. Regulators like the FTC may probe under unfair practices doctrines.
For multinational firms, this intersects with state laws (e.g., Illinois' BIPA for AI biometrics) and international standards, demanding holistic compliance strategies.
Actionable Steps: How Employers Can Mitigate AI Discrimination Risks
Proactive compliance is your best defense. Implement these steps immediately:
1. Conduct Vendor Audits
Demand bias impact assessments from all AI providers. Key questions: Have you tested for ADEA violations? What disparate impact metrics do you track? Non-responsive vendors warrant reevaluation.
2. Mandate Human-in-the-Loop Oversight
Configure AI to flag, not finalize, rejections for protected groups. Train recruiters (e.g., via SHRM-certified programs) to review 100% of 40+ candidate scores.
3. Perform Ongoing Disparate Impact Analysis
Use tools like EmployArmor's AI Compliance Scanner to analyze hiring data quarterly. Threshold: under the EEOC's "four-fifths rule," if the selection rate for applicants 40+ falls below 80% of the rate for the highest-selected group, investigate and remediate.
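As a sketch of that quarterly analysis, the four-fifths check can be computed from aggregated selection counts. The function and the sample numbers below are illustrative assumptions, not figures from the case record:

```python
def four_fifths_check(selected_40plus: int, total_40plus: int,
                      selected_under40: int, total_under40: int):
    """EEOC four-fifths rule: flag potential adverse impact when one
    group's selection rate falls below 80% of the other group's rate."""
    rate_a = selected_40plus / total_40plus
    rate_b = selected_under40 / total_under40
    ratio = min(rate_a, rate_b) / max(rate_a, rate_b)
    return ratio, ratio < 0.8  # True means: investigate and remediate

# Hypothetical quarter: 90 of 600 applicants aged 40+ advanced (15%),
# versus 250 of 1,000 younger applicants (25%).
ratio, flagged = four_fifths_check(90, 600, 250, 1000)
print(round(ratio, 2), flagged)  # 0.6 True
```

A ratio of 0.6 is well under the 0.8 threshold, so this hypothetical quarter would trigger a remediation review; a passing ratio alone does not prove compliance, since the four-fifths rule is a screening heuristic, not a statistical significance test.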
4. Document Job-Relatedness
For every AI criterion (e.g., "recent tech experience"), prove it's essential via validation studies. Retain records for 3+ years to defend against audits.
5. Build Robust AI Governance Frameworks
Adopt NIST's AI Risk Management Framework: Establish ethics committees, annual audits, and employee training. Integrate with DEI initiatives to address intersectional biases.
Quotable Fact: Companies with mature AI governance report 60% fewer compliance incidents, per Deloitte's 2026 HR Tech Survey.
By acting now, employers can transform Mobley v. Workday from a threat into an opportunity for equitable, efficient hiring.
Frequently Asked Questions (FAQs) About the Workday Age Discrimination Lawsuit
Can I Join the Mobley v. Workday Lawsuit?
If you are 40 or older, applied for jobs through a Workday-powered portal since September 2020, and were rejected (especially without an interview), you may have qualified as a class member. Contact the class counsel at Wiggins Childs Pantazis Fisher & Goldfarb PC via workdaylawsuit.com or 1-800-CLASS-ACTION. The opt-in deadline was March 7, 2026, and has now passed; missing it bars participation in this collective action.
How Do I Know If My Company Uses Workday?
Check application URLs for "workday.com" or "wd5.myworkdayjobs.com." Workday powers hiring for sectors like tech (e.g., Salesforce), retail (e.g., Gap Inc.), and finance (e.g., Bank of America). Ask your HR team or review vendor lists.
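For a programmatic spot check, a short script can test whether an application link points at a Workday-hosted portal. This is a minimal sketch; the domain list is an assumption based on the URL patterns mentioned above, not an official Workday registry:

```python
from urllib.parse import urlparse

# Domains commonly seen in Workday-hosted application links (illustrative list)
WORKDAY_DOMAINS = ("workday.com", "myworkdayjobs.com")

def is_workday_portal(url: str) -> bool:
    """Return True if the URL's host is, or is a subdomain of, a Workday domain."""
    host = urlparse(url).netloc.lower()
    return any(host == d or host.endswith("." + d) for d in WORKDAY_DOMAINS)

print(is_workday_portal("https://acme.wd5.myworkdayjobs.com/en-US/careers"))  # True
print(is_workday_portal("https://jobs.example.com/apply"))                    # False
```

Matching on the full host (exact domain or subdomain) avoids false positives from URLs that merely mention "workday" in a path or query string.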
What If My Company Uses Other AI Hiring Tools, Like Eightfold or LinkedIn Recruiter?
The principles apply universally. Mobley cites precedents such as EEOC v. iTutorGroup (settled 2023), in which automated application software screened out older candidates. Audit all tools for disparate impact, regardless of provider.
Does the 2025 Executive Order Eliminate Disparate Impact Claims?
No. EO 14281 restricts federal agencies but preserves private rights under the ADEA and Title VII. State attorneys general (e.g., California's AG) and plaintiffs' firms remain active, potentially increasing lawsuits.
Is Workday the Only AI Vendor Facing Discrimination Suits?
Far from it. Related cases include:
- ACLU v. Aon (2025, alleging race bias in assessments).
- Doe v. Eightfold AI (2026 FCRA class action over data scraping).
- EEOC v. HireVue (ongoing, video AI facial recognition claims).
Over 50 AI-related employment suits were filed in 2025, per Stanford's AI Index.
What Are the Potential Damages in This Case?
Opt-in plaintiffs seek back pay, front pay, compensatory damages, and punitive awards. If certified fully, settlements could exceed $500 million, based on similar ADEA actions like Safeco Ins. Co. v. Hayes (2007).
How Can Employers Avoid Similar Lawsuits?
Focus on transparency: Validate AI with diverse datasets, monitor outcomes, and provide appeal processes. Tools like EmployArmor's platform can automate 80% of compliance checks.
For more, download our free guide: AI Hiring Compliance Checklist 2026.
The Bottom Line: AI Hiring Must Evolve to Meet Anti-Discrimination Standards
Mobley v. Workday isn't an isolated skirmish—it's the vanguard of AI accountability in employment law. By holding vendors like Workday to ADEA standards, the courts affirm that algorithms aren't exempt from equity requirements.
The verdict so far? Yes, AI must comply—or face the consequences.
Forward-thinking employers will audit, govern, and oversee their tools, turning potential liabilities into strengths. Those who ignore the signals risk joining the growing list of defendants.
Assess Your AI Risks Today
Discover hidden biases in your hiring process with EmployArmor's free scan. Get compliant before the next deadline hits.
Get Your Free Compliance Score →
Related Resources
- Eightfold AI Class Action: What the 1 Billion Worker Data Scrape Means for Employers
- 2026 AI Hiring Laws Are Here: What Changed and What You Need to Do Now
- AI Hiring Lawsuits & Legal Cases
Legal Disclaimer: This article provides general information and is not legal advice. Consult an employment attorney for personalized guidance. EmployArmor is not affiliated with Workday, Inc., or the Mobley v. Workday litigation.