Somewhere in a database, there may be a score between 0 and 5 attached to your name. An algorithm generated it. It was built from your social media activity, your location history, your device and browsing data — none of which you submitted on any job application. And it may have decided whether a human being ever saw your resume.
Two landmark lawsuits filed in early 2026 just ripped the cover off how AI hiring actually works. If you’ve applied to jobs at Microsoft, PayPal, Morgan Stanley, Starbucks, Chevron, or through any Workday-powered employer in the past six years, you may have been screened by a black-box scoring system you never knew existed, under consumer protection laws you were never told applied to you.
Here’s the quick version. The Workday collective action — an age discrimination claim under the ADEA covering applicants 40 and older — had an opt-in deadline of March 7, 2026. That window just closed. But the Eightfold AI class action is still open and arguably more consequential: it alleges that Eightfold secretly built credit-report-style dossiers on over one billion workers and sold those scores to major employers without the disclosures required by the Fair Credit Reporting Act.
Your options aren’t gone. But to use them, you need to understand what these lawsuits actually proved. Here’s everything the employer-focused legal coverage left out — what happened, which companies are involved, what rights you have right now, and exactly what to do next.
The Workday AI Lawsuit Explained: Two Cases, One Broken System
Let’s start with the facts, because the details matter here.
The Eightfold AI class action was filed January 20, 2026, in the Northern District of California (Case No. C26-00214). It was brought by former EEOC chair Jenny R. Yang and the nonprofit Towards Justice — advocates who specialize in employment rights, not tech PR. The core allegation: Eightfold scraped data on over one billion workers from social media, location data, device and internet activity, and cookies, then ranked each person on a 0–5 “likelihood of success” score. None of this was disclosed. No authorization was obtained. No candidate could see their score, know it existed, or dispute it. (Outten & Golden, Fortune, January 2026)
Companies named in the Eightfold complaint or publicly confirmed as customers include: Microsoft, Morgan Stanley, Starbucks, BNY, PayPal, Chevron, Bayer, Vodafone, Coca-Cola Europacific Partners, EY, Amdocs, and Salesforce. One-third of Eightfold’s customers are Fortune 500 companies.
The Workday collective action (Mobley v. Workday) has a different legal theory. Lead plaintiff Derek Mobley — Black, over 40, and disabled — was rejected from over 100 jobs despite being qualified. The suit alleges Workday’s AI screening systematically deprioritized applicants based on age, race, and disability, in violation of the ADEA, Title VII, and the ADA. Workday’s own prior disclosure gave the case its starkest data point: 1.1 billion applications rejected by its software. (FairNow, National Law Review)
One critical distinction: the Eightfold lawsuit isn’t primarily a discrimination claim. It’s a consumer protection claim. The argument is that the algorithm operated in secret, in violation of rights that already existed under federal law. That legal framing is important — it’s actually a more tractable theory than proving statistical disparate impact, and it opens the door to statutory damages for every affected candidate.
Notice the article nobody ran. Every major legal outlet published a “6 things employers should do” piece after these filings. Not one published “6 things candidates should do.” That absence is its own indictment of whose interests the legal and HR media ecosystem serves.
Why This Matters for You Personally
This isn’t an abstract legal story. Let’s be concrete about who is likely affected.
If you applied to any named Eightfold customer — Microsoft, PayPal, Starbucks, Morgan Stanley, Chevron, Bayer, BNY, Vodafone, EY, Coca-Cola Europacific Partners, Amdocs, Salesforce — there is a reasonable probability that a 0–5 score was generated for you from data you didn’t submit and didn’t know was being collected. The algorithm pulled from your public social media, your location history, and your browser and device activity. It built a profile of you as a job candidate. You never knew it existed. The recruiter processing your application may not have known how it was built either.
If you’re over 40 and applied to any Workday-powered employer since September 2020, Workday’s HiredScore AI feature may have deprioritized your application. HiredScore is embedded in thousands of enterprise Workday deployments — if the company’s application portal has “Workday” in the URL, this potentially applies to you.
The human cost gets lost when media covers these stories as “liability risks.” A Monster survey of over 1,000 job seekers found that 77% already worried their resume was being filtered before reaching a human. (AI Recruiting 2025 Year Review) These lawsuits confirmed those suspicions were correct. People weren’t being paranoid. They were accurately perceiving a system that had been built to work exactly that way.
Here’s something that rarely gets said: the recruiter who processed your application was probably deceived too. The person who marked your application as “not moving forward” may have believed sincerely that their system was neutral. The deception ran both directions. Eightfold and Workday sold “efficiency” to HR departments. The shadow infrastructure — the scoring, the data scraping, the dossiers — was buried in vendor contracts that no one read and no candidate ever saw. This isn’t a story about bad-faith hiring managers. It’s a story about AI vendors who built something neither side of the hiring table fully understood.
Your FCRA Rights After an AI Hiring Rejection
The Fair Credit Reporting Act has been federal law since 1970. It regulates “consumer reporting agencies” — companies that compile and sell information about people used in decisions about employment, credit, housing, and insurance. The Eightfold lawsuit argues, persuasively, that Eightfold is exactly that.
The CFPB clarified this directly in Circular 2024-06, issued September 2024: algorithmic hiring scores and background dossiers qualify as FCRA consumer reports. (Consumer Financial Protection Bureau) Eightfold was already operating in violation of existing law — the CFPB just made that explicit in writing. The fact that a federal regulator had to issue a circular in 2024 simply to clarify what has been illegal since the 1970s tells you how brazenly these vendors operated.
Under FCRA, if a company used a consumer report to reject you, they must: provide a copy of the report, give you a summary of your FCRA rights, and identify the company that produced the report. If you were rejected by a named Eightfold employer and never received any of that, that is itself a potential FCRA violation. Statutory damages run $100–$1,000 per willful violation with class action mechanisms available. (Epstein Becker Green Workforce Bulletin)
Here are your concrete steps:
- Step 1: Send a written FCRA disclosure request to Eightfold AI requesting all consumer reports, scores, and data files they hold about you. Address it to their registered agent in California.
- Step 2: If you applied to a named Eightfold employer and were rejected, send a written FCRA adverse-action inquiry to both the employer and Eightfold asking whether a consumer report was used in the hiring decision.
- Step 3: If you’re 40+ and applied via Workday since September 24, 2020, visit workdaycase.com for ongoing case developments — the ADEA opt-in deadline has passed, but separate claims for race or disability discrimination may still be viable.
- If you’re in New York City: NYC Local Law 144 requires employers using Automated Employment Decision Tools to notify candidates and conduct annual public bias audits. If you applied to a New York employer using Eightfold or Workday AI without receiving notice, a separate local claim may exist.
Consult an employment attorney for advice specific to your situation before filing anything.
Our Take: This Is What a Broken Hiring System Looks Like
The mainstream coverage framed these lawsuits as “liability risks for employers.” That framing is accurate and completely inadequate.
These lawsuits aren’t exposing edge cases or software bugs. They’re exposing features of an automation model that treats candidates as data to be processed and employers as paying customers — not as participants in a hiring ecosystem that should serve human interests.
Eightfold wasn’t just filtering resumes. It was operating a shadow credit bureau for employment decisions, drawing on data candidates never provided, building dossiers neither the candidate nor the recruiter could see, and charging companies for access. The parallel to the discredited pre-FCRA credit reporting industry isn’t accidental. The FCRA was written to stop exactly this — a world where opaque private companies compile secret files on people and sell those files to affect their economic lives without oversight or recourse. It took fifty years, but that industry is back in a new interface.
Workday’s algorithm was trained on historical hiring data. Which means it was trained to replicate the biases of every hiring decision made before its training cutoff. Scaling a biased process with AI doesn’t neutralize the bias. It entrenches it across 1.1 billion decisions, making it faster, cheaper, and harder to challenge.
Derek Mobley was rejected from over 100 jobs. He is one named plaintiff. Behind each of those 1.1 billion rejected Workday applications is a person who needed work and got a machine decision dressed up as a fair evaluation.
What comes next, in our view: more litigation using the FCRA framework (it’s a more tractable legal theory than proving AI disparate impact), regulatory pressure on AI vendors to disclose data sources and model training methodology, and state-level legislation expanding on NYC’s AEDT law.
And to the honest recruiter reading this: if your ATS uses AI screening you don’t fully understand, you are not protecting your company from bias. You’re outsourcing it and adding a layer of legal exposure your vendor won’t share when the lawsuit arrives.
What to Do Before Your Next Job Application
You can’t opt out of AI hiring entirely. But you can know what system you’re walking into before you submit your resume.
Research before you apply. Search “[company name] + Eightfold AI” and “[company name] + talent intelligence platform.” Corporate press releases and LinkedIn posts about “talent transformation” often name the vendor directly. If a company you’re applying to is an Eightfold customer, you now know your application may generate a dossier.
Check for Workday. If the application portal has “Workday” in the URL or on the login page, the employer may have HiredScore AI enabled. This doesn’t mean avoid — it means know.
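The URL check above can be automated if you keep a list of jobs you’re considering. The sketch below is a simple heuristic, not an official detection method: it assumes the common pattern that Workday-hosted application portals live on `myworkday.com` or `myworkdayjobs.com` subdomains, which will miss employers that proxy Workday behind their own domain.

```python
from urllib.parse import urlparse

# Common Workday hosting domains (an assumption based on the typical
# tenant URL pattern, e.g. company.wd5.myworkdayjobs.com — not exhaustive).
WORKDAY_DOMAINS = ("myworkday.com", "myworkdayjobs.com")

def likely_workday_portal(url: str) -> bool:
    """Heuristic: does this application URL look Workday-hosted?"""
    host = urlparse(url).netloc.lower()
    return host.endswith(WORKDAY_DOMAINS)

# A typical Workday-hosted careers page vs. a self-hosted one
print(likely_workday_portal("https://acme.wd5.myworkdayjobs.com/careers"))  # True
print(likely_workday_portal("https://jobs.example.com/apply"))              # False
```

A `False` result doesn’t clear the employer — it only means the portal isn’t obviously Workday-hosted. The login page and page footer are worth checking too.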
Request your data now. Send a written FCRA disclosure request to Eightfold AI asking for all data files, scores, and consumer reports generated about you from past applications. You have the legal right to this information.
Optimize for algorithmic review. Use standard single-column resume formatting — no tables, text boxes, graphics, or multi-column layouts. Include the exact job title from the posting in your resume header. Use keywords from the job description in context, not stuffed. These aren’t tricks; they’re basics that poorly configured parsers routinely fail on. Reports from r/recruitinghell describe algorithms that “misinterpret specialized jargon or overlook unconventional candidates who don’t match exact keyword profiles” — a standard-formatted resume helps you clear that hurdle before the human review you’re aiming for. (AI Recruiting 2025 Year Review)
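To see why exact keywords matter, here is a toy model of naive keyword screening — a sketch for intuition only, assuming nothing about any vendor’s actual implementation, which is far more complex than this:

```python
import re

def keyword_overlap(resume_text: str, job_description: str) -> float:
    """Toy model: what fraction of the job description's distinct
    words also appear in the resume? Illustrates why in-context
    keywords from the posting help clear a crude parser."""
    def tokenize(s: str) -> set[str]:
        return set(re.findall(r"[a-z]+", s.lower()))
    jd_words = tokenize(job_description)
    if not jd_words:
        return 0.0
    return len(jd_words & tokenize(resume_text)) / len(jd_words)

# Hypothetical posting and resume, for illustration
jd = "Senior Data Engineer building ETL pipelines in Python and SQL"
resume = "Data engineer experienced with Python ETL pipelines and SQL"
print(round(keyword_overlap(resume, jd), 2))  # 0.7
```

A real screening system weights phrases, titles, and synonyms rather than counting raw words — but the failure mode is the same: jargon the parser doesn’t recognize scores as absence, which is exactly the complaint quoted above.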
Know the red flags for algorithmic-first processes:
- AI video interviews (HireVue, Spark Hire) — r/Futurology users described “floating robot avatars, invasive 360-degree room scans to detect ‘cheating,’ and synthetic voices inserting fake ‘umms’ and ‘ahhs’ to mimic humanity,” with 17,614 upvotes on a thread about refusing to engage with them at all
- One-way video assessments with no human contact
- Rejection within minutes of submitting (a response that fast means no human read your application)
- Automated scheduling-only communication from first contact through offer
None of these signals alone proves algorithmic decision-making. Together, they tell you whether a human is actually in the loop.
For navigating what does exist: AI-powered interview prep tools can help you practice for the screenings you can’t avoid. And if you’re evaluating which companies use AI video interview tools like HireVue, knowing your rights before you start is just as important as preparing for the questions. You should also have access to salary benchmarking tools so that if you do make it through to an offer, you’re negotiating from accurate data rather than a company’s internal algorithm.
Frequently Asked Questions
Was I secretly scored by an AI before a human ever saw my resume — and how would I know?
Almost certainly yes if you applied to any named Eightfold customer (Microsoft, PayPal, Morgan Stanley, Starbucks, Chevron, Bayer, BNY, Vodafone, EY, Coca-Cola Europacific Partners, Amdocs, Salesforce) or through any Workday-powered employer with HiredScore enabled. You would not have known — that’s the core allegation. Your FCRA rights allow you to send a written disclosure request to Eightfold AI asking for all data and scores they hold about you. The fact that you have to ask them, rather than being told proactively, is the problem these lawsuits are trying to fix.
Is it too late to join the Workday age discrimination lawsuit if I’m over 40?
The ADEA collective action opt-in deadline was March 7, 2026 — that specific window has closed. The case continues, and the legal precedent it sets affects every AI hiring platform. Check workdaycase.com for updates. Separate claims for race or disability discrimination may still be viable. Consult an employment attorney for your specific situation before drawing any conclusions.
What are my FCRA rights if an AI screening tool was used to evaluate my job application?
Under FCRA — reinforced by CFPB Circular 2024-06 — if an AI system qualifies as a consumer reporting agency, you have the right to know a report was used, receive a copy if adverse action was taken, and dispute inaccuracies. If you were rejected and never received an adverse action notice identifying the reporting agency, that is itself a potential violation. Statutory damages run $100–$1,000 per willful violation, with class action mechanisms available. Send a written request to Eightfold AI for any consumer file they hold on you.
Which companies use Eightfold AI or Workday AI to screen candidates?
Eightfold companies named in the complaint or publicly confirmed as customers: Microsoft, Morgan Stanley, Starbucks, BNY, PayPal, Chevron, Bayer, Vodafone, Coca-Cola Europacific Partners, EY, Amdocs, Salesforce. One-third of Eightfold customers are Fortune 500 companies. Workday’s HiredScore is available to any enterprise on Workday’s HR platform — thousands of companies use it. Check whether a company’s application portal uses Workday by looking at the URL or login page before you apply.
What can I actually do today if I believe AI unfairly rejected my job application?
Three concrete steps: (1) Send a written FCRA disclosure request to Eightfold AI requesting all data and scores they hold about you. (2) If you’re 40+ and applied via Workday since September 2020, visit workdaycase.com for ongoing case updates. (3) If you’re in NYC and weren’t notified that an automated decision tool was used, NYC Local Law 144 may give you a separate local claim. Consult an employment attorney for advice specific to your situation.
You Deserved a Fair Shot
The Workday and Eightfold lawsuits didn’t reveal a flaw in AI hiring. They revealed that AI hiring was built this way — opaque scoring, secret dossiers, historical bias baked in at scale, and no disclosure to either the candidate or the recruiter who thought they understood the tools they were using.
Start by sending a written FCRA disclosure request to Eightfold AI to find out exactly what data they hold about you. Before your next application, search “[company name] + Eightfold AI” to know what system you’re walking into. And if you’re in New York City, you already have the right to ask — use it.
You deserved a fair shot at every job you applied for. Now you know who was standing in the way, and what you can do about it.