Mobley v. Workday, Inc., N.D. Cal. Case No. 23-cv-00770-RFL (May 2025)
On May 16, 2025, the Hon. Rita Lin of the Northern District of California granted workers' motion for preliminary certification of a collective action against Workday, Inc., a provider of AI-powered human resources software. Plaintiffs' theories of liability include claims under the Age Discrimination in Employment Act (ADEA) and the Fair Labor Standards Act (FLSA).
In Mobley v. Workday, Inc., N.D. Cal. Case No. 23-cv-00770-RFL, plaintiffs challenge Workday's software, alleging a policy or practice of job screening that "disproportionately disqualifies individuals over the age of forty (40) from securing gainful employment." The lawsuit alleges both collective and class action claims. Plaintiffs define the collective to include job applicants who were "denied employment recommendations" by AI. The plaintiffs allege that Workday uses "'artificial intelligence,' 'machine learning,' assessments, tests, and pymetrics to decide who to recommend for jobs and who to reject."1 Plaintiffs further allege that Workday's customers are employers who have delegated traditional hiring functions, including rejecting applicants, to Workday's algorithmic decision-making tools. The lead plaintiffs assert that they and the opt-in members are similarly situated because Workday's algorithmic decision-making tools denied them employment. The lawsuit further claims that Workday aided and abetted employers in engaging in unlawful discrimination. According to the lawsuit, "Workday's software is not simply implementing in a rote way the criteria that employers set forth, but is instead participating in the decision-making process by recommending some candidates to move forward and rejecting others" (Order Granting in Part and Denying in Part Motion to Dismiss).
Specifically, the lead plaintiff in Mobley claims he was denied employment on every one of the 100-plus applications he submitted through Workday's platform. On behalf of a collective group, Mr. Mobley alleges that Workday has a "unified" policy whereby its AI recommendation system, which scores, sorts, ranks, or screens applicants, has a disparate impact on applicants over forty (40), in violation of the ADEA and FLSA. In ruling in favor of the employees at the preliminary stage, Judge Lin stated: "If the collective is in the 'hundreds of millions' of people, as Workday speculates, that is because Workday has been plausibly accused of discriminating against a broad swath of applicants. Allegedly widespread discrimination is not a basis for denying notice." Discovery is now moving forward, and the class certification motion on the plaintiffs' other claims is calendared to be heard in 2026.
The Mobley-style legal theory based on AI has been used before. In late 2023, the EEOC settled its first AI hiring discrimination lawsuit against iTutorGroup, Inc. (Equal Employment Opportunity Commission v. iTutorGroup, Inc., E.D.N.Y. Case No. 1:22-cv-2565-PKC-PKY). Based on the ADEA, the lawsuit alleged that iTutorGroup engaged in age discrimination because the AI hiring program it used "automatically reject[ed] female applicants age 55 or older and male applicants age 60 or older."
Class Action or Collective Action: Should I Care?
The two procedural devices are similar, and businesses should care about both: the risk is multiplied to potentially cover every applicant, or every employee, whose employment opportunities were negatively impacted by the use of AI. Collective actions under the ADEA and FLSA use the same certification mechanism. At the early conditional certification stage, the plaintiff's burden is modest. The first step involves an analysis of the plaintiff's allegations with minimal or no discovery, where plaintiffs must simply show that the putative collective members are "similarly situated." After the close of discovery, a second step allows the defendant to file a motion for decertification. But after initial certification, courts typically order the employer to (i) produce to plaintiffs' counsel a list of all putative members of the collective action with their contact information, and (ii) provide notice to be disseminated to all putative collective members. As further procedural background, putative members under the ADEA/FLSA collective-action process must "opt in" to the action, unlike the "opt out" process for class actions under FRCP Rule 23. Collective-action settlements also face less stringent approval requirements than Rule 23 class settlements, but they typically still require court supervision and approval.
What Should Employers Do?
If an AI software provider is ultimately held liable for software-initiated employment decisions, so too may be the employers who use the software. Thus, employers should: (a) thoroughly vet their AI human resources software vendors, and ask questions about compliance with anti-discrimination laws; (b) read the software licensing agreements closely, including any indemnity and duty-to-defend provisions; (c) review relevant insurance policies to evaluate whether they cover claims arising out of the use of artificial intelligence; and (d) educate HR and IT teams on the potential for AI discrimination, including the need to spot-check for potential adverse impact and other discrimination claims.
Footnote
1. Pymetrics is a platform that uses neuroscience-based games to help companies with hiring and talent management.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.