Seyfarth Synopsis: In Mobley v. Workday, the EEOC filed an amicus brief supporting a class-action plaintiff's theory that a Human Resources software company could be directly liable for employment discrimination allegedly caused by the vendor's artificial intelligence tool, under theories that the vendor was acting as an employment agency, indirect employer, or agent of the employer. Employers, HR vendors and service providers, and AI developers should take note: even though the allegations in the amended complaint may not reflect the reality of how most employers are using AI in hiring, the EEOC's position is likely to embolden plaintiffs to pursue similar claims. The proposed amicus filing is part of the EEOC's ongoing focus on artificial intelligence issues in the workplace, which aligns with broader federal efforts to regulate AI.
The EEOC took a novel position in support of a class-action plaintiff's theory that an AI vendor could be directly liable under Title VII, the ADA, or the ADEA for employment discrimination caused by the use of the vendor's AI. In its proposed amicus brief, filed on April 9, 2024, in the Mobley v. Workday proposed class action pending in the Northern District of California, the EEOC argues that if the allegations in Mobley's First Amended Complaint are true, then the plaintiff has articulated valid theories of direct AI vendor liability.
Since their inception, Title VII, the ADA, and the ADEA have provided for direct liability for employment agencies; the complaint in Mobley v. Workday, however, charts new territory in attempting to apply the definition of "employment agency" to AI vendors providing resume-screening services, and alternatively asserts that AI vendors bear direct liability under indirect-employer or agency theories. The EEOC's support for applying these theories to AI vendors is significant for employers and HR service providers, even though the conclusory allegations in the Mobley complaint may not reflect the reality of how most employers are using AI tools in the hiring process.
Among other things, Mobley's First Amended Complaint alleges that employers "delegate their HR function" to Workday and its algorithms. The EEOC argues that by actively making automated decisions to reject or advance candidates before referring them to employers, Workday functions as an employment agency under the law. In its amicus submission, the EEOC draws an analogy to IRS rules stating that tax preparation software can be considered a tax preparer if it does more than provide "mere mechanical assistance." According to the EEOC, the same is true of algorithmic tools that cross that threshold in the employment context.
The allegations in Mobley's First Amended Complaint may sound far afield from how employers are actually using AI tools in hiring, because most vendors do not characterize their tools as making fully automated hiring decisions. Rather, the employer's recruiters and HR personnel typically exercise significant judgment in deciding what to do with AI-generated recommendations, treating them as aids to human decision-makers rather than replacements for them.
Nevertheless, both employers using AI and their vendors should be aware that the EEOC's position signals its willingness to entertain theories of direct vendor liability in its investigations and in potential EEOC-initiated litigation. It also invites plaintiffs' counsel to consider these theories when identifying potential class-action targets. Even if an employer believes that its use of AI in hiring is nothing like what is alleged in the Mobley Amended Complaint, employers and AI vendors should continue to proactively assess and monitor their AI-driven hiring processes, understanding that the degree of human involvement will likely face increased scrutiny in light of the EEOC's position.
Mobley v. Workday: Allegations and Litigation Status
In Mobley, the plaintiff alleges that Workday engaged in a pattern or practice of discriminatory job screening that disproportionately disqualifies African-Americans, individuals over 40, and individuals with disabilities from securing employment, in violation of Title VII, Section 1981, the ADEA, the ADA, and California state law. Specifically, the plaintiff alleges that Workday's AI and algorithms are more likely to reject job applicants who are African-American, over 40, or have a disability, and asserts that Workday acted as an employment agency or, in the alternative, as an indirect employer or agent of the employer. Mobley seeks class certification.
The original complaint was filed on February 21, 2023, and on January 19, 2024, Judge Rita Lin granted Workday's motion to dismiss, holding that Mobley had not pled sufficient facts to support his theories that Workday was an indirect employer or agent, although Judge Lin's opinion observed that "it appear[ed] that Mobley could potentially amend the complaint to do so." Mobley filed a First Amended Complaint on February 20, 2024, and Workday's motion to dismiss the First Amended Complaint is now fully briefed.
On April 9, 2024, the EEOC filed its proposed amicus brief in opposition to Workday's motion to dismiss, although the EEOC took no position on the accuracy of the factual allegations in the Amended Complaint, including the descriptions of the services Workday allegedly provides. Media reports confirm that the EEOC's amicus filing was approved on a 3-2 party-line vote, with the Commission's three Democrats voting to approve and its two Republicans voting to disapprove.
On April 23, 2024, Workday opposed the EEOC's amicus filing, arguing that the EEOC's brief is "inappropriately partisan" and that it "advocates for Plaintiff in place of Plaintiff's counsel," which, Workday argues, is not the appropriate purpose of an amicus brief. Workday's opposition pointed out that the EEOC guidance and other information cited in the proposed amicus brief had also been cited by the parties, and that the parties, represented by experienced counsel, were able to address "the relevant legal issues in their briefing" without the EEOC's assistance as amicus.
The EEOC's Focus on AI in Hiring Will Continue
The EEOC's amicus filing in Mobley continues its focus on AI issues. Artificial intelligence features prominently in the EEOC's strategic enforcement plan for fiscal years 2024-2028, in the category "Eliminating Barriers in Recruitment and Hiring." The EEOC first announced its AI initiative in October 2021 and has since issued two technical assistance documents emphasizing the applicability of the Americans With Disabilities Act and Title VII to the use of AI. Against this backdrop, employers and AI vendors should be mindful that any charge of discrimination filed with the EEOC that mentions the use of artificial intelligence, or any other technology in hiring, will likely qualify for priority handling and receive additional scrutiny from EEOC management.1
Within the federal government, the EEOC is not alone in its focus on AI issues. In April 2023, the EEOC joined other federal civil-rights enforcement agencies in affirming that existing legal authorities apply "to the use of automated systems and innovative new technologies just as they apply to other practices" and that the agencies would use their existing statutory authorities accordingly. In April 2024, nine federal agencies, including the EEOC, reiterated this commitment.
In October 2023, President Biden signed a comprehensive executive order setting forth a "whole of government" approach to AI regulation and enforcement. On the labor and employment front, the executive order directed the Secretary of Labor to develop and issue "principles and best practices for employers that could be used to mitigate AI's potential harms to employees' well-being and maximize its potential benefits." We expect that the Department of Labor will soon issue a "broader value-based document" containing "principles and best practices" for both employers using AI and developers of AI tools. Pursuant to the executive order, OFCCP is also working on a "promising practices" document regarding AI selection tools.
Implications for Employers and HR Service Providers
The rapidly evolving AI regulatory environment extends far beyond the EEOC's recent actions and federal law. In addition to advocating for the expanded application of existing laws to the use of AI, as the EEOC's proposed amicus brief demonstrates, federal, state, and local regulators are continuing their efforts in this area. The use of AI in employment decisions will also face increased scrutiny from plaintiffs' attorneys seeking to test novel theories of liability.
Navigating this complex web of federal, state, and local laws and regulations, not to mention international and industry efforts, will require organizations to remain vigilant, proactive, and well-informed about the evolving legal landscape and emerging consensus surrounding the use of AI in the workplace.
We will continue to monitor these developing issues.
Footnote
1. For example, in August 2023, the EEOC entered into a settlement agreement with iTutorGroup, which many media reports and commentators characterized as the EEOC's "first ever" case involving artificial intelligence discrimination in hiring. Although the underlying technology described in the EEOC's complaint had nothing to do with AI, the case fit squarely within the EEOC's focus on addressing "algorithmic" discrimination.