ARTICLE
22 September 2023

EEOC Settles Over Recruiting Software In Possible First Ever AI-related Case

Akin Gump Strauss Hauer & Feld LLP


On September 8, 2023, a federal court approved a consent decree between the Equal Employment Opportunity Commission (EEOC) and iTutorGroup Inc. and its affiliates ("iTutor") resolving alleged age discrimination in hiring that stemmed from automated screening in the company's recruiting software. Coming on the heels of the EEOC's announcement of its artificial intelligence (AI) guidance initiative, the case is being called the agency's first ever AI-based antidiscrimination settlement.1 While it is not clear what, if any, AI tools iTutor used for recruiting, one thing is certain: we will soon see many more lawsuits involving employers' use of algorithms and automated systems, including AI, in recruitment and hiring.2

In the lawsuit, the EEOC claimed that the Shanghai, China-based English-language tutoring provider used software programmed to automatically reject female candidates over the age of 55 and male candidates over the age of 60 for tutoring roles, in violation of the Age Discrimination in Employment Act (ADEA). According to the agency, the EEOC filed the case in May 2022 after iTutor failed to hire Charging Party Wendy Picus and more than 200 other applicants aged 55 and older, allegedly because of their age.3 The case is also notable because iTutor treats its tutors as independent contractors rather than employees, and the ADEA protects only employees. Nonetheless, under the consent decree filed with the U.S. District Court for the Eastern District of New York on August 9, 2023, iTutor will pay $365,000 to more than 200 job candidates who were automatically screened out by its recruiting software, resolving the EEOC's claims.4
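To make the allegation more concrete, the following is a minimal Python sketch of the kind of hard-coded screening rule the EEOC described. It is purely illustrative: the actual iTutor code has not been made public, and the field names, data shapes and threshold logic here are assumptions.

    from datetime import date

    def age_from_birthdate(birthdate: date, today: date | None = None) -> int:
        """Compute an applicant's age in whole years (illustrative helper)."""
        today = today or date.today()
        return today.year - birthdate.year - (
            (today.month, today.day) < (birthdate.month, birthdate.day)
        )

    def auto_screen(applicant: dict) -> str:
        """Return 'reject' or 'advance' using the age/sex cutoffs the EEOC alleged."""
        age = age_from_birthdate(applicant["birthdate"])
        if applicant["sex"] == "female" and age > 55:
            return "reject"  # alleged cutoff for female applicants
        if applicant["sex"] == "male" and age > 60:
            return "reject"  # alleged cutoff for male applicants
        return "advance"

    print(auto_screen({"sex": "female", "birthdate": date(1965, 3, 1)}))  # reject

A deterministic rule of this kind need not involve machine learning or generative AI at all, yet it produces exactly the disparate treatment the ADEA prohibits, which is a reminder that the EEOC's guidance treats automated systems and AI together (see footnotes 1 and 2).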

In addition to monetary relief, iTutor must invite applicants who were rejected because of their age to reapply, report to the EEOC on which of those applicants were considered, provide the outcome of each application and give a detailed explanation whenever an offer is not made.5

Antidiscrimination Training as Part of Settlement

The consent decree also imposes a number of "injunctive relief" requirements on iTutor if or when the company resumes hiring, which remain in effect for the longer of five years or three years from the date hiring resumes, including:

  • Prohibiting iTutor from requesting birth dates from applicants or screening based on age, aside from confirming that applicants are over 18 as required by existing laws.6
  • Distributing a memo about federal antidiscrimination laws to all employees and independent contractors involved in the selection process, and posting or distributing the memo to all applicants as well.
  • Updating and distributing its antidiscrimination policies and complaint procedures applicable to screening, hiring and supervision.
  • Training supervisors, managers and other employees or contractors involved in the screening and selection process on the company's obligations under federal antidiscrimination laws using an EEOC-approved third party.7
  • Fulfilling pre-training notification and post-training reporting requirements.
  • Monitoring and reporting, including providing written notice to the EEOC of any discrimination complaints from employees or applicants, beginning on the day iTutor resumes considering applicants and every six months thereafter.8

Using AI in Hiring

The lack of a comprehensive AI law in the United States does not mean the AI space is unregulated. Agencies including the EEOC, the Department of Justice (DOJ) and the Federal Trade Commission (FTC) have released statements on their intent to tackle problems stemming from AI in their respective domains. After a delay, New York City's new law governing the use of AI in employment decisions took effect this July.

The proliferation of AI in recruiting and hiring means that many employers will find themselves on the frontlines of important compliance questions from the EEOC. With more legal actions and settlements on the way, employers will need a strategy for proper use of AI tools in candidate selection. While this case might not have involved AI decision making, both the EEOC and FTC have maintained that employers may be responsible for decisions made by their AI tools, including when they use third parties to deploy them. Employers need to understand the nature of the AI tools used in their hiring and recruiting process, including how the tools are programmed and applied by themselves and their vendors. Diligent self-audits, as well as audits of current and prospective vendors, can go a long way toward reducing the risk of AI bias and discrimination.
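As one example of what such a self-audit might look like in practice, the sketch below computes selection rates by age group from applicant-tracking data and compares them using the four-fifths rule of thumb discussed in the EEOC's Title VII technical assistance (see footnote 2). It is a simplified illustration under assumed data fields, not a legal test: the ADEA analysis differs from Title VII, and any real audit should be designed with counsel and should cover vendor-supplied tools as well.

    from collections import defaultdict

    def selection_rates(applicants: list[dict]) -> dict[str, float]:
        """Share of applicants in each age group who advanced past screening."""
        totals, advanced = defaultdict(int), defaultdict(int)
        for a in applicants:
            totals[a["age_group"]] += 1
            advanced[a["age_group"]] += int(a["advanced"])
        return {group: advanced[group] / totals[group] for group in totals}

    def impact_ratios(rates: dict[str, float]) -> dict[str, float]:
        """Each group's selection rate relative to the highest-rate group."""
        best = max(rates.values())
        return {group: rate / best for group, rate in rates.items()}

    # Hypothetical data: 60% of under-40 applicants advance vs. 30% of 40-plus.
    applicants = (
        [{"age_group": "under_40", "advanced": i < 60} for i in range(100)]
        + [{"age_group": "40_plus", "advanced": i < 30} for i in range(100)]
    )
    rates = selection_rates(applicants)
    for group, ratio in impact_ratios(rates).items():
        flag = "review" if ratio < 0.8 else "ok"  # four-fifths rule of thumb
        print(f"{group}: rate={rates[group]:.0%}, ratio={ratio:.2f} -> {flag}")

A ratio below 0.8 is only a screening flag that warrants a closer look at how the tool is programmed and applied, not a conclusion that discrimination occurred; running the same check against current and prospective vendors' tools is part of the diligence described above.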

Footnotes

1. The case at issue appears to stem from a software that the EEOC claims was programmed for automated decision making, rather than generative or other AI. Nonetheless, the agency itself connects this case to AI in the press release, where EEOC Chair Charlotte A. Burrows refers to it as "an example of why the EEOC recently launched an Artificial Intelligence and Algorithmic Fairness Initiative."

2. The EEOC has discussed AI together with automated systems generally. See Equal Employment Opportunity Comm'n, Press Release, EEOC Releases New Resource on Artificial Intelligence and Title VII, at https://www.eeoc.gov/newsroom/eeoc-releases-new-resource-artificial-intelligence-and-title-vii (May 18, 2023) (the agency's technical assistance document on the application of Title VII of the Civil Rights Act to an employer's use of automated systems, including those that incorporate AI). The EEOC defines automated systems broadly to include software and algorithmic processes, including AI, that are used to automate workflows and help people complete tasks or make decisions. See EEOC, Joint Statement on Enforcement Efforts Against Discrimination and Bias in Automated Systems, at https://www.eeoc.gov/joint-statement-enforcement-efforts-against-discrimination-and-bias-automated-systems (April 25, 2023).

3. Equal Employment Opportunity Comm'n v. iTutorGroup, Inc., No. 1:22-cv-02565-PKC-PK (E.D.N.Y. Aug. 9, 2023).

4. Id. at 15.

5. Id. at 18.

6. Id. at 8.

7. Id. at 12.

8. Id. at 14.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
