ARTICLE
24 January 2023

EEOC's Draft Enforcement Plan Prioritizes Technology-Related Employment Discrimination

Arnold & Porter

Arnold & Porter is a firm of more than 1,000 lawyers, providing sophisticated litigation and transactional capabilities, renowned regulatory experience and market-leading multidisciplinary practices in the life sciences and financial services industries. Our global reach, experience and deep knowledge allow us to work across geographic, cultural, technological and ideological borders.

Bias—especially pernicious bias against historically disadvantaged groups—is among the major risks posed by the widespread adoption of artificial intelligence (AI). Indeed, algorithmic discrimination has attracted substantial attention from US federal, state, and local policymakers, and regulators are beginning to crack down.

On January 10, the US Equal Employment Opportunity Commission (EEOC) published for public comment a draft Strategic Enforcement Plan (SEP) for 2023–2027. The SEP aims to "focus and coordinate the agency's work" over a multi-year period to produce "a sustained impact in advancing equal employment opportunity." The draft SEP is the EEOC's first to address the use of automated systems for hiring, such as artificial intelligence and machine learning, and proposes to focus on how those systems may be used to "intentionally exclude or adversely impact protected groups."

As we explained in Avoiding ADA Violations When Using AI Employment Technology, many companies use AI-powered technologies in hiring, promotion, and other employment decisions. Examples include tools that screen applications or resumes; video-interviewing software that evaluates facial expressions and speech patterns; and software that scores "job fit" based on personalities, aptitudes, or skills. The SEP recognizes that these tools and technologies can have adverse effects on members of protected groups, as well as on "particularly vulnerable workers" outside of traditionally protected classes who may be unaware of their rights or reluctant to enforce them. Examples of "particularly vulnerable workers" identified in the SEP include immigrants and migrant workers, individuals with intellectual and developmental disabilities, individuals with arrest or conviction records, LGBTQI+ individuals, older workers, and persons with limited literacy or English proficiency.

In addition to many non-technology-related priorities, the EEOC seeks to eliminate technological barriers that can lead to disparate impacts in recruitment, hiring, and promotion, such as:

  • the use of automated systems, including artificial intelligence or machine learning, to target job advertisements, recruit applicants, or make or assist in hiring decisions where such systems intentionally exclude or adversely impact protected groups;
  • restrictive application processes or systems, including online systems that are difficult for individuals with disabilities or other protected groups to access; and
  • screening or performance-evaluation tools that disproportionately impact workers based on their protected status, including those facilitated by artificial intelligence or other automated systems.

The draft SEP incorporates feedback from three listening sessions that occurred in 2022. The public is encouraged to submit comments on the draft SEP through the Federal eRulemaking Portal by February 9, 2023, after which the EEOC will adopt a final version.

However the SEP may be revised, employers and vendors of employment technology should expect much greater scrutiny of automated employment decision tools in the years to come. Auditing current systems (and vetting new ones before deployment) will be increasingly important to keep both regulators and the plaintiffs' bar at bay. For those eager to learn more, the EEOC will host a public hearing on January 31 at 10:00 AM EST on "Navigating Employment Discrimination in AI and Automated Systems: A New Civil Rights Frontier."
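By way of illustration only, one common starting point for such an audit is the "four-fifths rule" from the EEOC's Uniform Guidelines on Employee Selection Procedures: a selection rate for any group that is less than 80% of the rate for the most-selected group is generally treated as evidence of adverse impact. The short Python sketch below shows how that ratio might be computed; the group labels and counts are hypothetical, and the calculation is a preliminary screen, not a substitute for statistical or legal analysis of a particular tool.

    # Illustrative sketch of a "four-fifths rule" adverse-impact check.
    # All group names and applicant/hire counts below are hypothetical.

    def selection_rates(outcomes):
        """Return each group's selection rate (hires divided by applicants)."""
        return {group: hired / applied for group, (applied, hired) in outcomes.items()}

    def impact_ratios(rates):
        """Compare each group's selection rate to the highest group's rate."""
        benchmark = max(rates.values())
        return {group: rate / benchmark for group, rate in rates.items()}

    if __name__ == "__main__":
        # Hypothetical counts per group: (applicants, hires)
        outcomes = {
            "group_a": (200, 60),   # 30% selection rate
            "group_b": (150, 30),   # 20% selection rate
        }
        rates = selection_rates(outcomes)
        for group, ratio in impact_ratios(rates).items():
            flag = "below the four-fifths threshold" if ratio < 0.8 else "no flag"
            print(f"{group}: selection rate {rates[group]:.0%}, impact ratio {ratio:.2f} ({flag})")

In practice, audits of automated employment decision tools also consider statistical significance, sample sizes, and job-relatedness, and should be conducted with counsel and qualified experts.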

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
