This blog focuses on the EEOC's Technical Guidance on Artificial Intelligence and the ADA and the issue of "Screen Out." Specifically, it examines what Automated Employment Decision Tool (AEDT) developers and employers can do to lessen the frequency of Screen Outs and mitigate exposure to liability.

WHAT IS A SCREEN OUT?

Let's begin with the basics. What is a "Screen Out?" The EEOC defines Screen Out as the following:

When a disability prevents a job applicant or employee from meeting or lowers their performance on a selection criterion and the applicant or employee loses the job opportunity as a result.

A Screen Out is unlawful when the applicant is qualified for the job. Remember, a Qualified Individual is someone who can perform the essential functions of the job with or without an accommodation.

EXAMPLES OF SCREEN OUTS

Let's look at a few examples of how these tools can expose an employer to liability under the ADA. Before reviewing these examples, it must be noted that these tools are incredibly powerful. These examples are not meant to deter employers from using AEDTs. Rather, they are meant to highlight the employer's duty to monitor the tools and to work with AEDT developers to ensure that the liability risk is mitigated.

Remember, just as employers had to work with their HR recruiters to mitigate exposure to liability, the same holds true with AEDTs.

Chat Software

Some employers use software that generates text and e-mail communications to candidates. These chat tools can have an algorithm that rejects certain individuals for any number of reasons (e.g., level of education, gaps in work history, years of experience). Let's look at an example:

  • Jane Company hires AI Chat Tool to communicate with applicants for a position. The AI Chat Tool algorithm is set to reject anyone with significant gaps in their employment history.
  • Lily applies for a job at Jane Company and is greeted with AI Chat Tool-generated questions via text.
  • Lily has a 10-month gap in her work history because of her cancer treatments.
  • The AI Chat Tool rejects Lily, because the algorithm is set to reject anyone with an employment gap longer than 3 months.

This Screen Out violates the ADA. Employers must remember that the ADA applies to employees and candidates for employment. Lily was out of work because she was going through cancer treatment; rejecting her resume because of that gap is an unlawful Screen Out that exposes the employer to liability under the ADA.
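To make the mechanism concrete, here is a minimal sketch of a gap-based screening rule of the kind described above. The function names, dates, and the three-month cutoff are illustrative assumptions, not taken from any real vendor's tool:

```python
from datetime import date

# Illustrative sketch of a gap-based screening rule like the one in the
# Jane Company example. All names, dates, and thresholds are hypothetical.
MAX_GAP_MONTHS = 3

def months_between(start: date, end: date) -> int:
    """Approximate whole months between two dates."""
    return (end.year - start.year) * 12 + (end.month - start.month)

def screens_out(employment_history: list[tuple[date, date]]) -> bool:
    """Return True if any gap between consecutive jobs exceeds the cutoff.

    The ADA problem the EEOC Guidance highlights: a rule like this
    rejects the candidate without ever asking *why* the gap exists.
    """
    jobs = sorted(employment_history)
    for (_, prev_end), (next_start, _) in zip(jobs, jobs[1:]):
        if months_between(prev_end, next_start) > MAX_GAP_MONTHS:
            return True
    return False

# Lily: worked through mid-2020, then a 10-month gap for cancer treatment.
lily = [(date(2015, 1, 1), date(2020, 6, 1)),
        (date(2021, 4, 1), date(2023, 1, 1))]
print(screens_out(lily))  # True -- the rule rejects her outright
```

Note that the rule has no path for the applicant to explain the gap or request an accommodation, which is exactly why this kind of hard filter creates Screen Out risk.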

Gamified Tests

Some employers utilize game-based testing to assess certain abilities. These "games" can be used to measure abilities, personality traits, and other qualities. Check out this example:

  • Jane Company uses AI Game Tool for candidates to measure and rate a candidate's memory.
  • Sally applies for the job.
  • Sally is blind.
  • Sally cannot use the AI Game Tool and thereby does not progress to the next step in the process.

This is a violative Screen Out. Sally's ability to play the AI memory game is not an accurate reflection of her memory. Jane Company needs to provide an alternative test for Sally.

Personality Tests

Some employers use tools that can measure personality traits. Some tools can test for focus, or an individual's ability to work through and ignore distractions. Let's take a look at the following scenario:

  • Jane Company wants to hire laser focused people.
  • Jane Company uses Personality AI Tool to measure candidates' personality as it relates to focus.
  • Linda applies for the job.
  • Linda is a Veteran.
  • Linda suffers from PTSD.
  • Linda scored incredibly low on the Personality AI Tool for focus.
  • Linda was not selected to move on to the next round in the hiring process.

This is an unlawful Screen Out. Remember, a qualified individual is someone who can perform the job with or without a reasonable accommodation. Let's think about this. If the test were administered at Jane Company headquarters, perhaps Linda would have scored better had she been given noise-cancelling headphones. That is a simple accommodation that would enable her to knock the test out of the park. If the test were administered remotely, perhaps notice would have been helpful. Had Linda been given notice as to the nature of the test, she could have put on her AirPods and aced it.

Again, these tools are powerful and can be effective. Employers and AI developers just need to ensure that the tools: (1) give everyone a chance to showcase their skills; and (2) mitigate employer exposure to liability.

SOME THOUGHTS BEYOND THE ADA

While the EEOC focused its Guidance on the ADA, employers and developers must be mindful of the other protected classes. As seen in the above scenarios, there are other potential grounds for discrimination:

  • Lily: Disability (cancer); Gender (woman)
  • Sally: Disability (blind); Gender (woman)
  • Linda: Disability (PTSD); Veteran (USERRA)

There is plenty of case law explaining that the burden-shifting analysis of an ADA claim mirrors that of a Title VII claim. If the EEOC will accept a charge from an individual for discrimination from an AEDT based on a disability, it seems the EEOC would also accept a charge for discrimination from an AEDT based on another protected class.

WHAT CAN EMPLOYERS DO?

The EEOC offers guidance to employers on working with AI Developers.

  • If Using an AEDT Developed by a Vendor, Ask AI Developers the Following Questions:
    • Does the tool require applicants/employees to engage with an interface?
    • Is the interface accessible to as many individuals with disabilities as possible?
    • Is the AEDT available in other formats?
    • Are there disabilities that the developer cannot provide for?
    • Did the developer attempt to determine whether the use of the algorithm disadvantages individuals with disabilities?
    • Are there traits or characteristics that are measured and correlated with certain disabilities?
      • NOTE FOR EMPLOYERS: HR Departments must have a good relationship with the AEDT vendors/developers. Prudent employers will spend the time and resources to ensure that AEDT vendors are being asked these important questions.
      • NOTE FOR AI DEVELOPERS: Prudent developers will become well versed in the ADA. A deep look into the statute, regulations and case law provides guidance as to what conditions are in fact deemed a "Disability" under the law. This information is helpful in developing the AEDT.
  • If the Employer Develops Its Own Tool In-House:
      • Bring in experts. Employers (and developers) cannot develop and design the AEDT in a vacuum. The EEOC suggests that employers bring in experts to make sure the tool is as inclusive (compliant) as possible.
        • Example: The Commission provides a helpful example. If an employer is developing a tool that measures personality, cognitive, or neurocognitive traits, then a prudent employer will bring in psychologists, neurologists, and neurocognitive experts. These professionals can issue-spot and identify ways the AEDT may screen out individuals with related disorders or conditions.

TAKEAWAYS FOR DEVELOPERS AND EMPLOYERS

Regardless of who develops the AEDT, all parties (developer and employer) need to be aware of what must be done to mitigate the risk of liability:

  • Clearly indicate that reasonable accommodations are available for people with disabilities.
    • Employers and Developers must connect on how this will be communicated. Employers must be aware of whether it is possible to provide this notice through the AEDT itself, or whether the employer needs to separately provide this notice.
  • Provide clear instructions for requesting a reasonable accommodation.
    • Again, Employers and Developers need to understand where these instructions will be stored, and how they will be deployed.
  • Provide the applicant/employee with as much information as possible about the AEDT.
    • The EEOC Guidance identifies a baseline of information that should be provided that includes: (1) information about the traits and characteristics the tool is designed to measure; (2) the methods by which the traits and characteristics are measured; and (3) the disabilities (if any) that may cause a Screen Out.
    • This is helpful for the candidates/employees. After all, it allows the individual to learn about what is being tested and how. How can an individual know they need to request an accommodation when they don't know what the test is even testing for or how?
  • Only develop and use AEDT that measures abilities or qualifications that are TRULY NECESSARY FOR THE JOB.
    • Employers must take great care in their job descriptions and have a solid grasp on the essential abilities and qualifications for the job.
    • The classic example here is that of the cashier. If an AI Tool uses an algorithm that rejects anyone who cannot stand for long periods of time, that is a violative Screen Out. As we all know, individuals who are not able to stand for long periods can certainly perform the job of a cashier if provided a stool to sit on. Even more, individuals in wheelchairs are perfectly capable of working as cashiers.
  • Avoid indirect measurements.
    • The EEOC cautions employers against setting algorithms that make inferences about abilities and qualifications based on a characteristic. For instance, a tool might measure a personality trait and then compare it with a "typical or successful" personality profile for the job.
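Under illustrative assumptions, the indirect-measurement pattern the EEOC cautions against can be sketched as follows. The trait names, profile, and scores here are hypothetical and exist only to show why the pattern is risky:

```python
# Hypothetical sketch of "indirect measurement": ranking candidates by
# how closely their trait scores match a profile of "typical or
# successful" employees. All trait names and numbers are illustrative.

SUCCESSFUL_PROFILE = {"focus": 0.9, "extraversion": 0.7, "resilience": 0.8}

def similarity_score(candidate: dict[str, float]) -> float:
    """Higher = closer to the incumbent profile (1.0 is identical).

    The risk: a trait such as "focus" can correlate with a disability
    (e.g., PTSD), so ranking by similarity to a "typical" profile can
    screen out qualified candidates for reasons unrelated to the job's
    essential functions.
    """
    diffs = [abs(candidate[t] - v) for t, v in SUCCESSFUL_PROFILE.items()]
    return 1.0 - sum(diffs) / len(diffs)

# Linda scores low on "focus" even though she could perform the job
# with a reasonable accommodation.
linda = {"focus": 0.3, "extraversion": 0.8, "resilience": 0.9}
print(round(similarity_score(linda), 2))
```

The design flaw is that the score measures resemblance to past hires rather than ability to perform the job's essential functions, which is the inference the Guidance warns against.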

WHY THE EEOC GUIDANCE MATTERS

EEOC Technical Guidance is not precedent; it cannot be cited as binding authority in a legal brief. But the EEOC's Guidance is usually a harbinger of what the agency will focus on. The EEOC has already issued a draft of its Strategic Enforcement Plan. As you will see, AI bias and discrimination are included.

This is an interesting time for HR and AI. We must pay attention to the information and guidance coming from the EEOC.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.