A growing number of employers are turning to artificial intelligence ("AI") tools to assist in recruiting and other employment decisions. According to Forbes, almost all Fortune 500 companies use talent-sifting software, and more than half of human resource leaders in the U.S. leverage predictive algorithms to support hiring. Widespread adoption of these tools has led to concerns from regulators and legislators that they may be inadvertently discriminating, for example, by:
- Penalizing job candidates with gaps in their resumes, leading
to a bias against older women who have taken time off for
childcare;
- Recommending candidates for interviews who resemble the
company's current leadership, which is not diverse; or
- Using automated games that are unfairly difficult for individuals with disabilities to evaluate employees for promotions, even though they could do the job with a reasonable accommodation.
New York City is one of the first jurisdictions to pass a law aimed at reducing bias in automated employment decisions, which becomes effective on January 1, 2023. The Automated Employment Decision Tool Law ("AEDT") places compliance obligations on employers in New York City that use AI tools, rather than software vendors who create the tools. Similar laws are likely to be enacted in other jurisdictions. Accordingly, companies should pay close attention to any AI tools or algorithms being used to manage human capital to ensure that they are compliant with these emerging requirements.
In this Debevoise In Depth, we discuss the key requirements of the new City law, the growing scrutiny of AI-based hiring tools in other jurisdictions, and practical steps companies can consider taking to reduce their legal and regulatory risks related to their use of these automated tools.
What Does the AEDT Require?
The AEDT applies to companies located in New York City that use "automated employment decision tools" to "replace" or "substantially assist" decision-making when hiring or promoting City residents for jobs located in the City. What is unclear is whether the law applies in other circumstances (e.g., companies located outside the City hiring New York residents, or companies in the City hiring applicants from outside the City). Hopefully, the full scope of the AEDT will be made clearer in the forthcoming rulemaking process.
The term "automated employment decision tools" is broadly defined as any "computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence" that "issues a simplified output." Given the breadth of this definition, a wide variety of automated tools will likely be covered by this law, even if they do not employ true AI, including many game-based tests and some resume review tools and automated personality assessments. Additionally, the decision need not be fully automated for the AEDT to apply. Any automated tool that "substantially assists" a human in reaching their decision (for example, by evaluating or recommending candidates) may fall within the scope of the law.
For companies subject to the AEDT, compliance obligations include:
- Conducting an Independent, Annual Bias Audit.
Companies must ensure that their automated employment decision
tools are subjected to a "bias audit," conducted no more
than one year prior to the use of the tool. It appears that this
must be done annually. Although the law provides very little
information on the substantive requirements of this bias audit, it
does define the audit as "an impartial evaluation by an
independent auditor" that includes testing the tool to assess
its "disparate impact" on persons based on gender, race,
or ethnicity. The term "disparate impact" is not defined,
nor does the law specify a methodology for conducting the bias
audit, or how companies should assess bias if they do not collect
data on the race, ethnicity or gender of their job applicants. One
possible approach would be to use the "four-fifths rule,"
as defined in federal anti-discrimination regulations on employee selection procedures
promulgated by the Equal Employment Opportunity Commission
("EEOC"), whereby a selection rate of a protected group
that is less than 80% of the rate for the group with the highest
rate constitutes evidence of "disparate impact" (see the
illustrative calculation following this list). The AEDT
also does not specify the level of "independence"
required from the auditor (e.g., whether the internal audit
function is sufficiently independent and whether the auditor must
do more than review testing performed by the company or the
software provider).
- Providing Disclosures. Before using a covered
tool, a company must publish a "summary" of the results
of the bias audit on its website, along with the distribution date
of the tool. Additionally, the company must publish on its website,
or provide to a candidate or employee within 30 days of their
request, "the type of data collected for the automated
employment decision tool, the source of such data and the employer
or employment agency's data retention policy."
- Notifying Candidates or Employees. At least 10
days before using a covered tool, a company must provide notices to
candidates and employees residing in New York City: (i) that an
automated employment decision tool will be used to assess their
employment or candidacy; (ii) the job qualifications and
characteristics the tool will assess; and (iii) that the
candidate may request an "alternative selection
procedure" or "accommodation."
- Providing an Accommodation or Alternative Selection Process. The AEDT requires that companies provide candidates or employees residing in New York City with the ability to request an alternative selection process or accommodation; it does not, however, state what these alternatives or accommodations should entail.
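To make the four-fifths rule concrete, below is a minimal sketch in Python of how a selection-rate comparison might be computed. The group labels and applicant counts are hypothetical, and this reflects one possible methodology rather than a procedure prescribed by the AEDT or the EEOC.

```python
# Illustrative four-fifths rule computation. The group labels and
# applicant counts are hypothetical, and this is only one possible
# methodology -- not a procedure prescribed by the AEDT.

def selection_rates(outcomes: dict) -> dict:
    """Compute the selection rate (selected / applicants) per group."""
    return {group: selected / applicants
            for group, (selected, applicants) in outcomes.items()}

def four_fifths_flags(outcomes: dict) -> dict:
    """Flag groups whose selection rate is below 80% of the highest rate."""
    rates = selection_rates(outcomes)
    highest = max(rates.values())
    return {group: rate < 0.8 * highest for group, rate in rates.items()}

# Hypothetical applicant pool: (candidates selected, total applicants).
outcomes = {
    "Group A": (48, 100),  # 48% selection rate (the highest)
    "Group B": (33, 100),  # 33% selection rate
}

rates = selection_rates(outcomes)
for group, flagged in four_fifths_flags(outcomes).items():
    print(f"{group}: selection rate {rates[group]:.0%}, "
          f"potential disparate impact: {flagged}")
```

On these hypothetical figures, Group B's selection rate is roughly 69% of Group A's, below the four-fifths (80%) threshold, which would constitute evidence of "disparate impact" under the EEOC's rule.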
Companies found not to be in compliance will face penalties of $375 for a first violation, $1,350 for a second violation, and $1,500 for a third and any subsequent violations. Each day that a company uses a covered tool in noncompliance with the law constitutes a separate violation, as does the failure to provide any required notice. New York City's Corporation Counsel may bring proceedings to enforce the AEDT. While the AEDT does not include a private right of action, it does not preclude private plaintiffs from bringing civil actions related to a company's practices or automated tools (including, for example, discrimination claims). The New York City Commission on Human Rights may also enforce the law.
On June 6, 2022, New York City's Department of Consumer and Worker Protection conducted a hearing pertaining to the AEDT, and a further rulemaking process is currently anticipated. At the time of writing, however, it remains unclear when this rulemaking will occur or what topics it may cover.
Other Emerging AI Hiring Laws and Regulations
Outside of New York City, several states and localities have enacted or proposed laws or regulations that apply to AI tools used for hiring or promotions. For example:
- Washington, D.C. In 2021, Washington, D.C.
introduced legislation that would prohibit certain
companies from making algorithmic decisions about "important
life opportunities"—including employment offers—on
the basis of actual or perceived protected classes. This law would
require companies to audit their algorithmic determination
practices on an annual basis for potential disparate impact and
report this information to the Office of the Attorney General, as
well as to preserve an audit trail for five years. Additionally,
companies using vendor-provided models would need to obtain written
agreements that the vendor has implemented and maintained measures
"reasonably designed to ensure" that the company complies
with this law. The bill also contains a proposed private right of
action and is currently under D.C. Council review with a public
hearing scheduled for September 2022.
- California. On March 15, 2022, the California
Fair Employment and Housing Council published draft changes to its
employment discrimination regulations, which, if adopted, would
impose liability on
companies or third-party agencies administering artificial
intelligence tools that "screen out or tend to screen out an
applicant or class of employees on the basis" of a protected
characteristic and create a private right of action for those who
are discriminated against by the AI tools. The regulations are
currently pending and will be subject to a public comment period
before taking effect.
- Illinois. In 2019, Illinois passed its Artificial Intelligence Video Interview Act,
which gives job applicants the right to know, and to provide
consent, before their interview, (a) that AI may be used to
analyze the video and (b) what characteristics will be analyzed. The employer
is restricted in sharing the applicant's video and must also
destroy it within 30 days of the applicant's request. A recent amendment, effective January 1, 2022,
also requires employers that rely solely on AI video analysis
for determining who to interview in person to provide annual
reports of demographic data to the state's Department of
Commerce and Economic Opportunity, including "the race and
ethnicity of applicants who are and are not afforded the
opportunity for an in-person interview after the use of artificial
intelligence analysis; and . . . the race and ethnicity of
applicants who are hired."
- Maryland. In 2020, Maryland passed a law prohibiting employers from using facial recognition technology during pre-employment job interviews (including in the context of AI tools) without the applicant's written consent.
Whether or not covered by these specific AEDT Laws, most employers using AI are subject to other general anti-discrimination laws, including Title VII of the Civil Rights Act. Given the rise of AI-related hiring tools, the EEOC has stated that it remains focused on ensuring that AI does not "become a high-tech pathway to discrimination." Most recently, in May 2022, the EEOC issued its first non-binding technical guidance regarding how employers' use of AI may violate existing requirements under the Americans with Disabilities Act ("ADA"). Among other things, the EEOC recommends that employers notify applicants or employees that they are being assessed by an AI tool, disclose which traits or characteristics the tool is designed to measure, and inform them that they may request a reasonable accommodation or exemption from the tool. The Department of Justice also joined the EEOC in warning of the potential risk that the use of AI tools by employers may "result in unlawful discrimination against certain groups of applicants, including people with disabilities."
Outside of the United States, AI tools used for hiring and recruiting have drawn scrutiny from European lawmakers. As we have previously discussed, the European Commission's draft AI Act would place potentially onerous regulatory and disclosure obligations on any AI systems classified as "high risk," such as AI systems that are used for recruiting and workplace management, including evaluating candidates through interviews, making decisions concerning promotions or termination, or monitoring and evaluating employee performance or behavior. Although the AI Act is still being refined through the European Union's legislative process, it is likely that at least some hiring and promotion systems will be classified as "high risk" in the final version when passed.
Four Tips for Complying with AEDT Laws
In light of these emerging requirements, employers using AI tools to hire or promote talent should consider the following measures to reduce their legal and regulatory risks:
- Identify Which Models, Algorithms, or Other Tools Are
Subject to AEDT Laws. Companies should determine whether
their employment tools are subject to AEDT Laws because they (a)
"replace" or "substantially assist" human
decision-making and (b) involve a simplified output from a
computational process, including AI, machine learning, data
analytics or statistical tools. Given the extremely broad
definition, many sophisticated hiring tools are likely to
qualify.
- Consider Whether to Leverage the Vendor's Bias
Audit(s) of the Tool. Although the AEDT itself squarely
places the burden of compliance on employers using AI hiring and
recruiting tools, the vendor providing those tools may be best
positioned to conduct an audit to assess the tools for potential
disparate impact. Companies will need to determine on a
case-by-case basis whether they should rely on the vendor's
bias audit, but at a minimum, such an audit must (a) have been
conducted by an independent auditor, (b) have been completed no
more than 12 months prior to the company's use of the tool, and
(c) involve an evaluation of potential disparate impact risks.
Companies may also want to consider whether the vendor's bias
audit is applicable to the way the company uses the tool for its
own employment decisions.
- Determine What Other Testing and Evaluation Steps
Should Be Included in the Bias Audit. In addition to the
assessment of hiring tools for disparate impact, companies might
consider whether the bias audit should include a qualitative
assessment of the relevant policies and practices by the company or
the vendor of the AI tool, including:
- Whether the company and/or the vendor have a written policy on
responsible AI use that applies to tools used for hiring or
promotions;
- Whether relevant individuals at the company or vendor (such as
those who will be using the tool or who designed the tool) have
received training on detecting and preventing bias in the use of AI
hiring and promotion tools; and
- Whether, in order to reduce the risk of bias, the vendor has
identified criteria that should not be used when operating the
tool (e.g., name, race, ethnicity, sex or gender, sexual
orientation, gender identity or expression, age, religion, national
origin, disability status, family or marital status, genetic
characteristics, information regarding a conviction for which a
pardon has been granted or a record suspended, or protected veteran
or other uniformed status), as well as proxies for these
characteristics (e.g., address or zip code); a purely
hypothetical sketch of such a screen appears at the end of this section.
- Evaluate What Sort of Accommodation or Alternative Selection Processes Should Be Given to Candidates or Employees. The AEDT requires employers to provide notice to candidates or employees that they may request an accommodation or alternative selection process. The AEDT is silent, however, as to what kinds of accommodations and alternative selection processes should be provided, and in what circumstances. Companies should nevertheless consider practical means of providing human-based selection processes to candidates or employees who elect to opt out of the automated tool, especially for persons who may have a disability. For example, companies using game-based assessment tools with complex graphics may want to offer candidates or employees with visual impairments an alternative screening process. In all cases, any accommodations offered by employers should be evaluated for compliance with the ADA and other applicable laws.
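Finally, as a purely hypothetical illustration of the feature screen described above, the sketch below checks a tool's inputs against a list of prohibited characteristics and known proxies, and flags numeric features that correlate strongly with a protected attribute. The feature names, data, and 0.5 correlation threshold are all invented for illustration; genuine proxy detection requires substantially more statistical rigor.

```python
# Hypothetical feature screen: flags inputs that are prohibited
# characteristics, known proxies, or strongly correlated with a
# protected attribute. All names, data, and thresholds are invented.
import pandas as pd

PROHIBITED = {"name", "race", "ethnicity", "sex", "gender", "age",
              "religion", "national_origin", "disability_status",
              "marital_status", "veteran_status"}
KNOWN_PROXIES = {"address", "zip_code"}

def screen_features(features: pd.DataFrame, protected: pd.Series,
                    corr_threshold: float = 0.5) -> dict:
    """Return a {feature: reason} dict of inputs that merit review."""
    flags = {}
    for col in features.columns:
        if col in PROHIBITED:
            flags[col] = "prohibited characteristic"
        elif col in KNOWN_PROXIES:
            flags[col] = "known proxy"
        elif pd.api.types.is_numeric_dtype(features[col]):
            corr = features[col].corr(protected)
            if abs(corr) >= corr_threshold:
                flags[col] = f"possible proxy (correlation {corr:.2f})"
    return flags

# Invented example data; 'protected' is a 0/1 encoding of membership
# in a protected class, used only to test the features for correlation.
features = pd.DataFrame({
    "zip_code": [10001, 10002, 10001, 10003],
    "years_experience": [2, 5, 3, 7],
    "assessment_score": [60, 80, 65, 90],
})
protected = pd.Series([1, 0, 1, 0])
print(screen_features(features, protected))
```

A feature flagged this way is not necessarily impermissible; the point of the screen is to surface inputs, such as a score that closely tracks a protected attribute, for human review of whether they are genuinely job-related.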