Is Artificial Intelligence Sexist And Racist?

By Foley & Lardner LLP

Last year, Amazon scrapped its machine-learning recruiting algorithm after discovering a major problem: the artificial intelligence didn't like women. The machine-learning tool was designed to analyze resumes and compare potential applicants to Amazon's current workforce. Given 100 resumes, the algorithm was supposed to identify the top five applicants.

The problem was a pre-existing gender gap in software development and other technical posts. When the artificial intelligence tool analyzed the patterns in Amazon's hiring over the prior 10-year period, it taught itself to favor men over women. Amazon ultimately abandoned the tool.

Amazon's experience highlights an important limitation of machine-learning tools: they are only as good as the data they are given. While artificial intelligence can screen potential job candidates quickly and efficiently, such algorithms can inadvertently reinforce discrimination in hiring practices. In Amazon's case, applicants for technical jobs were more likely to be male than female. The algorithm mistakenly interpreted this gender gap as a hiring preference at Amazon. Thus, instead of highlighting qualified women, the algorithm screened those candidates out.
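
To make that failure mode concrete, here is a minimal, hypothetical sketch in Python: a simple classifier is trained on synthetic "historical" hiring labels that penalize a gendered resume signal, and it faithfully learns that penalty. Every feature, number, and label is invented for illustration; this is not Amazon's actual model or data.

```python
# Hypothetical sketch: a model trained on biased hiring history
# reproduces that bias. All data here is synthetic and illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Feature 0: years of relevant experience (a legitimate signal).
# Feature 1: resume mentions a women's organization (a proxy for gender).
experience = rng.normal(5, 2, n)
mentions_womens_org = rng.integers(0, 2, n)

# Synthetic historical labels: past decisions rewarded experience but
# also (illegitimately) penalized the gendered signal.
hired = (experience - 2.0 * mentions_womens_org + rng.normal(0, 1, n)) > 4

X = np.column_stack([experience, mentions_womens_org])
model = LogisticRegression().fit(X, hired)

# The model learns the historical penalty: the coefficient on the
# gendered feature comes out strongly negative.
print("experience coefficient:     ", model.coef_[0][0])
print("women's-org coefficient:    ", model.coef_[0][1])
```

Nothing in the training step told the model to disfavor women; it simply found the pattern already baked into the labels, which is exactly the trap the article describes.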

Employers these days have a panoply of tech-based tools at their disposal. Websites like Monster.com and Indeed.com advertise job openings and generate large numbers of applicants, and employers are turning to tech-based tools to reduce both the time and the cost of hiring. Such tools, however, are designed to mimic human decision-making. When a tool relies on data that is inaccurate or biased, it can inadvertently discriminate against women or minorities. Studies have also found that tech-based tools can discriminate in more subtle ways. For example, an employer attempting to maximize employee tenure found that those who lived closer to work tended to have longer tenures. Screening applicants based on how close they lived to work, however, tended to disproportionately screen out certain minority candidates.
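
The commute-distance example can be illustrated with another hypothetical sketch: a facially neutral distance cutoff, applied to two synthetic applicant groups whose home locations differ, yields sharply different selection rates even though the rule never references group membership. The groups, distances, and threshold are all made up.

```python
# Hypothetical sketch: a "neutral" commute-distance screen acting as a
# proxy for a protected class when the two happen to be correlated.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Suppose group B applicants tend to live farther from the office
# (for example, because of residential segregation patterns).
group = rng.choice(["A", "B"], size=n)
distance_miles = np.where(group == "A",
                          rng.normal(8, 3, n),
                          rng.normal(15, 3, n))

# A facially neutral rule: screen out anyone living more than 12 miles away.
passes_screen = distance_miles <= 12

for g in ("A", "B"):
    rate = passes_screen[group == g].mean()
    print(f"group {g} selection rate: {rate:.0%}")
# Group B is selected at a far lower rate, despite the rule never
# mentioning group membership.
```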

Under Title VII of the Civil Rights Act of 1964 and analogous state and local laws, the employer is responsible for ensuring that it is screening job applicants in a nondiscriminatory manner. Therefore, if you are using or considering a tech-based tool to help you screen job applicants, you should take steps to ensure that such tools are not disproportionately screening out candidates based on gender, race, or other protected classes. Simply telling tech-based tools not to discriminate against minorities or women may be insufficient because such tools will attempt to identify candidates that reflect your existing hiring practices. Some helpful tips to consider when using tech-based hiring tools are:

  1. Do not rely exclusively on tech-based hiring tools. Most tools will rank candidates. Employers should review lower-ranked candidates and make independent assessments based on non-discriminatory criteria.
  2. Consistently review and update data provided to your hiring tool. Make sure the data your hiring tool relies on does not reflect discriminatory hiring practices.
  3. Independently audit the results and rankings generated by the hiring tool and make appropriate adjustments as necessary; a simple audit sketch follows this list.
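
As one illustration of the audit in tip 3, the sketch below applies the EEOC's "four-fifths" rule of thumb from the Uniform Guidelines on Employee Selection Procedures, under which a group's selection rate below 80% of the highest group's rate can indicate adverse impact. The applicant counts here are hypothetical.

```python
# Hypothetical adverse-impact audit using the four-fifths rule of thumb.
def selection_rate(selected: int, applicants: int) -> float:
    return selected / applicants

def four_fifths_check(rates: dict[str, float]) -> None:
    # Compare each group's selection rate to the highest group's rate;
    # a ratio below 0.8 is a conventional flag for possible adverse impact.
    benchmark = max(rates.values())
    for group, rate in rates.items():
        ratio = rate / benchmark
        flag = "POSSIBLE ADVERSE IMPACT" if ratio < 0.8 else "ok"
        print(f"{group}: rate {rate:.0%}, ratio {ratio:.2f} -> {flag}")

# Illustrative numbers only: of 100 male and 100 female applicants,
# the tool advanced 40 men but only 20 women.
rates = {
    "men": selection_rate(40, 100),
    "women": selection_rate(20, 100),
}
four_fifths_check(rates)
```

A ratio this low would not by itself establish liability, but it is the kind of result that should prompt the review and adjustment described above.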

Over time, these tech-based hiring tools will likely improve and, hopefully, screen applicants free of any discriminatory bias. Until the technology is perfected, however, employers should take steps to make sure that members of protected classes are not disproportionately screened out through the use of tech-based hiring algorithms.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
