ARTICLE
1 May 2019

Did You Know About The Use Of AI To Profile Minorities?

Foley & Lardner


The New York Times reported on profiling in China, writing that "documents and interviews show that the authorities are also using a vast, secret system of advanced facial recognition technology to track and control the Uighurs, a largely Muslim minority. It is the first known example of a government intentionally using artificial intelligence for racial profiling, experts said." The April 14, 2019 article, entitled "One Month, 500,000 Face Scans: How China Is Using A.I. to Profile a Minority," included these details:

The facial recognition technology, which is integrated into China's rapidly expanding networks of surveillance cameras, looks exclusively for Uighurs based on their appearance and keeps records of their comings and goings for search and review.

The practice makes China a pioneer in applying next-generation technology to watch its people, potentially ushering in a new era of automated racism.

The technology and its use to keep tabs on China's 11 million Uighurs were described by five people with direct knowledge of the systems, who requested anonymity because they feared retribution.

The New York Times also reviewed databases used by the police, government procurement documents and advertising materials distributed by the A.I. companies that make the systems.

The big question is: what other countries are using AI to profile minorities?

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
