Technological advancements constantly reshape America's banking and consumer finance ecosystem. Today, artificial intelligence ("AI") is among the most intriguing technologies driving financial decisionmaking. Powerful enough on its own to warrant significant investment, AI has even more transformative potential when coupled with industry momentum toward greater use of "big data" and alternative or non-traditional sources of information.

With material changes in banking processes on the horizon, regulators and industry participants brace themselves for the full impact of AI and big data. This article contributes to ongoing discussion by addressing the increasing regulatory focus on issues unique to, or heightened by, AI and big data. After exploring the rise of regulatory interest in these areas, we address specific regulatory risks under banking and consumer financial laws, regulations, and requirements, including: (i) the Equal Credit Opportunity Act ("ECOA") and fair lending requirements; (ii) the Fair Credit Reporting Act ("FCRA"); (iii) unfair, deceptive, and abusive acts and practices ("UDAAPs"); (iv) information security and consumer privacy; (v) safety and soundness of banking institutions; and (vi) associated vendor management expectations.

Regulators Are Increasingly Interested in AI and Big Data

As the use of AI and big data in financial services gradually becomes an industry norm, regulators have grown increasingly interested in the area and have developed a more sophisticated understanding of it. Federal and state regulators have now weighed in on various product types and banking processes, moving from basic information gathering to a more nuanced approach to the underlying regulatory issues. Regulators have not yet promulgated material regulation specifically addressing AI and big data issues, and such active regulation appears to remain some way off, but they have arguably moved past infancy in their approaches to these questions.

At the federal level, expressions of regulatory interest have come not only from core banking and consumer financial regulators, but also from calls by the Government Accountability Office ("GAO") for broader interagency coordination on issues related to AI and big data. The Consumer Financial Protection Bureau ("CFPB") has sought industry information on the use of alternative data and modeling techniques in the credit process in a February 2017 Request for Information,1 and members of the Federal Reserve's Board of Governors ("FRB") have spoken on fair lending and consumer protection risks.2 These regulators have focused, to date, on questions regarding process transparency, error correction, privacy concerns, and internalized biases, even as they see promise in AI and big data's ability to reduce lending risk and/or open credit markets to previously underserved populations. At the same time, the GAO has issued two reports (in March 2018 and December 2018) promoting or recommending interagency coordination on flexible regulatory standards for nascent financial technology ("Fintech") business models (including through "regulatory sandboxes") and the use of alternative data in underwriting processes.3

State regulators have also begun to involve themselves in the national discourse about AI and big data. In doing so, they have staked out similar positions to federal regulators with respect to data gathering and understanding technologies, while remaining skeptical of federal overreach in regulating (or choosing not to regulate) AI-driven processes. Various state Attorneys General, for example, have joined the discussion by opposing revisions to the CFPB's policy on no-action letters due, in part, to concern over the role machine learning could play in replacing certain forms of human interaction in overseeing underwriting questions such as "what data is relevant to a creditworthiness evaluation and how each piece of data should be weighted."4 In addition, the New York Department of Financial Services ("NYDFS") has moved perhaps as far as any regulator—albeit in the context of life insurance, rather than banking or consumer finance—by issuing two guiding principles on the use of alternative data in life insurance underwriting: (i) that insurers must independently confirm that the data sources do not collect or use prohibited criteria; and (ii) that insurers should be confident that the use of alternative data is demonstrably predictive of mortality risk, and should be able to explain how and why the data is predictive.5 NYDFS or other regulators may see the next logical step as applying similar requirements to the context of credit underwriting.


1 82 Fed. Reg. 1183.

2 Lael Brainard, Member, Federal Reserve Board, Speech at Fintech and the New Financial Landscape: What are we Learning about Artificial Intelligence in Financial Services? (Nov. 13, 2018) available at 1113a.htm.

3 U.S. Government Accountability Office, GAO-18-254, Financial Technology: Additional Steps by Regulators Could Better Protect Consumers and Aid Regulatory Oversight (Mar. 2018); U.S. Government Accountability Office, GAO-19-111, Financial Technology: Agencies Should Provide Clarification on Lender's Use of Alternative Data (Dec. 2018).

4 New York Office of the Attorney General, Policy on No-Action Letters and the BCFP Product Sandbox (Feb. 11, 2019), nt_final.pdf



Mayer Brown is a global legal services provider comprising legal practices that are separate entities (the "Mayer Brown Practices"). The Mayer Brown Practices are: Mayer Brown LLP and Mayer Brown Europe – Brussels LLP, both limited liability partnerships established in Illinois USA; Mayer Brown International LLP, a limited liability partnership incorporated in England and Wales (authorized and regulated by the Solicitors Regulation Authority and registered in England and Wales number OC 303359); Mayer Brown, a SELAS established in France; Mayer Brown JSM, a Hong Kong partnership and its associated entities in Asia; and Tauil & Chequer Advogados, a Brazilian law partnership with which Mayer Brown is associated. "Mayer Brown" and the Mayer Brown logo are the trademarks of the Mayer Brown Practices in their respective jurisdictions.

© Copyright 2019. The Mayer Brown Practices. All rights reserved.

This Mayer Brown article provides information and comments on legal issues and developments of interest. The foregoing is not a comprehensive treatment of the subject matter covered and is not intended to provide legal advice. Readers should seek specific legal advice before taking any action with respect to the matters discussed herein.