ARTICLE
29 January 2026

2025 Year-In-Review: Biometric Privacy Litigation

Squire Patton Boggs LLP

One of the most significantly litigated areas of privacy law is biometric privacy. Tools that collect biometric information and biometric identifiers—including facial geometries, fingerprint scans, and voiceprints—are increasingly common for businesses across industries. Unfortunately, such tools have in recent years become a focus of the plaintiffs' bar.

2025 saw continued developments in litigation under Illinois' Biometric Information Privacy Act (BIPA), one of the first and most important biometric privacy laws in the country, as well as under other, lesser-litigated biometric laws. Squire Patton Boggs' globally ranked "Elite" Data Disputes team has deep experience defending businesses and their data practices, including in the realm of biometric privacy, in both litigation and arbitration (including mass arbitration). See also https://www.privacyworld.blog/2025/12/2025-mass-arbitration-year-in-review/.

In this article, informed by our practical experience litigating and arbitrating biometric cases, we: (I) provide a brief primer on BIPA and then take a look at some highlights of the 2025 biometric privacy litigation space, including (II) class action and mass arbitration activity under BIPA, (III) key questions regarding defenses to BIPA claims on appeal at the Seventh Circuit, (IV) a decision contrasting BIPA with New York City's biometric regime, (V) developments under other biometric laws enforced by attorneys general, and (VI) the intersection of AI and biometric privacy laws.

I. BIPA Background

Enacted in 2008, Illinois' BIPA regulates private entities' collection and use of biometrics. The law requires, among other things, that the private entity provide specific notice and obtain consent from the particular person whose biometric information is at issue. 740 Ill. Comp. Stat. 14/15. Significantly, unlike other privacy laws, BIPA includes a private right of action, which permits "aggrieved" individuals to seek damages for violations of the law, with statutory damages of $1,000 for negligent noncompliance and $5,000 for willful or reckless misconduct. Since a 2024 amendment, BIPA's definition of "violation" has been narrowed so that an aggrieved party may obtain only one recovery against a private entity for unlawful disclosures or uses of the same biometrics. 740 Ill. Comp. Stat. 14/20(b).
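
To illustrate the potential exposure with purely hypothetical numbers: a class of 10,000 Illinois residents alleging only negligent violations would, at $1,000 per person, put roughly $10 million in statutory damages in play, before attorneys' fees and costs and without any showing of actual harm.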

II. 2025 BIPA Class and Mass Action Activity

A. 2025 BIPA Class Action Activity

Because of its high statutory damages and the absence of any requirement to demonstrate actual harm, BIPA has been a favorite tool of class action plaintiffs' lawyers. At least 100 putative class actions alleging violations of BIPA were filed in 2025 across a variety of industries and contexts. This year also saw some high-value class action settlements:

  • As part of multidistrict litigation against a facial recognition company involving BIPA and other state law claims, a federal court in Illinois approved a nationwide settlement under which the class of 65,000 to 125,000 members would receive "a monetary amount equal to a 23% equity stake," approximately $51.75 million. No. 21-cv-00135, 2025 WL 1371330, at *1 (N.D. Ill. May 12, 2025).
  • In September, an Illinois court granted final approval of a $47.5 million settlement between a technology company and a class of at least 150,000 individuals who appeared in images processed by the company's facial recognition technology.
  • In October, an Illinois court approved an $8.75 million class action settlement over a technology company's alleged collection of face models or voice models of approximately 660,000 Illinois students as part of its education platform product.

Not every class certification in 2025 came as a result of a court-approved settlement, however. Both state and federal courts in Illinois granted contested motions to certify class actions, finding that the plaintiffs in those cases had satisfied the requirements of 735 Ill. Comp. Stat. 5/2-801 or Federal Rule of Civil Procedure 23. See, e.g., McGivney v. ITS Techs. & Logistics, LLC, 2025 IL App (1st) 241961-U, 2025 WL 1743891. These certification decisions, settlement amounts, and increased filings show that BIPA litigation still presents a significant risk to businesses going into 2026 and beyond.

B. BIPA Arbitration Activity

In addition to being a common source of class action claims, BIPA has also provided plaintiffs with a powerful tool in the individual arbitration and mass arbitration contexts.

Mass arbitration, as covered recently here on Privacy World, is a plaintiffs' strategy of bringing thousands of nearly identical arbitration claims against a single company as a way of forcing the company to accrue large amounts of arbitration fees. These fees often total in the millions of dollars, and companies are typically on the hook regardless of the merits of the underlying claims. Plaintiffs' firms do this in the hope of pressuring companies into settlement negotiations, even for flawed claims that would not succeed if the cases were litigated on the merits.
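
As a purely hypothetical illustration of the fee dynamics: if a plaintiffs' firm files 5,000 individual demands and the respondent's share of arbitration fees averages, say, $2,000 per claim, the company faces roughly $10 million in fees before a single claim is assessed on the merits.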

Often, plaintiffs' firms advertise online to recruit claimants for mass arbitrations. From a review of these recruitment efforts and Squire Patton Boggs' experience with mass BIPA arbitrations, it is clear that BIPA has become a staple of mass arbitration claims in 2025. These claims are threatened against companies in a variety of industries, not just technology companies, and it is highly likely that such claims brought before AAA and JAMS will persist going into 2026.

III. Seventh Circuit to Decide Issues on BIPA Exemptions

Despite the statute's large liability risk for businesses generally, Section 25 of BIPA contains several enumerated exemptions where BIPA does not apply. These include an exemption for any "financial institution or an affiliate of a financial institution that is subject to Title V of the federal Gramm-Leach-Bliley Act of 1999 and the rules promulgated thereunder," 740 Ill. Comp. Stat. 14/25(c) (the "GLBA Exemption"), and an exemption for any "contractor, subcontractor, or agent of a State agency or local unit of government when working for that State agency or local unit of government," id. 14/25(d) (the "Government Contractor Exemption"). These exemptions are significant, and Squire Patton Boggs' team has obtained a number of dismissals of claims on these bases.

In 2025, cases addressing each exemption reached the United States Court of Appeals for the Seventh Circuit. In Cisneros v. Nuance Commc'ns, Inc., the plaintiff alleged that she called a financial advisor and that Nuance Communications, a provider for the financial advisor, created and used a voiceprint to identify her in violation of BIPA. The district court granted Nuance's motion to dismiss on the basis of the GLBA Exemption, holding that Nuance was a "financial institution" under the law because it engaged in a financial activity specified by the Federal Reserve Board: "authenticating the identity of persons conducting financial and nonfinancial transactions." No. 1:21-cv-04285, 2024 WL 5703970, at *4 (N.D. Ill. Oct. 4, 2024) (citing 12 C.F.R. § 225.86(a)(2)(iii)). The plaintiff appealed, challenging the applicability of the GLBA Exemption, and the Seventh Circuit heard oral argument on the issue in October 2025. A decision is pending, and Privacy World will update you when it issues.

Separately, this year a district court granted a motion to certify an interlocutory appeal to the Seventh Circuit regarding the scope of the Government Contractor Exemption. See Payton v. Union Pac. R.R. Co., No. 24-cv-00153, 2025 WL 3012662, at *2 (N.D. Ill. Oct. 28, 2025). The question at issue in that case is whether the Government Contractor Exemption "requires only a temporal nexus—meaning that government contractors are exempt from BIPA during the time they maintain active government contracts—or whether it also requires a substantive nexus between a contractor's alleged collection of biometric information and its government contracts." Petition, No. 25-8031 (7th Cir. Nov. 7, 2025). Whether the Seventh Circuit will permit the appeal remains to be decided.

IV. Court Confirms New York City's Biometric Privacy Law Is Narrower than BIPA

While BIPA is the most heavily litigated biometric privacy law, it is not the only one. Since 2021, New York City's administrative code has imposed notice requirements and usage limitations on commercial establishments that collect customers' biometric data. See N.Y.C. Admin. Code § 22-1202. The ordinance also makes it unlawful to "sell, lease, trade, share in exchange for anything of value or otherwise profit from the transaction of biometric identifier information." Id. The ordinance provides a private right of action with damages of $500 for negligent violations and $5,000 for intentional or reckless violations, as well as reasonable attorneys' fees and costs, with a 30-day cure period for the commercial establishment. Id. § 22-1203.

In December 2025, a federal court in New York granted a defendant's motion to dismiss a claim under New York City's biometric law premised on the defendant's alleged use—but not sale—of biometric information. Schottenstein, No. 25-cv-08635-LAK, 2025 WL 3442844, at *3 (S.D.N.Y. Dec. 1, 2025). In granting the motion, the court rejected the plaintiff's reliance on BIPA caselaw, holding that the two laws are "meaningfully different" in that "the NYC Biometrics Law prohibits profiting from a transaction of biometric information, not merely from the information itself." Id.

The decision in Schottenstein should dampen any hope among plaintiffs' lawyers of turning New York City's biometric privacy law into the next BIPA.

V. New Legislation and Attorney General Enforcement: Colorado and Texas

While Illinois and New York City have biometric privacy laws with private rights of action, other state biometric privacy laws provide for attorney general enforcement. For example, on July 1, 2025, Colorado's H.B. 24-1130, an amendment to the Colorado Privacy Act, went into effect; it requires employers to create biometric policies and to obtain consent before collecting and using biometric information. The Colorado Privacy Act places enforcement authority in the hands of district attorneys and the Colorado Attorney General. See Colo. Rev. Stat. § 6-1-1311.

Biometric privacy laws without a private right of action are far from toothless. This year, Texas entered into a $1.375 billion settlement with Google to resolve alleged violations of various Texas laws, including the Texas Capture or Use of Biometric Identifier Act (CUBI), Tex. Bus. & Com. Code § 503.001 et seq. CUBI allows the Texas Attorney General to seek civil penalties of $25,000 per violation, meaning that CUBI presents significant risk for any business whose data practices are viewed unfavorably by Texas.

As more attorney general offices gain familiarity with wielding their enforcement authority under biometric privacy laws, 2026 could see increased government enforcement of existing biometric privacy laws, even without any additional legislation.

VI. Courts Grapple with AI in the Biometric Data Privacy Context

As generative AI continues to develop and become widely accessible, courts have started to address the intersection of biometric privacy laws with AI. Three notable 2025 decisions, covered below, signal the growing exposure for companies in the biometric privacy space.

A. Court Approves Unique Settlement of BIPA and Related Claims in the AI Context

Notably, this year an AI startup settled litigation brought by a class of plaintiffs alleging that it had violated numerous privacy laws. The defendant had created a massive database of facial images compiled from publicly available pictures on the internet. Using AI software, it was able to give customers, often law enforcement agencies, the ability to identify otherwise unknown individuals based on facial images alone. The defendant was sued under multiple state statutes, including BIPA. Much of the litigation was ultimately consolidated in the United States District Court for the Northern District of Illinois. After months of negotiation, the company settled for an amount equivalent to a 23% equity stake in the company.

The court approved the settlement. Most importantly, it held that the settlement appeared fair given the zealous advocacy on both sides, the financial condition of the defendant, and the risk that the plaintiffs would recover much less if the case proceeded to trial. Indeed, the court pointed out that, given that the defendant was a startup company, it might be out of money by the time the claims could be tried. It thus reasoned that the 23% figure was carefully calibrated and meant to provide maximum value to the plaintiffs. The court also acknowledged the important policy concerns both for and against the technology involved but ultimately determined that those issues should be left to legislatures to address.

B. Court Holds That AI Is Not a "Medical Professional" and Cannot Provide Treatment Under BIPA

A New Jersey federal court struck another blow against the use of biometric data as an input for AI. In this case, the defendant was a skincare company that used AI to provide personalized skincare assessments to customers after obtaining their facial images. The plaintiffs alleged that the company did this without obtaining proper consent under BIPA. The defendant moved to dismiss the claim, arguing that its conduct fell within the medical exception found in Section 10 of BIPA. That section excludes information "captured from a patient in a health care setting" from the reach of the statute. 740 Ill. Comp. Stat. 14/10.

The court denied the motion to dismiss because the plaintiffs were not "patients" under Section 10 of BIPA: they had not been "presently awaiting or receiving care and treatment from a medical professional," as required by relevant Illinois case law. Melzer, No. 22-cv-03149-MAS-RLS, 2025 WL 755282, at *5 (D.N.J. Mar. 7, 2025) (citation omitted). The court reasoned that receiving advice from AI was not medical treatment, nor was any medical professional involved.

This case again shows the importance of complying with the requirements of BIPA when employing generative AI tools, especially in contexts involving health, beauty, and wellness.

C. Court Finds That Creating an AI Copy of a Person's Voice Violates New York Law

The United States District Court for the Southern District of New York showed that BIPA is not the only statute regulating the use of biometric data and AI. This case involved plaintiffs, professional voice-over actors, who had contracted with the defendant, an AI voice generation company. The plaintiffs were told that their voice recordings would be used for "research purposes" and internal use only. Lehrman, 790 F. Supp. 3d 348, 356 (S.D.N.Y. 2025). The defendant, however, later used the voice recordings to create AI voice clones of the plaintiffs, which it made available to its customers. The plaintiffs sued the defendant under numerous theories. The most relevant here are the claims for violations of the New York Civil Rights Law. Section 50 of that law prohibits using, "for advertising purposes, or for the purposes of trade, the name, portrait, picture, likeness, or voice of any living person without having first obtained the written consent of such person." N.Y. Civ. Rights Law § 50. Section 51 provides plaintiffs whose rights are violated with a private right of action.

The court declined to dismiss the Civil Rights Law claims. It held that the statute covered the sort of generative AI voice replicas at issue in the case. The purpose of the statute, the court explained, is "to protect a person's identity, not merely a property interest in his or her 'name', 'portrait' or 'picture.'" Lehrman, 790 F. Supp. 3d at 381. Thus, the key question is whether the information used allows "recognition of the plaintiff's identity." Id. The voice clones were recognizable likenesses of the plaintiffs and were therefore protected by the law.

This case further shows the dangers that companies face when employing AI tools that rely on biometric data.

VII. Conclusion

Biometric privacy laws present substantial litigation risk, as the developments in 2025 indicate, building upon trends from prior years. See, e.g., https://www.privacyworld.blog/2023/01/privacy-world-2022-year-in-review-biometrics-and-ai/; https://www.privacyworld.blog/2022/01/2021-year-in-review-biometric-and-ai-litigation/. From multimillion-dollar class actions to billion-dollar enforcement actions, it is no wonder that biometric privacy laws continue to be fiercely litigated.

Answers to questions currently pending before appellate courts like the Seventh Circuit may have significant consequences for the shape of biometric privacy litigation in 2026. Squire Patton Boggs, with its globally ranked "Elite" Data Disputes practice, can help clients stay one step ahead in an environment of rapidly changing biometric privacy obligations and litigation and arbitration risk. Moving forward, be sure to stay tuned to Privacy World for all the latest developments in the biometric privacy litigation space.

Disclaimer: While every effort has been made to ensure that the information contained in this article is accurate, neither its authors nor Squire Patton Boggs accepts responsibility for any errors or omissions. The content of this article is for general information only, and is not intended to constitute or be relied upon as legal advice.


