As we enter 2026, the changes of the last several years are no longer abstract. Technology decisions are now examined closely by regulators, courts and counterparties, often long after those decisions were made. At the same time, organizations remain under pressure to move quickly, adopt new tools and modernize infrastructure. Managing the balance between innovation and accountability has become a central challenge for legal and business leaders.
This year's Technology Transactions & Data Privacy Report reflects that reality. The articles focus on the issues we see most often in practice, including how organizations govern technology in real-world environments; how contracts allocate risk once systems are deployed; and how privacy and security programs perform when tested by regulators or litigation.
Artificial intelligence is featured prominently throughout this report, but the discussion has shifted. For many organizations, the question is no longer whether to use AI, but how to control it responsibly. Tools that act autonomously, interact with enterprise systems or make decisions without constant human input raise difficult questions about oversight and liability. Litigation and enforcement activity are beginning to reflect these concerns, particularly where AI tools collect data, listen to communications or are deployed in sensitive contexts such as hiring.
Privacy compliance continues to evolve in a similar direction. Regulators and plaintiffs are increasingly focused on whether privacy programs operate as described, especially with respect to online tracking, consent and third-party technologies. Cross-border data transfers remain an area of sustained attention, requiring alignment of legal, contractual and technical safeguards across jurisdictions.
Data security is also under greater scrutiny. After a breach, regulators are examining not only the incident itself but the design and day-to-day operation of security programs. The cyber insurance market is reinforcing these expectations through tighter underwriting and renewed focus on documentation, vendor management and incident readiness. New state safe harbor statutes and compliance regimes such as CMMC are further shaping how organizations assess risk, particularly in regulated supply chains.
Looking ahead to 2026, investment in AI and data center infrastructure will continue to grow. AI workloads are driving decisions about where data is stored, how systems are secured and which vendors are involved. Those infrastructure choices increasingly influence transaction strategy, regulatory exposure and long-term operational risk. In this report, Polsinelli lawyers share practical insight, drawn from their work with clients navigating these issues every day. We remain committed to helping clients make technology decisions that are forward-looking and defensible in an increasingly complex environment.
CCPA 2025 Enforcement in Review: Ensuring Privacy Programs Work in Practice
KEY TAKEAWAY: California got more aggressive on privacy in 2025. Regulators now expect privacy tools to work in practice — not just on paper — and they're testing opt-outs, vendor contracts and employee notices for real-world performance.
Enforcement of the California Consumer Privacy Act (CCPA) entered a new and more assertive phase in 2025, with regulators focusing on how privacy practices actually function to protect consumers. Both the California Privacy Protection Agency (CalPrivacy) and the California Attorney General (AG) played active roles in this shift. CalPrivacy opened investigations, and its first enforcement orders centered on the technical performance of opt-out mechanisms, consent tools and data subject rights portals. The AG brought its own enforcement action as well, reinforcing that CCPA compliance depends on whether businesses' privacy controls operate effectively. For businesses subject to the CCPA, 2025 enforcement made clear that compliance turns on how privacy practices work in reality, not just how they appear online or on paper.
In this article, we look at a series of 2025 CCPA enforcement actions to show what regulators' "proof-of-performance" focus means for privacy compliance obligations. CalPrivacy's settlement with Tractor Supply Co. highlights increased scrutiny of privacy notices for consumers, employees and job applicants. Settlements with American Honda Motor Co. and Todd Snyder Inc. highlight expectations around CCPA-compliant vendor and adtech contracts, functioning consent management platforms (CMPs) and opt-out tools, and right-sized identity verification. The AG's settlement with Healthline Media, LLC illustrates the CCPA's purpose-limitation principle in the context of sensitive health data, and CalPrivacy's recent Delete Act actions against multiple data brokers reinforce registration obligations. Taken together, these developments show that regulators are increasingly focused on whether privacy programs actually work in practice to protect consumers and that they are willing to test those programs for compliance.
Current, Accurate Privacy Notices
The CCPA requires businesses to maintain privacy notices that accurately disclose the categories of personal information collected and shared; the rights available to consumers to exercise over their personal information; and clear instructions on how those rights may be exercised. These notices must reflect current practices and be updated at least annually. The CCPA is unique among state privacy laws in extending the notice requirement to job applicants and employees, meaning that businesses must prepare and maintain notices tailored to employment.
The Tractor Supply enforcement action illustrates CalPrivacy's heightened scrutiny of privacy notice compliance. CalPrivacy imposed a $1.35 million penalty — its largest CCPA fine to date — after finding that Tractor Supply's consumer-facing privacy notice failed to disclose key categories of personal information collected or shared, did not adequately describe consumer rights and did not provide instructions on how to exercise those rights. CalPrivacy also emphasized that Tractor Supply had not updated its privacy notice in four years, despite the requirement for annual review. In addition, even though Tractor Supply had job applicant and employee notices in place, those notices were found to be non-compliant because they failed to describe CCPA rights or explain how those rights could be exercised.
From a practical standpoint, the Tractor Supply action demonstrates that businesses must ensure they have current, accurate privacy notices in place, conduct annual notice reviews and treat employee and applicant notices as meaningful compliance documents — not afterthoughts.
CCPA Provisions in Vendor and Adtech Contracts
Businesses subject to the CCPA must also maintain contracts containing certain CCPA-required data protection terms with service providers, contractors and other third parties to which they disclose personal information, and must be able to provide those contracts to regulators upon request. Regulators have made clear that businesses cannot rely on assumptions, generic industry frameworks or vendor assurances to satisfy these obligations. Instead, companies must be able to demonstrate, often on short notice during an audit, that each vendor relationship includes executed agreements containing the required provisions. Increasingly, the concern is not simply that the right contractual terms are missing, but that businesses are unable to locate and produce the agreements when regulators ask.
Several 2025 enforcement actions illustrate this trend. In its investigation into Honda's privacy practices, CalPrivacy found that Honda had disclosed personal information to advertising technology partners and then could not prove that it had entered into contracts containing the required CCPA provisions. Similarly, in the AG's settlement with Healthline, the AG concluded that Healthline assumed its advertising partners had adopted industry standard contracts but had failed to verify that the agreements included the specific terms required by the CCPA. The Tractor Supply action discussed above also involved insufficient contractual provisions with vendors handling personal information. These actions show that businesses must inventory their vendor relationships, ensure that they have these agreements on hand, identify contractual gaps and confirm that updated CCPA-compliant terms are executed and maintained across all data sharing partnerships.
Functioning CMPs and Opt-Out Mechanisms
Another focus of 2025 CCPA enforcement was ensuring that consumer-facing opt-out tools actually function and are easy for consumers to use. CalPrivacy repeatedly stressed that having a cookie banner, consent management platform (CMP) or "Do Not Sell or Share My Personal Information" link is not enough if the underlying system does not actually honor consumer choices by stopping tracking technologies or triggering a stop on the sale or sharing of information. Regulators also focused on the "symmetry of choice" principle, which requires businesses to make it just as easy to opt out of data collection and sharing as it is to opt in. Applied to CMPs, designs that require users to take extra steps, contain less conspicuous opt-out options or otherwise steer consumers toward "accept all" selections may be treated as dark patterns. Even one additional click required to opt out is enough to create a more burdensome choice. In addition, the option to opt out must be just as apparent to consumers and cannot be displayed in a less conspicuous color or font than the option to opt in.
Several CalPrivacy enforcement actions last year focused on the functionality of opt-out mechanisms. In the Honda action, Honda's website cookie banner allowed consumers to "Accept All" cookies with one click, but users had to individually toggle off categories of cookies they wanted to opt out of. This extra step was deemed a "dark pattern" and non-compliant with symmetry of choice requirements. The Todd Snyder settlement similarly involved a CMP that was misconfigured for approximately 40 days, during which the banner disappeared before users could interact with it — preventing consumers from submitting opt-out requests altogether.
Healthline's enforcement action reinforced this theme: although Healthline implemented multiple opt-out mechanisms, including a "Do Not Sell or Share My Personal Information" link, a CMP and Global Privacy Control (GPC) signal detection, none of the tools functioned correctly, and Healthline continued to disclose personal information to advertisers even after consumers attempted to opt out.
Collectively, these actions signal that businesses must regularly test their CMPs, cookie banners and opt-out tools; review user experience designs for symmetry-of-choice compliance; and monitor vendor-provided tools to ensure they perform as intended.
Purpose Limitation Principle
Regulators also emphasized the CCPA's purpose-limitation principle, which requires that personal information be used or disclosed only for purposes that were disclosed at the time of collection or that consumers can reasonably anticipate. Sensitive personal information, such as data revealing health conditions, requires special scrutiny because of the heightened risks involved.
The purpose-limitation principle is illustrated by the AG's $1.55 million settlement with Healthline, the Department of Justice's largest CCPA enforcement to date. Healthline allegedly disclosed to advertisers the titles of health-related articles visited by consumers, including content suggesting specific medical diagnoses such as multiple sclerosis or HIV. Although Healthline's privacy policy referenced targeted advertising generally, it did not disclose that browsing data revealing sensitive health conditions would be shared with third parties for targeted advertising purposes. The AG argued that consumers could not reasonably expect such sensitive information to be used for targeted advertising and that Healthline therefore violated the purpose-limitation principle. This action underscores the need for businesses to map their data flows, identify whether any sensitive personal information is being used for advertising or analytics and ensure that their privacy notice disclosures clearly and specifically reflect these practices.
Data Subject Requests and Verification
The CCPA differentiates between consumer rights requests that require identity verification and those that do not. Requests to opt out of the sale or sharing of personal information and requests to limit the use of sensitive personal information do not require verification. For requests to access, delete and correct personal information, the verification process must allow the business to confirm the consumer's identity to a reasonable degree of certainty — typically by matching at least two data points provided by the consumer. Regulators have emphasized that businesses must avoid collecting unnecessary additional personal information for verification purposes when consumers attempt to exercise their data subject rights.
CalPrivacy investigations have found CCPA violations where businesses required consumers to provide more information than necessary to verify their identity, or required verification for rights that do not require it. For example, in the Honda action, a violation was found where Honda required consumers to submit eight separate data points to verify their identity for access, deletion and even opt-out requests, exceeding what was necessary for identity verification. Similarly, in the Todd Snyder action, CalPrivacy found a violation because the company required consumers to upload a government-issued ID to submit data subject rights requests, even for rights requests that do not require verification.
Together, these actions demonstrate that businesses must calibrate identity verification procedures to the specific type of request and ensure that their systems for handling data subject requests are not collecting excessive or unnecessary personal information.
Data Broker Enforcement Under the Delete Act
Along with consumer-facing tools, CalPrivacy also kept busy in 2025 enforcing the data broker regulations under the California Delete Act, which applies to any business that collects and sells the personal information of consumers with whom it does not have a direct relationship. The Delete Act requires data brokers to register annually with CalPrivacy and disclose certain information about the personal information they collect and sell, as well as include those same disclosures in their privacy policies. Starting in 2026, data brokers must process statewide deletion requests through CalPrivacy's centralized Delete Request and Opt-Out Platform (DROP).
In early 2025, CalPrivacy announced multiple enforcement resolutions under the Delete Act, including orders and settlements with Key Marketing Advantage, LLC, National Public Data, Inc., Background Alert, Inc. and other data brokers that failed to register timely. Penalties ranged from $46,000 to $58,500 and included daily fines for late registration, payment of attorneys' fees and costs, and in one case, a requirement that the data broker shut down its operations through 2028 or face a $50,000 penalty.
These actions signal that data broker compliance is an active enforcement priority. For data brokers, the message is straightforward: confirm whether you qualify as a data broker, register on time and prepare now for the operational demands of DROP, including the need to honor large volumes of deletion and opt-out requests on a recurring basis.
The Takeaway
Together, these enforcement actions and trends demonstrate that California is moving from a check-the-box model of privacy compliance to a proof-of-performance model. Regulators are increasingly concerned with whether tools are accessible and effective from the consumer's perspective and whether technical implementations match the promises made in privacy notices and user interfaces. To comply, businesses should:
- Regularly test consent tools to confirm that CMPs, cookie banners, GPC recognition and other opt-out mechanisms function technically — not just visually — and that these signals are honored by third-party partners.
- Maintain symmetry of choice by ensuring that opting out is no more burdensome than opting in and by avoiding dark patterns that make opting in easier than opting out.
- Maintain accurate and compliant privacy notices for consumers, job applicants and employees, and update these notices at least annually to reflect current data practices and statutory requirements.
- Ensure data-sharing contracts with vendors include all CCPA-required provisions and that downstream partners are bound to appropriate restrictions on processing and secondary use.
- Implement right-sized identity verification for data subject requests to avoid over-verification while still protecting against fraud and unauthorized access.
- Monitor Delete Act obligations for any business that may qualify as a data broker, confirm registration where required, and ensure deletion workflows and request handling processes meet statutory requirements.
Cross-Border Data Transfers: New Obligations, Stable (For the Moment) Frameworks and Harmonizing Compliance
KEY TAKEAWAY: New U.S. rules restrict outbound transfers of sensitive personal data, while the EU-U.S. framework for inbound transfers remains intact — for now. Companies should map data flows, assess exposure under the Bulk Data Rule and prepare for shifting EU adequacy standards.
2025 saw developments that may either substantially change or stabilize privacy compliance programs for companies engaging in cross-border data transfers, depending largely on the directions of data flows, the types of data to be transferred and existing compliance programs.
For certain categories of personal data leaving the U.S., the Department of Justice (DOJ) finalized the Bulk Data Rule, a new national-security-driven regime that either prohibits or restricts the transfer of U.S. government data and "sensitive U.S. personal data" to "countries of concern." For data flowing into the U.S. from the European Economic Area, the European General Court's September decision in Latombe v. Commission has, for now, shored up the EU-U.S. Data Privacy Framework (EU-U.S. DPF) as a lawful mechanism for transatlantic data flows, but uncertainty remains as the case was appealed to the Court of Justice of the European Union (CJEU) at the end of October.
Together, these developments may alternately reshape or stabilize (at least temporarily) the risk calculus for companies operating in complex, global data ecosystems.
The Bulk Data Rule Complicates Transfers of Data Outside the U.S.
The Bulk Data Rule, codified at 28 C.F.R. Part 202, implements the Biden-era Executive Order 14117 "Preventing Access to Americans' Bulk Sensitive Personal Data and United States Government-Related Data by Countries of Concern." Effective as of April 8, 2025, the Rule prohibits or restricts U.S. entities from engaging in certain "transactions" that would grant access to either "any government-related data" or "bulk U.S. sensitive personal data" to "countries of concern" — specifically, China, Cuba, Iran, North Korea, Russia and Venezuela — and "covered persons," i.e., (a) a foreign entity that is 50% or more owned by one or more countries of concern (or by other covered persons) or that is organized, chartered or has its principal place of business in a country of concern; (b) persons who are "primarily" residents of a country of concern; (c) employees or contractors of a country of concern or other covered person; or (d) any person specified by the U.S. Attorney General (USAG). As of this writing, the USAG has not identified any specific individuals under (d).
The Rule's definition of "covered personal identifiers" is extraordinarily broad, covering, in essence, any combination of two or more fairly innocuous data points such as internet protocol (IP) addresses, contact information (including email addresses), cookie data and a number of other identifiers. Thus, even websites with modest traffic that use tracking cookies may find themselves covered by the Rule, provided the data is transferred to a country of concern or a covered person.
Aside from government-related data, the Rule applies only to transfers of "bulk" volumes (measured over the preceding 12 months) of certain categories of "U.S. sensitive personal data," including:
- 100+ U.S. persons' human genomic data;
- 1,000+ U.S. persons' biometric data;
- 1,000+ U.S. devices' precise geolocation data;
- 10,000+ U.S. persons' personal health data;
- 10,000+ U.S. persons' personal financial data; and
- 100,000+ U.S. persons' "covered personal identifiers."
Adding to this breadth, and unlike most data protection laws, data is not exempted or accorded any special treatment by virtue of being encrypted, pseudonymized or anonymized. It is therefore likely that a wide range of U.S. companies may be handling data subject to the Rule. Practically speaking, this only becomes a risk to the extent a company makes such data available to a country of concern or a covered person.
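For compliance teams building internal screening tools, the thresholds summarized above lend themselves to a simple lookup. The sketch below is a hypothetical illustration only — the category names and data structure are our own shorthand for the article's summary, not the regulation's text, and it is no substitute for legal analysis of 28 C.F.R. Part 202:

```python
# Hypothetical sketch: flag whether a trailing-12-month transfer volume meets
# the "bulk" thresholds as summarized in this article. Illustrative only;
# category labels are our own shorthand, not terms from the Rule itself.

BULK_THRESHOLDS = {
    "human_genomic_data": 100,            # U.S. persons
    "biometric_data": 1_000,              # U.S. persons
    "precise_geolocation": 1_000,         # U.S. devices
    "personal_health_data": 10_000,       # U.S. persons
    "personal_financial_data": 10_000,    # U.S. persons
    "covered_personal_identifiers": 100_000,  # U.S. persons
}

def meets_bulk_threshold(category: str, volume_last_12_months: int) -> bool:
    """Return True if the 12-month volume meets the article's 'bulk' figure."""
    threshold = BULK_THRESHOLDS.get(category)
    if threshold is None:
        raise ValueError(f"Unknown data category: {category}")
    return volume_last_12_months >= threshold

# Example: 9,500 U.S. persons' health records over the past year falls
# below the 10,000-person figure, so it would not be flagged as "bulk."
print(meets_bulk_threshold("personal_health_data", 9_500))  # False
```

Even where a volume falls below these figures, remember the Rule's separate, volume-independent treatment of government-related data, so a tool like this could only ever be one input into a broader screening process.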
The Rule flatly prohibits transfers in the context of "data brokerage," which is defined as the "sale of data, licensing of access to data, or similar commercial transactions ... where the recipient did not collect or process the data directly from the individuals linked or linkable to the collected or processed data." Outside of data brokerage, the Rule restricts, but does not prohibit, transfers in the context of vendor agreements, employment agreements and investment agreements. These "restricted transactions" are permitted, subject to the implementation of certain security measures established by the Cybersecurity and Infrastructure Security Agency. There are a number of narrow exemptions to the most restrictive obligations under the Rule. For example, intra-company transfers that are "ordinarily incident to and part of administrative or ancillary business operations" such as for HR and payroll, may be exempt. Certain transfers of de-identified or pseudonymized data "necessary to obtain or maintain" approvals to market drugs, biological products or medical devices outside of the U.S. may be exempt. FDA-regulated clinical investigation and post-marketing surveillance data may also be exempt in certain circumstances. There is also an exemption for transactions that are "ordinarily incident to and part of the provision of" specific financial services. The Rule provides detailed examples of where these exemptions do — and importantly, do not — apply, and careful analysis is required before concluding that a transfer is exempt from certain obligations, as businesses may be subject to detailed recordkeeping requirements even where an exemption applies.
For U.S. companies, the Bulk Data Rule effectively layers a national-security export-control style regime on top of traditional privacy and cybersecurity laws. Cross-border deals involving cloud hosting, analytics, outsourcing, clinical research, ad-tech or data brokering now need to be screened not only for sanctions and CFIUS risk but also for DOJ bulk-data exposure, particularly where counterparties, infrastructure or subcontractors are linked to China or other countries of concern.
In practice, U.S. companies will want to develop (or supplement existing) detailed data-flow maps and inventories, revisit vendor data processing agreements and align internal data minimization strategies with the Rule's thresholds. Additionally, sellers in the M&A context will need to perform diligence into buyers to ensure that any deals do not run afoul of the Rule and obtain relevant representations, warranties and covenants.
The EU-U.S. Data Privacy Framework is Safe ... For Now
Across the Atlantic, the Latombe decision pulls in the opposite direction: toward stabilizing cross-border transfers of personal data, at least for those U.S. companies self-certifying to the EU-U.S. DPF. In essence, the EU-U.S. DPF allows U.S. companies to self-certify with the U.S. Department of Commerce that they will accord certain protections to EEA personal data and abide by specific dispute resolution procedures. U.S. companies may additionally participate in parallel frameworks for transfers from the United Kingdom and Switzerland. On July 17, 2023, the European Commission issued an adequacy decision validating the EU-U.S. DPF as a lawful mechanism for transferring personal data to EU-U.S. DPF participants without the need for additional safeguards such as binding corporate rules or standard contractual clauses (SCCs). In practice, the SCCs are used frequently where the U.S. company importing the personal data is not an EU-U.S. DPF participant, but the CJEU effectively requires EEA data exporters to conduct detailed, and sometimes burdensome, transfer impact assessments to determine whether the personal data will receive essentially equivalent protections under the importing country's laws.
Notably, two previous similar frameworks — the Safe Harbor Framework and EU-U.S. Privacy Shield Framework — were challenged and subsequently invalidated by the CJEU in the Schrems I (2015) and Schrems II (2020) decisions, respectively. The EU-U.S. DPF presents a third bite at the apple, but Latombe may upset that effort in the long run. In September 2023, Philippe Latombe, a member of the French National Assembly, brought an action before the General Court seeking the annulment of the European Commission's EU-U.S. DPF adequacy decision, arguing that:
- The Data Protection Review Court (DPRC) — a key component of the dispute redress mechanism offered to EEA data subjects — is not independent; and
- U.S. intelligence collection of EEA personal data is not compatible with an adequacy designation because such collection does not require prior authorization from a court or other independent authority.
However, on Sept. 3, 2025, the General Court dismissed the action, holding that, at the time the EU-U.S. DPF was adopted, U.S. law — particularly Executive Order 14086 (Enhancing Safeguards for United States Signals Intelligence Activities) and the creation of the DPRC — ensured a level of protection for EEA personal data "essentially equivalent" to that in the EEA. The General Court found the DPRC sufficiently independent and effective and accepted that U.S. "bulk" signals intelligence collection could be compatible with EU law where subject to necessity, proportionality and ex post oversight.
For organizations relying on the DPF or using it as a positive factor in transfer impact assessments for SCCs, this was a major win: it avoids an immediate "Schrems III" style cliff edge. However, legal certainty may be fleeting. The Latombe decision has been appealed to the CJEU (Case C-703/25 P) which, as explained above, struck down similar predecessor frameworks after conducting its own assessment of U.S. surveillance law and redress mechanisms in place at the time.
We expect the CJEU, on appeal, to look more critically at issues the General Court treated as sufficiently addressed: the scope and oversight of bulk collection, the real-world independence and transparency of the DPRC and the durability of protections under shifting U.S. executive administrations. The appeal ensures that the DPF — and by extension many EU-U.S. data flows — will remain under judicial scrutiny in 2026 and for the foreseeable future.
Cross-Border Transfers in 2026 and Beyond
Stepping back, the Bulk Data Rule and the EU-U.S. DPF (in light of Latombe and the CJEU appeal) are tightly interlinked for cross-border privacy compliance strategy. The EU-U.S. DPF adequacy assessment relies on the robustness of U.S. safeguards around surveillance and government access; at the same time, the U.S. is building a parallel regime that restricts bulk exports of sensitive U.S. data to certain foreign jurisdictions for national security reasons. From an operational standpoint, companies must now navigate a world in which:
- EU law broadly permits transfers to certified U.S. organizations under the DPF (subject to the outcome of Latombe at the CJEU) while
- U.S. law may restrict outbound data flows in the opposite direction where data could be accessed by countries of concern or their proxies.
For U.S. companies managing cross-border data flows, the practical playbook for 2026 is reasonably clear: treat these developments as complementary constraints rather than isolated issues. On the U.S. side, build a Bulk Data Rule compliance program that inventories bulk-sensitive datasets, identifies any touchpoints with countries of concern (including through vendors and infrastructure) and embeds DOJ screenings into procurement, M&A and collaboration workflows.
On the EEA side, continue to make pragmatic use of the EU-U.S. DPF where available, but keep SCCs and other fallback mechanisms in good order, including by conducting transfer impact assessments reflecting current U.S. safeguards while explicitly flagging the pending CJEU appeal. In other words: design data flows that can survive both a more aggressive DOJ enforcement posture and a possible CJEU course correction — because cross-border privacy law is no longer just about compliance today, but resilience to geopolitical and judicial swings tomorrow.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.