ARTICLE
5 August 2025

California's New Privacy And Cybersecurity Regulations On Risk Assessments, Automated Decision Making And Cybersecurity Audits: What Businesses Need To Know

Goodwin Procter LLP

During a Board Meeting on July 24, 2025, the California Privacy Protection Agency (CPPA) unanimously approved the long-awaited final text of its second rulemaking package, implementing a broad swath of new requirements regarding risk assessments, automated decisionmaking technology (ADMT), and cybersecurity audits. The regulations, issued under the California Consumer Privacy Act (CCPA), also amend various provisions of the initial CCPA regulations. While the regulations do not use – and, in fact, removed from previous drafts – the words "artificial intelligence," they very much affect AI through the risk assessment and ADMT rules, and they require companies to enhance their data privacy and cybersecurity programs, including by undergoing an annual, evidence-based cybersecurity audit.

The regulations adopt rigorous privacy and cybersecurity standards that will expand the scope of requirements for most businesses subject to the CCPA and are likely to become the benchmark for U.S. privacy and cybersecurity compliance. Nonetheless, they step back considerably from earlier drafts, which Goodwin has covered in depth here. Key changes from prior drafts, such as narrowed applicability thresholds, increased flexibility in reporting timelines, and a reduced burden on AI and algorithmic decisionmaking, echo the new regulatory tone in Washington that has reprioritized technological innovation.

California's Office of Administrative Law (OAL) now has 30 working days (plus an optional 60-calendar-day extension) to complete its review. Barring any challenges or pushback from the OAL – which did not impede previous CCPA rulemakings – the regulations will become effective January 1, 2026, with certain sections coming into force in 2027, as noted below.

This Alert analyzes three key aspects of the regulations: (1) risk assessment requirements for certain practices involving personal information; (2) notice, opt-out, and access rights for ADMT; and (3) mandated cybersecurity audit procedures applicable to certain businesses.

Mandatory Risk Assessments for High-Risk Data Processing: New Privacy Compliance Requirement

The regulations significantly expand privacy compliance obligations by embedding proactive risk management into the core of data processing activities. Businesses must now evaluate their data practices through the lens of privacy risk – including by developing internal protocols for identifying covered activities, conducting structured risk assessments, and maintaining documentation to demonstrate compliance – before initiating certain operations. Accordingly, the new risk assessment requirements signal a shift from reactive to preventative privacy governance, with legal exposure for businesses that fail to perform the required assessments before engaging in high-risk data processing. And while the regulations require businesses to conduct risk assessments when processing personal information in ways that regulators have determined present heightened risks to consumer privacy, even commonplace practices, such as targeting online ads to promote a service (e.g., on social media and search engines), likely will trigger risk assessment requirements.

When Is a Risk Assessment Required?

Under §7150(a) of the regulations, any business engaged in processing activities that involve significant privacy risks must conduct a formal risk assessment before initiating that processing. Removing ambiguity about which activities trigger this obligation, the regulations explicitly define what constitutes "significant risk" as:

  • "Selling" or "sharing" personal information, including any disclosures of personal information for targeted advertising purposes. Given the broad reading of the terms "sell" or "share" by regulators, a large number of businesses are likely to fall under the risk assessment requirement based on this regulatory hook.
  • Processing sensitive personal information, such as information revealing certain government identifiers, financial account credentials or card details, precise geolocation, health or genetic data, racial or ethnic origin, citizenship, or the contents of private communications. However, the collection of this information only in the employment context does not require a risk assessment where the data is used only for standard employment purposes.
  • Using ADMT for a "significant decision" involving a consumer (discussed further in the next section of this Alert below).
  • Using automated processing to analyze a consumer's intelligence, ability, aptitude, performance at work, economic situation, health, location, preferences, interests or other similar characteristics, based upon a systematic observation of that consumer when they are acting as an educational program applicant, job applicant, student, employee, or independent contractor, or on that consumer's presence in a sensitive location, such as a hospital, religious facility, or political party office.
  • Using personal information of consumers to train an ADMT that is intended to be used for a "significant decision" concerning a consumer, or a facial-recognition, emotion-recognition, or other technology designed to identify or profile consumers (an illustrative intake-screening sketch covering these triggers follows this list).
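
For illustration only, the following sketch (in Python) shows how a compliance team might translate the triggers listed above into a first-pass intake checklist for new processing activities. The attribute and function names are hypothetical, not drawn from the regulations, and any screening result should be confirmed against the regulatory text.

    # Illustrative sketch only: a hypothetical intake screen for activities that
    # may require a CCPA risk assessment. Attribute names are assumptions, not
    # regulatory definitions.
    from dataclasses import dataclass

    @dataclass
    class ProcessingActivity:
        sells_or_shares_pi: bool = False                # includes disclosures for targeted advertising
        processes_sensitive_pi: bool = False            # e.g., precise geolocation, health or genetic data
        employment_context_only: bool = False           # sensitive PI used only for standard employment purposes
        admt_for_significant_decision: bool = False
        systematic_observation_profiling: bool = False  # e.g., of employees, students, or applicants
        trains_covered_admt_or_recognition_tech: bool = False

    def may_require_risk_assessment(activity: ProcessingActivity) -> bool:
        """Rough first-pass screen; counsel should confirm against the regulations."""
        if activity.processes_sensitive_pi and not activity.employment_context_only:
            return True
        return any([
            activity.sells_or_shares_pi,
            activity.admt_for_significant_decision,
            activity.systematic_observation_profiling,
            activity.trains_covered_admt_or_recognition_tech,
        ])

    # Example: an ad campaign that shares personal information with an ad platform
    print(may_require_risk_assessment(ProcessingActivity(sells_or_shares_pi=True)))  # True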

As discussed above, because of broad definitions of "selling" and "sharing" – which often encompass standard online advertising practices – most businesses that are regulated by the CCPA likely will be subject to these risk assessment requirements. In addition, not only does use of personal information to make a significant automated decision require a risk assessment, but a business also must conduct a risk assessment under the regulations if it uses personal information to train certain AI systems, even if those systems are never deployed to evaluate the individuals whose data is used for training purposes.

Content of a Risk Assessment

The regulations contain detailed requirements regarding the content of risk assessments and the procedures for conducting them. Specifically, risk assessments must:

  • Explain the activity and its purpose in granular detail, including: the specific categories of personal information required (and, in support of data minimization requirements, the minimum categories of personal information necessary to achieve the intended purpose); the method of collecting, using, and sharing personal information; the intended retention period for each category of personal information; and all transparency disclosures provided to consumers;
  • Identify the purported benefits and potential negative impacts of the proposed activity. Negative impacts may include unauthorized access to or use of personal information, discrimination, loss of control over personal information, coercion, or economic, physical, psychological or reputational harms; and
  • Describe the safeguards that will be implemented to manage potential negative impacts (e.g., technical safeguards, policies or procedures, notifications or consents, etc.).

Businesses that develop technologies (including AI) used for ADMT and sell them to other businesses must provide the businesses using their technologies with all information necessary to conduct a risk assessment.

In a significant change from previous drafts of the regulations, the CPPA removed the prohibition on processing activities where the privacy risks outweigh the benefits. In its place, the final regulations include a softer formulation, stating instead that "the goal of a risk assessment is restricting or prohibiting" processing activities with disproportionate risks.

Procedural Requirements

In addition to specific content requirements, the regulations mandate specific procedures for conducting risk assessments, including:

  • Requirements to involve relevant internal stakeholders (e.g., employees who participate in the activity) in conducting the assessment, and allowing (or perhaps, suggesting) the involvement of relevant external stakeholders, such as affected consumers;
  • Requiring businesses to conduct risk assessments before initiating the activity and to review each risk assessment at least every three years or, in the event of a material change in the activity, as soon as "feasibly" possible, but no later than 45 days from the date of the change (see the illustrative date calculation after this list);
  • Designating a qualified individual (e.g., a business's highest-ranking executive) to certify the completion of risk assessments; and
  • Submitting to the CPPA, once per year beginning April 1, 2028, certain information about risk assessments that the business has conducted, including the number of assessments, categories of personal information involved, and a written certification. The CPPA may also, at any time, request copies of any risk assessment a business has conducted.
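
Because the review cadence combines a periodic deadline with an event-driven one, many teams will want to track it programmatically. Below is a minimal sketch, assuming the three-year cadence and 45-day material-change window described in the list above; it is illustrative only, and edge cases (such as leap-year dates) should be handled under the business's own calendaring conventions.

    # Illustrative sketch only: computing the next required review date for a
    # risk assessment under the cadence described in this Alert.
    from datetime import date, timedelta
    from typing import Optional

    def next_review_due(last_reviewed: date, material_change: Optional[date] = None) -> date:
        # Approximates "at least every three years" as three calendar years from the last review.
        periodic_deadline = last_reviewed.replace(year=last_reviewed.year + 3)
        if material_change is not None:
            # A material change requires review no later than 45 days from the change.
            return min(periodic_deadline, material_change + timedelta(days=45))
        return periodic_deadline

    # Example: assessment reviewed January 1, 2026, with a material change on June 1, 2026
    print(next_review_due(date(2026, 1, 1), material_change=date(2026, 6, 1)))  # 2026-07-16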

If approved by the OAL, the risk assessment provisions of the regulations will come into effect on January 1, 2026, for any new activities initiated after that date. Businesses will have a grace period until December 31, 2027, to complete risk assessments for activities already underway before the effective date.

ADMT Rules Force Businesses to Open the Black Box

The regulations establish new consumer protections for any use of ADMT to make a "significant decision" regarding a California resident. Beginning on January 1, 2027, businesses that use ADMT will need to provide robust transparency – both before and after an automated decision is made – and, in some cases, permit consumers to opt out by offering alternative decisionmaking channels or access to human review.

What is a Significant Decision?

The regulations define ADMT as "any technology that processes personal information and uses computation to replace human decisionmaking or substantially replace human decisionmaking." This definition aligns with the EU's General Data Protection Regulation (GDPR) and covers decisions made "without human involvement" – an important scaling back from prior drafts of the regulations, which would have extended to AI-assisted decisions, even if not fully automated.

Moreover, the regulations' ADMT requirements apply only to "significant decisions," which are decisions "that result in the provision or denial of financial or lending services, housing, education enrollment or opportunities, employment or independent contracting opportunities or compensation, or healthcare services." Significant decisions include "profiling" (i.e., automated evaluation of individual characteristics to analyze or predict certain individual characteristics, such as work performance, health, economic situation, behavior, etc.), but importantly, do not include "advertising to a consumer."

Pre-Use Notice – Wait, Another CCPA Notice?

Businesses that use ADMT for significant decisions will need to provide consumers a "pre-use notice" that describes, among other elements, the specific purpose for using ADMT, the consumer's rights to access and opt out or appeal the decision, and how the ADMT works to make a significant decision, such as the types of outputs the system generates and how the business uses those outputs.

The description needs to be provided in plain language and must include granular details about the operation of the system, such as the categories of personal information that affect the ADMT's output. At the same time, the rules permit businesses to withhold protected trade secrets and any information that would compromise the business's ability to protect security and integrity, resist malicious, deceptive, or illegal actions, or ensure the safety of individuals.

Pre-use notices can be standalone notices or they can be rolled into a business's existing "notice at collection," which is already required under the CCPA. Businesses can also consolidate information about multiple decisionmaking systems into the same pre-use notice provided that the notice gives all the required information for each system.

Critically, the notice must be "presented prominently and conspicuously at or before the point when the business collects the consumer's personal information that the business plans to process using ADMT." And, if a business will re-use personal information initially collected for a different purpose, the business must provide the pre-use notice before using ADMT.

ADMT Opt Outs and Appeals

The regulations grant consumers the right to opt out of the use of ADMT for significant decisions. The stated purpose of the opt-out right is to permit consumers to bypass automated decisions altogether and access alternative decisionmaking systems. For this reason, the rules require businesses to describe in their pre-use notices the alternative process consumers can undergo – instead of ADMT – if they decide to opt out. Businesses must also make available two or more designated methods to submit opt-out requests at or before any significant decision is made.

However, several exceptions to these requirements appear designed to motivate businesses to adopt safeguards for their use of ADMT rather than offering alternative decisionmaking channels. For example:

  • The regulations exempt businesses from pre-ADMT opt-out requirements if they instead permit consumers to appeal and obtain human review after a decision has been taken. To use this exception, the human reviewer must have authority to overturn the automated decision and must know how to interpret and use any outputs of the ADMT.
  • The regulations also exempt certain classes of decisions from opt-out requirements provided certain safeguards are in place. For example, certain decisions affecting admission, acceptance, hiring, allocation of work, and compensation are exempt from opt-out requirements as long as the business uses the system only for exempted purposes and "the ADMT works for the business's purpose and does not unlawfully discriminate based upon protected characteristics." These safeguards likely will depend on the outcome of a risk assessment concluding that a particular ADMT is fit for purpose and not biased or discriminatory.

Businesses may not require consumers to verify their identities in order to submit opt-out requests, unless the business has documented reasons to suspect a request is fraudulent. The regulations do not address whether a business can or must verify consumers' identities when offering a right to appeal instead of an opt-out.

A New Right to Explanations?

In addition to notice and opt-out/appeal rights, the regulations permit consumers to request specific information concerning the use of ADMT to make decisions affecting them. While some of the information consumers can access mirrors the disclosure requirements of the pre-use notice – such as information about the purpose for which the ADMT was used – other elements of this access right likely will require more specific and detailed information.

Among other things, the regulations require businesses to provide information about the logic of the ADMT in a manner that permits "a consumer to understand how the ADMT processed their personal information to generate an output with respect to them, which may include the parameters that generated the output as well as the specific output" (emphasis added). The business must also explain, on request, how any outputs of the ADMT were used by the business, including the roles of any humans in the loop.

These provisions are likely to present a significant challenge for businesses that use complex or black-box algorithms that do not rely on pre-programmed logic. Businesses will need to design thoughtful processes – with robust collaboration between technical and legal teams – for providing meaningful information about the decisions they make using ADMT, informing consumers not only about the logic generally underlying an ADMT but also of the reasons it reached a certain decision about them.

Unlike the regulations' opt-out requirements, the right of access is subject to the CCPA's standard verification procedures. Accordingly, businesses must confirm consumers' identities with a "reasonably high degree of certainty" before granting access.

Cybersecurity Audits

The cybersecurity audit regulations require designated businesses to conduct comprehensive annual audits of their cybersecurity program. Critically, the regulations are sector-agnostic – a substantial deviation from most US cybersecurity frameworks, which typically apply only to certain regulated sectors, such as financial services and healthcare. Accordingly, businesses in sectors that in the past were not subject to mandatory cybersecurity requirements – including many technology and life sciences companies – will need to review, and in many cases, significantly expand, their cybersecurity programs.

Scope of Application

The cybersecurity audit requirements in the regulations apply only to a subset of businesses regulated by the CCPA that, in the prior calendar year:

  • had annual gross revenues above the specified monetary threshold, adjusted for inflation (currently set at $26.625 million) and either (a) processed personal information of at least 250,000 consumers or households, or (b) processed sensitive personal information of 50,000 or more consumers; or
  • derived at least 50% of annual revenue from selling or sharing personal information of California residents (a simplified applicability check is sketched after this list).
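
To make the thresholds above concrete, here is a minimal, illustrative applicability check in Python. The function and parameter names are hypothetical, the dollar figure is the inflation-adjusted threshold cited in this Alert, and any scoping conclusion should be confirmed against the regulations and the CCPA's definitions.

    # Illustrative sketch only: a rough screen for whether a business may be subject
    # to the annual cybersecurity audit requirement, per the thresholds summarized above.
    REVENUE_THRESHOLD = 26_625_000  # current gross-revenue threshold cited above; adjusted for inflation

    def may_require_cybersecurity_audit(
        annual_gross_revenue: float,
        consumers_or_households_pi: int,        # consumers or households whose personal information was processed
        consumers_sensitive_pi: int,            # consumers whose sensitive personal information was processed
        revenue_share_from_selling_or_sharing: float,  # fraction of annual revenue, 0.0 to 1.0
    ) -> bool:
        """First-pass scoping check; counsel should confirm against the regulations."""
        volume_prong = annual_gross_revenue > REVENUE_THRESHOLD and (
            consumers_or_households_pi >= 250_000 or consumers_sensitive_pi >= 50_000
        )
        revenue_prong = revenue_share_from_selling_or_sharing >= 0.5
        return volume_prong or revenue_prong

    # Example: a $40M-revenue business processing personal information of 300,000 consumers
    print(may_require_cybersecurity_audit(40_000_000, 300_000, 10_000, 0.10))  # True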

Audit Requirements

The regulations include a prescriptive set of requirements for the audits, which must assess every applicable component of the business's cybersecurity program. Covered businesses must evaluate both technical and non-technical safeguards and describe how those safeguards mitigate specific risks. In particular, the regulations specify a list of 18 cyber controls that the audits must assess, thereby providing a glimpse into how California regulators interpret the ever-elusive concept of "reasonable security."

These controls include, if applicable, the company's approach to authentication (including phishing-resistant MFA), password policies, encryption at rest/in transit, zero-trust architecture, account/access management including privileged access controls, inventory and classification of personal data and systems, monitoring/logging, secure configuration, vulnerability management/patching, incident response and breach procedures, disaster recovery/business continuity, penetration testing and red-team exercises, employee training and awareness, third-party/service provider oversight, change management, secure software development including code review and testing, multilayered network/security controls, data minimization and retention limits, and threat intelligence and current threat awareness capabilities.

The audit must also evaluate how the covered business implements and enforces compliance with the foregoing requirements. In addition, the audit must document any incidents that triggered notification requirements in California in the previous year.

Importantly, a business that already undergoes a cybersecurity audit or assessment for other purposes – for example, under the requirements of New York's Department of Financial Services (NYDFS) or against the NIST Cybersecurity Framework 2.0 – may leverage that audit or assessment to demonstrate compliance with the audit requirement under the law, provided that it meets the regulations' requirements.

Audit Reports, Conduct, and Certification

The audit or assessment must be conducted by impartial "qualified internal or external personnel" with "knowledge of cybersecurity and how to audit a business's cybersecurity program." When using internal auditors, the regulations require the business to establish procedures governing the auditor's reporting chain and performance reviews to maintain the auditor's independence.

The audit report must document specific gaps and weaknesses in applicable policies, procedures, and safeguards, and explain how the audit or assessment's findings are or will be addressed, along with the timeframes for such actions. The audit's identification of compliance and/or gaps in controls must be based on, and include, supporting evidence and documentation. The results of the audit must be provided to a member of the business's executive management team with direct responsibility for its cybersecurity program.

The regulations also require businesses to maintain mandatory records and certify compliance to the CPPA. Specifically:

  • Record-keeping: Covered businesses must document their audits or assessments in writing each year that the audit is required and retain records for at least five years following the completion of each audit.
  • Certification: Each year that the requirement applies, businesses must submit to the CPPA a certification that the business completed the required audit. The certification must be made by a member of the business's executive management team who is directly responsible for the business's cybersecurity audit compliance and who has sufficient knowledge of the audit to provide accurate information. The certification does not include a copy of the audit report.

Compliance Timeline

The timeline for completing cybersecurity audits was also revised in the latest approved draft. The regulations push out the deadline for a business to complete its first cybersecurity audit report to April 1, 2028, if its 2026 gross revenue exceeded $100 million; April 1, 2029, if its gross revenue was between $50 million and $100 million; and April 1, 2030, if its gross revenue was under $50 million. The regulations state that failure to conduct or document audits as required may be considered a separate violation under the CCPA.
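
For planning purposes, the phased deadlines can be expressed as a simple lookup keyed to 2026 gross revenue, as described above. This is an illustrative sketch only; the treatment of revenue falling exactly at the $50 million and $100 million boundaries is an assumption and should be confirmed against the final regulatory text.

    # Illustrative sketch only: first cybersecurity audit report deadline by 2026 gross revenue.
    from datetime import date

    def first_audit_deadline(gross_revenue_2026: float) -> date:
        if gross_revenue_2026 > 100_000_000:
            return date(2028, 4, 1)
        if gross_revenue_2026 >= 50_000_000:   # boundary treatment is an assumption
            return date(2029, 4, 1)
        return date(2030, 4, 1)

    print(first_audit_deadline(75_000_000))  # 2029-04-01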

What Should Businesses Do Now?

To prepare for the regulations, businesses should take a number of steps to make sure they are ready once these new requirements come into force.

  1. Risk assessments: Identify any activities that present a "significant risk" under the regulations, such as engaging in targeted advertising or otherwise "selling" personal information, processing sensitive personal information, making consequential automated decisions about consumers, or profiling consumers in sensitive contexts (e.g., aptitude, work performance, or education) or for biometric identification. Businesses may also start developing risk assessment procedures to ensure that new activities are surfaced to appropriate teams for review. The compliance deadline for this requirement is December 31, 2027.
  2. ADMT: Identify any activities that involve making "significant decisions" using ADMT, such as decisions affecting access to financial or lending services, housing, education enrollment or opportunities, employment or independent contracting opportunities or compensation, or healthcare services. Businesses that use ADMT should start collecting information about how their systems operate and, if the systems are operated by vendors, work with vendors to obtain information about the logic their systems use to generate outputs and how they manage risks of error, bias and discrimination. The compliance deadline for this requirement is January 1, 2027.
  3. Cybersecurity audits: Begin identifying, and where necessary, developing and/or updating cybersecurity policies and procedures to address the prescriptive new requirements set out in the regulations. Businesses outside of highly regulated sectors are likely to find that their current policies – as well as their technical and operational security controls – will need significant upgrades to meet these new standards. The compliance deadline for initial audits ranges from April 1, 2028, to April 1, 2030, depending on the size of the business.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
