ARTICLE
13 December 2024

California's Privacy Regulator Had A Busy November, Automated Decisionmaking Edition: What Does It Mean For Businesses?

Sheppard Mullin Richter & Hampton
Sheppard Mullin is a full service Global 100 firm with over 1,000 attorneys in 16 offices located in the United States, Europe and Asia. Since 1927, companies have turned to Sheppard Mullin to handle corporate and technology matters, high stakes litigation and complex financial transactions. In the US, the firm’s clients include more than half of the Fortune 100.

In the second installment of our series on new CCPA regulations from California, we look at proposed rules for the use of automated decisionmaking technology. As a reminder, the CCPA discusses these technologies in relation to profiling, namely "any form of automated processing of personal information" to analyze or predict people's work performance, health, and personal preferences, among other things.

The law called on the California Privacy Protection Agency (CPPA) to promulgate rules giving consumers the ability to opt out of the use of these technologies and to access information about how the tools are used when making decisions about them. The first set of proposed rules was met with some concern, some of which has been addressed in this newest version. Highlights of the changes are below:

  • Narrowing the definition of "automated decisionmaking technology": The law does not define this term, and in 2023 the agency had proposed that it broadly mean any system that "in whole or in part" facilitates human decisionmaking. The term has now been narrowed to technology that either replaces humans or substantially facilitates their decisionmaking, meaning that it is a "key factor" in the human's decision. The rule gives an example: using a tool's score as the primary factor in making a significant decision about someone.
  • Automated decisionmaking and risk assessments: As part of the new rules for risk assessments, the agency has included specific provisions on profiling. First, companies would need to conduct risk assessments themselves. Second, the proposed rule imposes obligations on entities that make automated decisionmaking or AI technologies available to others if those technologies are trained on personal information. In those cases, the company would need to give the other entities the information they need to conduct their own risk assessments, and that information would need to be provided in "plain language."
  • Automated decisionmaking that results in a "significant decision": If a "significant decision" will be made, the rules contemplate a "pre-use" notice. This was also contemplated in the 2023 version of the rules; however, in that version, the obligation arose if there was a "legal or similarly significant" impact (the language of the CCPA). Under the proposed rules, the agency instead discusses "significant decisions" impacting an individual, giving examples that include education and employment opportunities. Also covered are extensive profiling and training automated decisionmaking technology that might, among other things, identify someone or make a significant decision about them.
  • Changes to company privacy policies: The rule as revised would require companies to state in the privacy policy (in the rights section) that an individual can opt out of having their information used by automated decisionmaking that results in a "significant decision." The policy would also need to explain how someone can access information about the business's use of automated decisionmaking.

Putting It Into Practice: The California privacy agency has addressed some of the concerns raised in the initial automated decisionmaking rules. However, the obligations continue to be expansive, and may impact many organizations' uses of AI tools, especially in the HR space. That said, the obligations outlined in the rule should look familiar to those who already fall under NYC's AI law.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.

