Data Protection Impact Assessments: Are You Ready?

Squire Patton Boggs LLP



This year has widened the landscape of consumer privacy protections, with dozens of comprehensive privacy bills moving through state legislatures and becoming enacted. So far in 2023, Iowa's Act Relating to Consumer Data Protection ("Iowa Privacy Law") and Indiana's Consumer Data Protection Act ("ICDPA") have been signed into law. These two laws join the Virginia Consumer Data Protection Act ("VCDPA"), the California Privacy Rights Act ("CPRA"), the Colorado Privacy Act ("CPA"), Connecticut's Public Act No. 22-15 ("CTPA"), and the Utah Consumer Privacy Act ("UCPA") in the framework of comprehensive state consumer privacy laws. The Iowa Privacy Law becomes effective on January 1, 2025, and the ICDPA becomes effective on January 1, 2026. The VCDPA and CPRA (amending the California Consumer Privacy Act, or "CCPA") went into effect on January 1, 2023, while the CPA and CTPA go into effect on July 1, 2023. The UCPA goes into effect on December 31, 2023.

In addition, a few states are on the cusp of enacting their own comprehensive consumer privacy laws, including Montana, Tennessee, and Florida. Privacy World reported on bills that are moving through state legislatures here, although there have been several developments since then.

A common thread among several comprehensive state privacy laws is the requirement to conduct and document a data protection assessment (commonly known as a "Data Protection Impact Assessment," or "DPIA") in various circumstances. We previously discussed the DPIA requirements under the VCDPA, CTPA, CCPA, and CPA here. In addition to these four state laws, the ICDPA also requires DPIAs, with requirements that mirror those of the CTPA and VCDPA. So far, only Colorado has promulgated detailed requirements on what DPIAs must include, though the California Privacy Protection Agency has on its May 15, 2023 meeting agenda a report on the status of its DPIA rulemaking. Other privacy laws may also apply and require DPIAs, including the California Age-Appropriate Design Code Act ("CAADCA") and New York City's Local Law 144 ("Local Law 144"). We discuss these below.

California Age-Appropriate Design Code Act

Under the CAADCA, businesses that provide an online service, product, or feature ("online service") likely to be accessed by children (defined as consumers under 18 years of age) must complete a DPIA before offering that online service to the public, beginning on July 1, 2024. The DPIA must identify the purpose of the online service, how it uses children's personal information, and the risks of material detriment to children that arise from the business's data management practices. Specifically, the DPIA must address:

  • Whether the design of the online service could harm children, including by exposing children to harmful, or potentially harmful, content on the online service;
  • Whether the design of the online service could lead to children experiencing or being targeted by harmful, or potentially harmful, contacts on the online service;
  • Whether the design of the online service could permit children to witness, participate in, or be subject to harmful, or potentially harmful, conduct on the online service;
  • Whether the design of the online service could allow children to be party to or exploited by a harmful, or potentially harmful, contact on the online service;
  • Whether algorithms used by the online service could harm children;
  • Whether targeted advertising systems used by the online service could harm children;
  • Whether and how the online service uses system design features to increase, sustain, or extend use of the online service by children, including the automatic playing of media, rewards for time spent, and notifications; and
  • Whether, how, and for what purpose the online service collects or processes sensitive personal information of children.

Businesses must create a timed plan to mitigate or eliminate the identified risks before children access the online service. The California Attorney General can make a written request for a list of all DPIAs the business has completed or for specific DPIAs, and the business must comply within three and five business days, respectively. This chart identifies the basic requirements for DPIAs under the VCDPA, CPA, CTPA, and CPRA as compared to the requirements under the CAADCA.

Algorithmic Discrimination Laws

2023 has also seen a rise in bills and laws addressing algorithmic discrimination. For example, New York City's Local Law 144 addresses the use of Automated Employment Decision Tools ("AEDTs"), which are used to screen and score job applicants and employees applying for promotions. Local Law 144 requires employers and employment agencies to ensure that an AEDT has been the subject of a bias audit conducted no more than one year before its use.

  • For AEDTs that select or classify individuals into groups, the bias audit must calculate the selection rate for each category (defined as any Component 1 category required to be reported by employers pursuant to 42 U.S.C. § 2000e-8, including sex and race/ethnicity, for each occupational position) and the impact ratio for each category.
  • For AEDTs that score applicants or candidates, the bias audit must similarly calculate the average score for individuals in each category and the impact ratio for each category.
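To make the arithmetic behind the selection-type audit concrete, the sketch below computes a selection rate for each category (number selected divided by number of applicants in that category) and an impact ratio (each category's selection rate divided by the highest category selection rate). This is a simplified illustration, not a compliant bias audit; the data, category labels, and function name are hypothetical, and the final rule's definitions and formulas control.

```python
def impact_ratios(selected_by_category, total_by_category):
    """Illustrative only: selection rates and impact ratios per category.

    selection rate = number selected / number of applicants in the category
    impact ratio   = category selection rate / highest selection rate
    """
    rates = {
        cat: selected_by_category.get(cat, 0) / total
        for cat, total in total_by_category.items()
    }
    top = max(rates.values())
    return {cat: (rate, rate / top) for cat, rate in rates.items()}


# Hypothetical applicant pool: applicants and AEDT selections per category.
totals = {"Male": 100, "Female": 120}
selected = {"Male": 40, "Female": 36}
for cat, (rate, ratio) in impact_ratios(selected, totals).items():
    # e.g. Male: rate 0.40, ratio 1.00; Female: rate 0.30, ratio 0.75
    print(f"{cat}: selection rate {rate:.2f}, impact ratio {ratio:.2f}")
```

In practice, an impact ratio well below 1.00 for a category is what the audit is designed to surface; the published audit must report these figures for each required category.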

Results of bias audits must be publicly available on the employer's or employment agency's website. Local Law 144 will likely be enforced beginning on July 5, 2023. The final rule, found here, helpfully provides charts and formulas to reference when conducting bias audits.

There are also bills pending in legislatures intended to regulate algorithmic decision-making, including by requiring assessments or audits. Two such bills are California's AB 331, which would generally regulate Automated Decision Tools, and Washington, D.C.'s Stop Discrimination by Algorithms Act, which would prohibit making algorithmic eligibility determinations on the basis of race, color, religion, national origin, sex, gender identity, sexual orientation, familial status, source of income, or disability in a manner that discriminates against an individual or class of individuals.

The requirements for the content of data protection assessments can be extensive, and new laws are being passed quickly. We discuss them in detail in our Data Protection Assessment Tool Kit, which includes data protection assessment templates and is available to clients for a fixed fee. The templates are designed to be configured on privacy management software platforms such as OneTrust, or can be used via project management or ticketing systems. Privacy World will continue to cover developments. For more information, contact your relationship partner at SPB.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
