As Halloween night approaches, the neighborhood air buzzes with anticipation. Ghouls and monsters flood the streets, crunching over leaves while pillowcases fill up with sweets. One homeowner has divided treats into three distinct bowls. As each group approaches, the homeowner ushers the trick-or-treaters to the bowl corresponding to their age. "Little Ones" is filled to the brim with colorful toothbrushes and travel-sized toothpastes. "Teens" features a mix of chocolate bars. Lastly, the "Young Adults" bowl brims with liquor truffles for more mature palates.
A new trick-or-treater, draped in a flowy white sheet with cutout eyes—the classic Halloween ghost outfit—rings the homeowner's doorbell. Peering through her peephole, the homeowner is confused: the ghost costume hides the child's appearance. Dread washes over her as she realizes it will be difficult to determine which bowl is appropriate for the visitor to pick from.
This is the kind of dilemma that the age signals required under the New York Child Data Protection Act (NYCDPA), effective June 2025, hope to relieve for online service providers. Just as the homeowner must decide which bowl of candy the ghost trick-or-treater may pick from (the child might, after all, be too young for the "Young Adults" bowl), operators of online services must navigate the complexities of granting full access to their websites while respecting technical age signals. Will website providers spend time and resources unmasking their spooky guests, or will they simply shepherd everyone to the "Little Ones" bowl?
Just as children collect candy door to door, websites collect small bits of information from visitors through files known as cookies. These data treats, while not edible, help websites personalize the user experience and make it more user friendly, typically by remembering preferences, such as a trick-or-treater's favorite candy bar or horror movie. Cookies, much like their physical counterparts, aren't new, and privacy laws have been knocking on their door for quite some time. Even the federal Children's Online Privacy Protection Act (COPPA), passed in 1998, covers cookies when they collect personal information directly from children under 13.
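For readers curious what this looks like in practice, here is a minimal sketch, in TypeScript using the standard browser document.cookie API, of how a site might remember a visitor's preference. The cookie name favoriteCandy and the stored value are purely illustrative, not drawn from any particular site.

```typescript
// Minimal sketch: remembering a visitor preference in a cookie.
// The cookie name "favoriteCandy" is hypothetical and illustrative only.

// Store the preference for roughly 30 days.
function rememberFavorite(candy: string): void {
  const maxAgeSeconds = 60 * 60 * 24 * 30;
  document.cookie =
    `favoriteCandy=${encodeURIComponent(candy)}; max-age=${maxAgeSeconds}; path=/; SameSite=Lax`;
}

// Read the preference back on a later visit.
function readFavorite(): string | null {
  const entry = document.cookie
    .split("; ")
    .find((c) => c.startsWith("favoriteCandy="));
  return entry ? decodeURIComponent(entry.split("=")[1]) : null;
}

rememberFavorite("chocolate bar");
console.log(readFavorite()); // "chocolate bar"
```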
However, a law that explicitly calls out cookie-related signals in its statutory language is a different matter. The California Online Privacy Protection Act, which took effect in 2004 and was amended in 2013, explicitly references "Do Not Track" signals. These signals are meant to tell websites and the trackers they host to stop following a user over time and across the Internet. However, most websites still do not honor Do Not Track signals, partly because they're not legally required to acknowledge them.
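For the technically inclined, here is a minimal sketch of how a site could check for the Do Not Track signal in the browser. The doNotTrack property is a real, though now deprecated, browser feature; it is read defensively below because not every browser or type definition exposes it, and what (if anything) a site does in response is voluntary, which is exactly the gap described above.

```typescript
// Minimal sketch: detecting a Do Not Track (DNT) signal in the browser.
// navigator.doNotTrack returns "1" when the user has enabled DNT; it is
// deprecated and not present everywhere, so it is accessed defensively.
function userHasEnabledDoNotTrack(): boolean {
  const nav = navigator as Navigator & { doNotTrack?: string | null };
  return nav.doNotTrack === "1";
}

if (userHasEnabledDoNotTrack()) {
  // Illustrative only: a site that chooses to honor DNT might skip
  // loading its cross-site tracking scripts here.
  console.log("DNT signal received: tracking disabled for this visit.");
}
```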
For some time, it seemed that adoption of a universal signal for setting cookie preferences was just a story told around a campfire. Then came the California Consumer Privacy Act (CCPA), enacted in 2018 and since amended, which explicitly references "opt-out preference signals": tools a user can set to signal a preference to opt out of cross-context behavioral advertising cookies. Subsequently, CCPA copycat laws (such as those in Colorado, Connecticut, Delaware, Montana, Oregon, Texas, and New Jersey) adopted similar requirements that businesses respect opt-out preference signals. So far, the only signal that has been approved by state enforcers is the Global Privacy Control (GPC).
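By way of illustration, here is a minimal sketch of detecting a GPC signal in the browser. Under the GPC proposal, participating browsers expose a globalPrivacyControl property on navigator (and send a Sec-GPC request header); the property is not yet universal or part of every TypeScript DOM definition, so the sketch reads it defensively. How a business must respond depends on the applicable state law, not on this code.

```typescript
// Minimal sketch: detecting the Global Privacy Control (GPC) signal.
// Participating browsers expose navigator.globalPrivacyControl and send a
// "Sec-GPC: 1" request header; the property may be absent, so check safely.
function userSentGpcSignal(): boolean {
  const nav = navigator as Navigator & { globalPrivacyControl?: boolean };
  return nav.globalPrivacyControl === true;
}

if (userSentGpcSignal()) {
  // Illustrative only: treat the visit as an opt-out of cross-context
  // behavioral advertising, as the CCPA-style laws described above require.
  console.log("GPC received: opting this user out of ad cookies.");
}
```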
So how does a Halloween fright like legally mandated cookie-related signals make its way to New York, a state without a comprehensive consumer privacy law?
The NYCDPA regulates online services that are at least partially "targeted to minors" or whose operators have "actual knowledge" (a term the law does not define) that they are collecting personal data directly from a minor. This sounds awfully similar to another beastie, COPPA. In fact, many of the NYCDPA's requirements echo COPPA, with some additional flavor. For example, the NYCDPA also borrows from the CCPA and similar state comprehensive privacy laws by prohibiting the sale of children's personal data unless consent is obtained. Specifically, the NYCDPA applies to "covered users," defined as minors, including both those under the age of 13 and those aged 13 to 17. Before website operators can process personal information about covered users, they must obtain informed consent, subject to narrow exceptions. A user can communicate this informed consent, or a refusal to give it, through an age signal.
When it comes to age signals, the NYCDPA is the first privacy law to explicitly introduce this technical mechanism. Age signals include those sent through a browser plug-in, privacy setting, or device setting. If a covered user sends a signal declining to provide informed consent, the website operator cannot then ask for that consent. Additionally, in broadly written language, the law requires the operator to treat a user as a "covered user," regardless of actual age, if the user's device communicates or signals that the user is, or shall be treated as, a minor.
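To make that statutory logic concrete, here is a purely hypothetical sketch. No standardized technical format for NYCDPA age signals exists today, so the x-age-signal header name and its values below are invented for illustration only; the point is the rule itself: a signaled minor is treated as a covered user, and a signaled refusal of consent cannot be followed by a consent request.

```typescript
// Hypothetical sketch only: "x-age-signal" and its values are invented here
// to illustrate the NYCDPA's logic; the law does not yet define a format.
type AgeSignal = "minor" | "declines-consent" | undefined;

function handleRequest(headers: Record<string, string | undefined>): void {
  const signal = headers["x-age-signal"] as AgeSignal; // hypothetical header

  if (signal === "minor" || signal === "declines-consent") {
    // Treat the visitor as a covered user: no processing without informed
    // consent, and no consent prompt for a user who has already declined.
    console.log("Covered-user signal: restrict processing, no consent prompt.");
  } else {
    console.log("No age signal received; other knowledge standards may still apply.");
  }
}
```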
Respecting such signals adds a layer of transparency and control over personal information, theoretically empowering minors (or their guardians) to actively manage privacy settings. However, these goals may be ahead of the actual technology available to facilitate compliance.
In other words, there is a gap between the NYCDPA's "actual knowledge" standard and its age signal requirements. What constitutes "actual knowledge" is not defined, but other children's privacy, age-appropriate design, and online safety laws, including COPPA, have shown that the standard can be interpreted to include constructive knowledge or a willful disregard of a user's age. Because the NYCDPA does not say which standard it has chosen, it is unclear whether the law has created a circular horror. Do these signals implicitly indicate an age, since an informed consent signal signifies a choice that only "covered users," namely those under 18, have under this law? Does any consent-related signal, such as a do-not-track, opt-out preference, or GPC signal, then count as indicating a choice about informed consent (namely, not consenting to data processing)? Or does the law contemplate two different types of signals, one conveying informed consent and one signifying age? How does this interact with the possibility that parents may use signals created under other legal regimes to indicate that they do not consent to processing because of their child's age? And must operators weigh all of these factors when assessing "actual knowledge"?
It is a bit spooky that other types of signals could be construed as creating "actual knowledge" that a user is a "covered user," even if those signals were never designed as age signals. We eagerly await the next steps in the rulemaking for regulations under the NYCDPA. After all, a call for public comments included the question, "What standards should the regulations set for acceptable device communications or signals that a user is a minor or consents or refuses to consent to data processing?" That is a question sure to haunt us on this eerie October evening. They say that if you're still thinking about it past tonight's witching hour, it may even follow you home from the office...
But that's enough scary tales. Until next time, here's hoping the privacy tricks don't outweigh the treats of digital safety!
Contributing author: Jimmy Nguyen
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.