The Federal Trade Commission (FTC or Commission) recently announced that it will hold a virtual workshop to examine the use of design features aimed at monopolizing kids' time online – a phenomenon that has been branded the "attention economy."
The virtual workshop, to be held February 25, 2025, will bring together researchers, technologists, child development and legal experts, consumer advocates and industry professionals to explore how the attention economy particularly affects kids and teens.
The topics to be discussed include:
- Whether and how certain design features result in more engagement or time spent on digital platforms and what relevant scientific research exists on the topic;
- The physical and psychological impacts, both positive and negative, of the design features on youth well-being; and
- What measures or design considerations related to youth well-being might be effective, feasible, and consistent with the current legal landscape.
In the view of at least one commissioner, examples of design strategies aimed at keeping kids online include intermittent variable rewards, video autoplay, 24-hour push notifications and nudges, infinite scroll, content that expires within a predetermined window and quantified public popularity that enables social comparison. Video autoplay (automatically starting another video when one ends) in particular has been cited as an example of a "dark pattern." No doubt we will see some or all of these specific design techniques discussed at the FTC's workshop in February.
While this is just a workshop, workshops often lead to reports, and reports can lead to rules and/or enforcement actions – and perhaps recommendations to Congress. Therefore, whenever the FTC expresses an interest in an issue – even in a more informal setting like a workshop and even where the goal (protecting kids online) is laudable – it is important to consider how that issue fits within the FTC's rulemaking and enforcement authority.
With respect to protecting children online, the FTC's primary tools are (1) its general Section 5 authority, which prohibits unfair or deceptive acts or practices, and (2) the Children's Online Privacy Protection Act (COPPA) and the associated COPPA Rule, which impose certain requirements on operators of websites and online services that are directed at children under age 13 or that knowingly collect personal information from that age group.
While COPPA doesn't seem like the most natural fit – after all, it is about protecting children's online privacy – we have already seen an attempt by the agency to stretch COPPA to impose limits on "nudging" kids to stay online. Under the revisions to the COPPA Rule proposed in December 2023, operators would be required to obtain verifiable parental consent before using or disclosing persistent identifiers or other personal information to optimize children's attention or maximize their engagement with a website or online service, and would have to explicitly disclose any such tactics in their online notices. In proposing this change, the Commission noted that it sought to address concerns that children "may be overusing online services due to engagement-enhancing techniques."
With the fate of the proposed COPPA Rule – and this particular provision – unknown, that leaves the FTC's general unfairness authority as the most likely path to regulating "engagement-enhancing techniques" used by online platforms to keep kids hooked. Indeed, FTC Commissioner Alvaro M. Bedoya signaled as much in remarks he made last year before the National Academies of Sciences, Engineering & Medicine, noting that the Commission has used its unfairness authority "to stop what appears to be one of the drivers of mental health issues online." On the other hand, in other settings, Republican commissioners have often expressed concern about the agency broadening the scope of its unfairness authority in unprecedented ways.
Not surprisingly, then, many stakeholders have expressed concerns that existing tools are insufficient to protect kids, tweens and teens online. Numerous proposals at both the state and federal levels aim to remedy this, including the federal Kids Online Safety Act (KOSA). KOSA would require internet platforms to take measures to reduce online dangers and would grant specific enforcement authority to the FTC. However, as noted in a recent blog post, although the Senate-passed version of KOSA would require technology companies to design products in a way that would mitigate "patterns of use that indicate or encourage compulsive use by minors," a House version removed that provision. Bottom line: KOSA remains a heavily debated bill, and its future is very much unknown.
As bills aimed at protecting kids online make their way through Congress, and with similar legislation being proposed at the state level, it will be worth keeping an eye on any FTC activity in this space.
In the meantime, interested stakeholders looking to contribute to the FTC's virtual workshop in February have until November 15 to register their interest and expertise with the FTC.