ARTICLE
10 September 2024

Children's Privacy Roundup: U.S. vs. TikTok And Federal And State Legislation Updates

Davis+Gilbert LLP

Contributor

Davis+Gilbert LLP is a strategically focused, full-service mid-sized law firm of more than 130 lawyers. Founded over a century ago and located in New York City, the firm represents a wide array of clients – ranging from start-ups to some of the world's largest public companies and financial institutions.

The Bottom Line

  • In the fast-paced landscape of data privacy, protecting children and teens online has remained in focus this summer.
  • The DOJ, together with the FTC, sued TikTok for multiple violations of COPPA and the FTC Act.
  • Two bills – COPPA 2.0 and KOSA – passed in the Senate and will advance to the House of Representatives for consideration.
  • Children's privacy continues to be an active area of legislation at the state level.

The Federal Trade Commission (FTC) and the Department of Justice (DOJ) have continued their commitment to protecting kids' and teens' digital privacy.

In 2019, Google and YouTube paid a then-record $170 million fine to the FTC and the New York Attorney General for violating the Children's Online Privacy Protection Act (COPPA).

Then, in 2023, Epic Games paid the largest financial penalty to date for a COPPA violation.

Now, the Department of Justice, through a referral from the FTC, has filed a lawsuit against TikTok and its parent company for multiple violations of COPPA and its implementing regulations, and the FTC Act.

The DOJ and FTC Sue TikTok for Children's Privacy Violations

Since its 2019 settlement with the FTC to the tune of $5.7 million, TikTok's predecessor, Musical.ly, has been subject to a court order requiring it to undertake specific measures to comply with COPPA. The present complaint alleges that despite the court order, TikTok continued to knowingly permit children under the age of 13 to create TikTok accounts and to create, view, and share short-form videos and messages with adults and others on the TikTok platform. Furthermore, TikTok collected and retained a wide variety of personal information from these children – including persistent identifiers that were used to build profiles and target advertising to them – without notifying parents of the full extent of its data collection and use practices, or obtaining the legally required consent from them.

The complaint further alleges that TikTok also had deficient and ineffectual internal policies and processes for identifying and deleting TikTok accounts created by children. In particular, TikTok allegedly built backdoors into its platform that allowed children under 13 to create accounts without having to provide their age or obtain parental consent to use TikTok by using credentials from third-party services like Google and Instagram. Ultimately, millions of accounts were created this way and classified as "age unknown" accounts.

Even for accounts that were created in "Kids Mode" (a pared-back version of TikTok intended for children under 13), TikTok unlawfully collected and retained children's email addresses and other types of personal information.

Further, when parents discovered their children's accounts and asked TikTok to delete the accounts and information in them, TikTok not only made it difficult for parents to submit such deletion requests, but it also frequently failed to honor those requests.

The complaint asks the court to impose civil penalties against ByteDance and TikTok and to enter a permanent injunction against them to prevent future violations of COPPA. The FTC Act allows civil penalties up to $51,744 per violation, per day.

Legislative Developments on Children's Privacy

Meanwhile, Congress is in the midst of passing the most significant overhaul of children's privacy legislation since COPPA was enacted in 1998. On July 30, the U.S. Senate overwhelmingly passed the Children and Teens' Online Privacy Protection Act (COPPA 2.0) and the Kids Online Safety Act (KOSA).

COPPA 2.0 would extend the protections under the original COPPA law for children under 13 to include teenagers under 17 years of age, thus requiring companies to review the scope of their parental consent mechanisms and the nature of their disclosures on how children's and teens' data is collected and used. It would also heighten the standard for compliance, raising COPPA's "actual knowledge" standard to one that covers platforms that are "reasonably likely to be used" by children or minors.

Importantly, COPPA 2.0 would also ban targeted advertising to children and teens; create an "eraser button" that would enable parents and kids to eliminate personal information online about a child or teen; establish a "Digital Marketing Bill of Rights for Teens" that limits the collection of personal information of teens; and establish a Youth Marketing and Privacy Division at the FTC.

KOSA provides children and parents with the tools, safeguards and transparency to protect against online harms. It establishes a "duty of care" for online platforms and requires them to activate the most protective settings for kids by default, providing minors with options to protect their information, disable addictive product features and opt out of personalized algorithmic recommendations.

COPPA 2.0 and KOSA will move to the House of Representatives, which returns from recess on September 9. It remains to be seen whether the Senate's momentum will continue, especially given the tumultuous legislative agenda ahead of the November election. In any event, if the bills become law, they will likely face judicial scrutiny, much as the California Age-Appropriate Design Code was challenged by industry trade associations on First Amendment grounds.

State Children's Privacy Laws

Despite the Ninth Circuit's decision to uphold much of a lower court's injunction of the California Age-Appropriate Design Code, a number of states have advanced similar legislation. The Maryland Age-Appropriate Design Code, which applies to online products "reasonably likely to be accessed by children," takes effect October 1, 2024, and mirrors a number of elements of KOSA, including a duty of care for companies that process children's data and default settings for privacy protections. The Maryland law also requires covered businesses to complete a Data Protection Impact Assessment for each online service, product or feature reasonably likely to be accessed by children. In addition, it prohibits processing of children's precise geolocation data by default, unless the precise geodata is strictly necessary to provide the product and the business processes it only for the limited time necessary to do so.

Maryland is the second state this year to enact legislation to protect children and teens under 18 years of age, with New York's SAFE for Kids Act and Child Data Protection Act having been signed by the governor in June.

Virginia, Colorado, and Connecticut all amended their existing consumer data privacy laws to impose additional requirements on controllers that process the personal data of a known child under 13 years of age, while existing laws in Utah and Florida impose a range of requirements on online platforms that provide an online service, product, game, or feature likely to be predominantly accessed by children (i.e., social media).

Michigan and Pennsylvania are currently considering age-appropriate design codes of their own.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
