ARTICLE
22 May 2025

When It Comes To Our Children, Social Media Companies Cannot Be Trusted To Self-Regulate

Boies Schiller Flexner

Contributor

Boies Schiller Flexner is a firm of internationally recognized trial lawyers, crisis managers, and strategic advisers known for our creative, aggressive, and efficient pursuit of successful outcomes for our clients.


Another day brings another report about the perils of allowing Big Tech to self-regulate when it comes to our children. In releasing Teen Accounts in September 2024, Meta stated that "teens will be placed into the strictest setting of our sensitive content control, so they're even less likely to be recommended sensitive content, and in many cases we hide this content altogether from teens, even if it's shared by someone they follow." However, when Gen Z users and a Washington Post columnist set up accounts as young teenagers on Instagram, they were exposed to inappropriate sexual content, drugs, alcohol, hate, and other posts too graphic to describe in the article.

Meta's Teen Accounts seemingly do very little to protect young users from disturbing content, and even worse, they do nothing to protect them from content that leads to psychological and physiological harm.

The tobacco and alcohol industries are heavily regulated and prohibited from marketing and selling their products to children, yet we allow social media, which has been shown to be just as dangerous to minors, if not more so, into our children's lives for hours each day. These companies convince us that their platforms are necessary for our children's friendships to thrive, and each time a legislature threatens action, they promise to self-regulate to protect our children. However, they have no motivation to meaningfully do so, because each child they hook is another dollar added to their bottom line.

Litigation against social media companies over harms done to teens has been abundant, with multidistrict litigation in the Northern District of California managing more than 1,600 cases and counting. These cases allege claims against social media companies for the harms caused to children due to social media addiction, including being groomed by pedophiles, engaging in self-harm, and even committing suicide.

Social media companies often try to hide behind Section 230 of the Communications Act and the First Amendment, arguing they bear no liability for any conduct tied to third-party content. However, neither Section 230 nor the First Amendment was intended to protect companies from the features they design, like algorithmic feeds and endless push notifications, to addict our children to their platforms. U.S. District Judge Yvonne Gonzalez Rogers recognized this in a recent decision, largely denying the companies' omnibus motion to dismiss and holding that allegations based on specific platform features are "neither barred by Section 230 nor the First Amendment."

In addition to litigation, we need meaningful regulation at the state and federal levels to hold these companies accountable for their actions. Senators Marsha Blackburn (R-Tenn.) and Richard Blumenthal (D-Conn.) recently re-introduced the Kids Online Safety Act (KOSA), a federal bill aimed at protecting children from harmful content and practices online. The bill would, among other things, mandate that platforms enable the strongest privacy settings for kids by default (unless disabled by a parent) and create a duty of reasonable care for social media platforms with respect to design features that could harm minors, including, but not limited to, the promotion of suicide, eating disorders, substance abuse, and sexual exploitation. KOSA, if passed, would be one step in the right direction.


The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
