21 May 2025

Missouri Attorney General Introduces New Rule Requiring Social Media Companies To Offer Competitor Content Moderators

Kelley Drye & Warren LLP

Utilizing regulatory powers under its UDAP law, the Missouri Merchandising Practices Act (the "MMPA"), Missouri Attorney General Andrew Bailey announced a new rule, codified at 15 C.S.R. § 60-19, that would prohibit social media companies from requiring their users to rely on the company's in-house content moderation algorithm. Instead, social media companies must offer "algorithmic choice" to users. In a press release, the Missouri AG called the rule the first of its kind in the nation.

In effect, the rule would require social media platforms to give users the option to employ the platform's in-house content moderation algorithm or to choose an independent content moderator. The platform must offer the choice upon account activation, with no option selected by default, and must renew the opportunity to choose at least every six months thereafter. When offering the choice among competing moderators, the rule requires the platform to grant the selected third-party moderator access to data on the platform to the extent necessary to moderate the content viewed by the user. The AG did not specify, and it remains unclear, whether current content moderation providers can serve this role or whether a new industry of moderators will emerge to meet the rule's requirements.

Upon granting access to third-party moderators, platforms may not favor their own algorithms over another's by limiting a competitor's functionality or inhibiting user choice. However, the rule allows platforms to set access limits on third-party moderators to the extent necessary to reasonably protect trade secrets, proprietary processes, and privacy information, and to secure the platform from hacking and other information security threats. In addition, a platform may advertise or promote its own content moderation outside the moderator-choice prompt screen. It may also remain the sole moderator of enumerated categories of content, which include: (1) content the platform is specifically authorized to restrict or moderate under federal law; (2) content subject to requests or referrals concerning CSAM or sexual abuse; (3) content that directly incites criminal activity or consists of specific threats of violence targeted at a person or group based on a protected class; (4) sexually explicit content; or (5) otherwise unlawful expression, consistent with the Constitution.

By filing the rule, the Missouri AG's office notes in its press release that it sought to address perceived "corporate censorship" in online forums. The AG argued that tech companies engaged in deceptive and unfair trade practices by implementing their own content moderation algorithms without offering competitor algorithms or allowing users to choose their own content moderators. The office invoked both free speech interests and the MMPA's prohibition on anticompetitive practices. Characterizing the social media industry as consolidated and "centrally controlled," and implying potential violations of state and federal antitrust laws, the AG argued that Missourians are harmed when online speech and content are subject to the moderation of a few large industry players.

As with previous attempts at regulating social media content moderation, the focus of the rule is on "Big Tech." The rule would apply to any media website open to the public with user-generated content that has at least 50 million distinct active users in the United States (or 1 billion worldwide) in a calendar month. A platform may also be subject to the rule if at least 30% of Americans regularly obtain news from it, unless the platform qualifies under one of the enumerated excluded mediums (an "Internet Service Provider"; an online encyclopedia; email; an electronic dating service; or sites primarily consisting of news, sports, entertainment, or other content that is not user-generated, etc.). It is unclear how the AG intends to measure where Americans regularly obtain their news for purposes of this provision. According to the rule, compliance with the prohibition on restricting users' choice of content moderator will cost private entities up to $41.96 million for the first year and $11.96 million for each subsequent year. The text of the rule clarifies that the risk to users "comes from concentrated control" and exempts smaller and medium-sized platforms from enforcement under the MMPA. As written, these smaller platforms do not violate the rule by declining to permit user choice in moderation.

The regulation comes on the heels of the Supreme Court's decision in Moody v. NetChoice, which we covered here. In that case, the Supreme Court confronted a circuit split arising from Texas and Florida laws that sought to prohibit social media companies from moderating content based on certain viewpoints. While the Court took issue with both circuit courts' facial analyses, it also disagreed with the Fifth Circuit's speech analysis as applied to content moderation. Ultimately, the Court found that the editorial judgments embodied in the platforms' content moderation regimes were protected expressive activity. The AG's press release states that the Missouri regulation was grounded in the Supreme Court's guidance in NetChoice, which the AG characterized as recognizing "the authority of state governments to enforce competition laws in the interest of free expression."

The Missouri rule, while the first of its kind, is one of a number of laws and regulations targeting social media platforms and Big Tech more broadly, many of them addressing children's access to social media (and its alleged dangers). Other states that have implemented rules or taken enforcement actions with respect to social media companies include Florida, New Jersey, and New Mexico. These efforts have seen mixed results, with some laws enjoined by the judiciary.

Before taking effect, the Missouri AG's new rule must undergo a notice and comment period of 30 days after publication in the register. Bailey's office states that it will hold public forums to hear input and gather evidence pertaining to the rule from Missourians (though the currently posted rule states that no public hearing is scheduled). As with other recent social media regulations, it seems possible the rule will be challenged if it completes the notice and comment period and ultimately takes effect.

The Missouri AG's proposed rule is the latest example of AGs addressing what they perceive to be censorship in the online space. While many previous rules and enforcement actions have sought to protect children from alleged harm online, this rule, like the Texas and Florida laws at issue in NetChoice, reflects a continued broad approach to regulating social media companies, here through state AG authority. Companies should stay current with the shifting legal landscape as states pursue policies implicating social media platforms.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
