On September 21, 2023, a California appeals court ruled that a lawsuit alleging that Meta's ad-delivery systems discriminated against women and older people in the distribution of insurance ads could move forward.

The plaintiff alleged that Meta requires all advertisers to choose the age and gender of the users who will receive their ads, and that insurance companies routinely exclude older people and women from their ad audiences. The lower court sustained Meta's demurrer, concluding that Meta's tools are neutral on their face and, while they could have a disproportionate impact on a protected class, did not amount to intentional discrimination. The lower court also held that Meta was immune under Section 230 of the Communications Decency Act ("Section 230").

After the lower court dismissed the plaintiff's suit, the California Court of Appeal, First Appellate District (the "Court") reversed, holding that the plaintiff had suffered actual harm when she was denied life insurance ads because she was "deprived of information regarding . . . opportunities despite being ready and able to pursue those opportunities." Further, the Court held that Meta itself, rather than the advertisers, was responsible for the discriminatory conduct. It stated that Meta "retains the discretion and ability to approve and send an ad that includes age or gender restrictions" and therefore "knowingly sends or publishes an ad that discriminates." Specifically, the Court held that tools such as Meta's Lookalike Audience, a tool often used by advertisers, "are not facially neutral" and thus support a valid claim under the Unruh Civil Rights Act for intentional discrimination by a business establishment.

The Court also rejected Meta's argument that it was immune from liability under Section 230. Section 230 "immunizes providers of interactive computer services against liability arising from content created by third parties." That immunity, however, applies only to an interactive computer service provider that is not also the information content provider, i.e., the entity responsible for the creation or development of the content at issue. Because Meta requires users to disclose their age and gender before using its services, "designed and created an advertising system . . . that allowed insurance companies to target their ads based on . . . gender and age," and "allowed advertisers to exclude all persons of a protected category," the Court held that Meta created, shaped, or developed content by materially contributing to the content's alleged unlawfulness. That activity as a content developer, the Court held, placed Meta outside the scope of Section 230 immunity.

The Court's ruling comes amid significant scrutiny of Meta's advertising algorithms, including a 2018 federal complaint accusing the company of permitting housing discrimination. Of particular note is the Court's rejection of Meta's Section 230 immunity defense, a protection social media platforms have long enjoyed. While the ruling could have major effects on the online advertising ecosystem, the lawsuit is still in its early stages, as the Court's decision merely allows the suit to proceed past the demurrer stage.

We will continue to follow the suit and will provide updates as necessary.


This alert provides general coverage of its subject area. We provide it with the understanding that Frankfurt Kurnit Klein & Selz is not engaged herein in rendering legal advice, and shall not be liable for any damages resulting from any error, inaccuracy, or omission. Our attorneys practice law only in jurisdictions in which they are properly authorized to do so. We do not seek to represent clients in other jurisdictions.