Client Alerts & Insights

Social Media Might Have to Rethink Platform Design and Features as Courts Reject Communications Decency Act, Section 230 Defense

May 1, 2026

Key Takeaways

  • Section 230 immunity is narrowing for social media companies. Recent rulings in Massachusetts and California held that Section 230 does not shield platforms from claims based on addictive or harmful design features, as opposed to third‑party content, significantly weakening a long‑relied‑upon defense.
  • Courts and juries are increasingly finding liability for negligent platform design. A California bellwether jury found Meta and YouTube liable for knowingly creating addictive features harmful to teens, awarding both compensatory and punitive damages, and signaling growing judicial skepticism toward Section 230 and First Amendment defenses in design‑based claims.
  • Litigation risk and exposure are rapidly escalating industry‑wide. These rulings provide plaintiffs with precedents to bypass Section 230, opening the door to thousands of personal injury, school district and state attorney general claims that could materially impact social media companies’ business models, platform design and long‑term liability exposure.

On April 10, 2026, on appeal from a motion to dismiss, the Massachusetts Supreme Judicial Court held that Section 230 of the Communications Decency Act (“Section 230”) did not bar Massachusetts’s claims that Meta engaged in unfair business practices by creating a platform that was addictive to teens and by failing to warn the public about it.[1]

This decision is significant because social media companies have used Section 230 as a defense against negligence and personal injury claims.[2] Social media companies have claimed that they are protected under Section 230 as a service provider,[3] which establishes federal immunity from liability for content generated by third parties.[4] To qualify for Section 230 immunity, a defendant must show that: (1) it is a provider or user of an interactive computer service; (2) that the claim would treat the defendant as the publisher or speaker of information; and (3) that the information was provided by another information content provider.[5] 

The Massachusetts court held that the alleged harm did not result from third-party statements but rather from the platform’s design features that prolonged user time on the platform and from the ineffective design of its age-gating mechanisms.[6] Accordingly, Section 230 immunity did not apply to bar the unfair business practices claims.[7]

The Massachusetts court also questioned the federal district court’s decision in the multidistrict litigation (“MDL”) In re Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, which dismissed similar claims—such as the negligent design claims—reasoning that the social media companies were immune under Section 230.[8] The MDL plaintiffs appealed the decision to the Ninth Circuit, which heard oral argument on January 6, 2026. Meta (Facebook and Instagram) is now awaiting the decision.

Earlier, on March 24, 2026, a California jury hearing the bellwether cases[9] with claims against Meta and YouTube (Google) found the social media companies liable for negligently designing platform features that harm teens’ mental health.[10] The case was tried after it survived a motion for summary judgment in which the California court also held that neither Section 230 nor the First Amendment barred the plaintiffs’ design claims.[11]

The jury found that the social media companies should have known that the features were addictive and unsuitable for children, yet failed to take any action.[12] Furthermore, the jury found that the social media companies acted with malice, oppression or fraud.[13] This led to an additional $3 million in punitive damages on top of $3 million in compensatory damages.[14] While these dollar figures might not seem significant to a multi-billion-dollar company, this series of rulings substantially increases the social media companies’ damage exposure. Considering the thousands of personal injury claims proliferating across the country, the aggregate damage exposure becomes nuclear.

Moreover, the Massachusetts case and the California bellwether case have provided plaintiffs with a playbook to circumvent social media companies’ Section 230 immunity, opening the door to claims challenging addictive features that are harmful to young adults.[15] These verdicts could lead to expanded claims targeting other features and users of other ages. They also put social media companies on notice, which could render further similar violations willful and intentional.

Given these potential consequences, it is not surprising that Meta has publicly stated that it plans to appeal while it faces the next set of bellwether trials in both state and federal courts. On April 15, 2026, the federal MDL court heard a motion for summary judgment in the school districts’ negligence claims against the social media companies. The school districts allege that they bear the brunt of the public health crisis created by the social media companies. That trial is scheduled for June 15, 2026, and will be followed by a state attorneys general bellwether trial advancing similar claims, set for August 6, 2026.[16]

The social media industry is facing unprecedented changes in 2026, as various jurisdictions have ruled unfavorably on the Section 230 defense. Companies in the industry should closely monitor these cases for further impact on platform business models and design features.

If your company is looking to navigate compliance, investigations or litigation related to your platform’s business model and design features, contact Benesch’s State Attorneys General Investigations & Enforcement Practice Group.


[1] Commonwealth v. Meta Platforms, Inc., No. SJC-13747, 2026 WL 969430 (Mass. Apr. 10, 2026).

[2] Id. at *13.

[3] Id. at *11 (a service provider is an entity that provides “information content” where “it directly and materially contributes to what makes the content itself unlawful”).

[4] Id. at *10 (citing M.P. v. Meta Platforms Inc., 127 F.4th 516, 525, 530 (4th Cir.)).

[5] Id. at *7.

[6] Id. at *14, 16.

[7] Id. at *14.

[8] Id. at *13 (citing In re Social Media Adolescent Addiction/Personal Injury Prods. Liab. Litig., 753 F. Supp. 3d 849, 880-883 (N.D. Cal. 2024) (addressing consumer protection claims); In re Social Media Adolescent Addiction/Personal Injury Prods. Liab. Litig., 702 F. Supp. 3d 809, 830-834 (N.D. Cal. 2023) (addressing negligent design claims); see also In re Social Media Adolescent Addiction/Pers. Inj. Prods. Liab. Litig., 754 F. Supp. 3d 946, 963 (N.D. Cal. 2024), appeal dismissed sub nom. Fla. Off. of Att’y Gen. v. Meta Platforms, Inc., No. 24-7019, 2024 WL 5443167 (9th Cir. Dec. 16, 2024), and motion to certify appeal denied, No. 4:22-MD-3047-YGR, 2025 WL 1182578 (N.D. Cal. Mar. 11, 2025) (finding that claims based on the use of algorithms to promote addictive engagement are barred by Section 230)).

[9] Kevin Frankel et al., California’s Social Media Litigation Poses Potential to Reshape the Legal Landscape, Benesch (Feb. 27, 2026), https://www.beneschlaw.com/insight/californias-social-media-litigation-poses-potential-to-reshape-the-legal-landscape/.

[10] Id.

[11] Id.

[12] See Verdict Form, P.F. et al. (KGM) v. Meta Platforms, Inc., Case No. 22STCV21355 (Mar. 25, 2026).

[13] Verdict Form – Compensatory Damages and Claims of Punitive Conduct, P.F. et al. (KGM) v. Meta Platforms, Inc., Case No. 22STCV21355.

[14] Id.

[15] See also Frankel, et al., supra note 9.

[16] Id.; see also Plaintiffs’ Master Complaint (Local Gov’t and Sch. Dist.), In re: Social Media Adolescent Addiction, MDL No. 3047, Doc. No. 504 (Dec. 18, 2023).