Benesch, Friedlander, Coplan & Aronoff LLP

July 11, 2025

The Intersection of Social Media, AI, and Product Liability

Client Bulletins
Authors: Steven M. Selna, Brinson Elliott

State governments, public school districts, and individuals are suing social media companies, such as TikTok and Snapchat, alleging that defects in their algorithms and platform features cause psychological and physical harm, especially to young users.

These cases raise novel questions about technology regulation and consumer protection and are testing the boundaries of product liability doctrine. Courts have yet to decide whether these companies are liable for alleged harm, but they have issued several rulings narrowing the scope of litigation theories that can survive dismissal.

Social media: platforms, products, or services?

Disagreement over whether social media platforms should be classified as products under the traditional product liability framework has arisen primarily in cases involving youth social media addiction.

There are now 1,867 cases pending in the Adolescent Social Media Addiction Multi-District Litigation (MDL). On June 16, 2025, U.S. District Judge Yvonne Gonzalez Rogers, who is overseeing the social media addiction MDL in the Northern District of California, selected 11 lawsuits for bellwether trials. The cases involve six school districts (in AZ, GA, KY, MD, NJ, and SC) and five individual plaintiffs, with trials beginning in 2026.

In November 2023, Judge Rogers ruled that several of the design defects alleged by plaintiffs concern products or product components:

Classified as products:

  • Failure to implement robust age verification processes to determine users’ ages
  • Failure to implement effective parental controls and notifications
  • Needlessly complicating the account deactivation/deletion process, disincentivizing users from leaving the platform
  • Failure to label images edited through in-app filters as edited content (if plaintiffs’ allegations focus on the design of the filter)
  • Filters enabling users to manipulate content before posting or sharing it on defendants’ platforms (e.g., beauty-enhancing features, such as “blur imperfections,” and filters enabling users to overlay content on top of existing content, such as Snapchat’s Speed Filter, discussed below)
  • Failure to create sexual abuse/CSAM reporting tools accessible to those without an account

Classified as product components:

  • Failure to implement opt-in restrictions and default protections for the length and frequency of use sessions

Another judge in the same district has since clarified that, to state a valid claim under product liability doctrine, plaintiffs’ allegations must concern defects in the products or product components themselves. “[T]his is an objection to Defendants’ decisions, after receiving Plaintiffs’ reports, to remove or not remove certain videos; it is not an objection to the functionality of the reporting tool itself. . . . Such allegations fail to state a claim under products liability law.” Bogard v. TikTok (N.D. Cal. Feb. 25, 2025).

Elsewhere, in October 2023, a California superior court ruled that social media sites are not “products” under product liability doctrine. There, plaintiffs argued that defendants harvested user data to create and push algorithmically tailored feeds that space out dopamine-triggering rewards and cause user addiction, especially among youth. The court ruled that social media sites are not tangible products and are better viewed as services. Likewise, social media platforms are interactive, not static products “that all consumers experience in a uniform manner.” Consequently, California’s risk-utility and consumer expectations tests – which are “at the heart of determining when a manufacturer should be liable for a product defect” – cannot be applied to these platforms. In sustaining the demurrer, the judge concluded that “allowing this case to go forward on theories of product liability would be like trying to fit a four-dimensional peg into a three-dimensional hole.” She did, however, allow plaintiffs to proceed with their negligence claims.

This debate is far from over and will likely require courts to differentiate among platforms, products, services, and their alleged defects on a more nuanced, case-by-case basis.

The Communications Decency Act (CDA) and negligent product design

Social media companies are also trying to dismiss these cases on Section 230 grounds. Courts have consistently interpreted Section 230 of the CDA to provide platforms broad immunity from liability arising out of third-party content.

In Lemmon v. Snap (9th Cir. May 4, 2021), plaintiffs sued Snap after their children died in a car accident while using Snapchat’s Speed Filter, alleging that the “Filter and reward system worked together to encourage users to drive at dangerous speeds,” sometimes exceeding 100 MPH. In reversing and remanding the dismissal, the Ninth Circuit reasoned that the plaintiffs were not seeking to hold Snap liable as a publisher of user content, which would implicate Section 230, but for negligent product design. Because Snap is responsible for its own product architecture, it could be found to have violated its duty to design a reasonably safe product. Snap could satisfy that duty without affecting its publication of third-party content, such as by removing the Speed Filter.

Courts are divided over whether recommendation algorithms are protected under Section 230.

In Anderson v. TikTok (3d Cir. Aug. 2024), a mother sued on behalf of her ten-year-old daughter, who died of self-asphyxiation after participating in a video challenge recommended to her by TikTok’s algorithm. The Third Circuit revived the claims, holding that “[b]ecause the information that forms the basis of Anderson’s lawsuit—i.e., TikTok’s recommendations via its FYP algorithm—is TikTok’s own expressive activity, § 230 does not bar Anderson’s claims.”

Judge Rogers, however, held that Section 230 bars plaintiffs’ claims that defendants’ algorithms, while content-neutral, are designed to increase user interaction with their platforms to generate profit. “Nothing in Section 230 or existing case law indicates that Section 230 only applies to publishing where a defendant’s only intent is to convey or curate information... Because plaintiffs do not show how the conduct at issue is distinct from determinations of what to publish and how, or that the alleged duty could be met other than by changing the way defendants publish third-party content, Section 230 bars the claim as to the recommendation algorithms.” She reasoned that this conduct is identical to publishing, whether performed by a human or an algorithm.

Industry challenges to social media legislation

NetChoice, a trade association that represents social media companies, has successfully challenged several state laws designed to safeguard children’s social media use. In the past month, federal judges in Georgia and Florida blocked laws requiring children to obtain parental consent before using social media apps. Earlier this year, similar age-verification laws were struck down as unconstitutional in Arkansas and Ohio and temporarily blocked in Utah.

In March 2025, a California district court judge granted NetChoice’s motion for a preliminary injunction against the state’s Age-Appropriate Design Code Act, which had been set to take effect in July 2024. The first law of its kind, the Act would require online service providers whose platforms are “likely to be accessed by children” to complete Data Protection Impact Assessments (DPIAs) addressing, among other things, whether the design of the online service could harm children or seeks to increase or sustain their use of the service.

Conclusion

The landscape for social media platforms and product liability remains unsettled.

Ahead of bellwether trials, debate is certain to continue over whether and to what extent social media platforms are subject to product liability doctrines.

Given Congress’ recent rejection of a ten-year ban on state-level AI regulation, state legislatures will likely continue to propose bills addressing social media and algorithmic harms, and industry will likely continue to fight them.
