Attorney Challenges Social Media Giants Over Harm to Youth
November 19, 2025
Technology News


Legal Battles Target Section 230 Immunity and Platform Design to Protect Children

Summary

Matthew Bergman, founder of the Social Media Victims Law Center, is spearheading lawsuits against major social media companies alleging their platforms encourage harmful behaviors in children. Utilizing a novel legal approach that circumvents Section 230 immunity, Bergman seeks accountability for alleged platform negligence and misleading safety claims. His work represents a growing push to hold tech companies responsible for the mental health and safety risks their products pose to children.

Key Points

Matthew Bergman founded the Social Media Victims Law Center to represent families of children harmed by social media platforms.
Bergman utilizes a product-liability legal strategy to bypass Section 230 immunity, targeting negligent platform design and misleading safety claims.
Thousands of lawsuits have been filed against major social media companies including TikTok, Instagram, and Snapchat under this framework.
Approximately 1,500 of these cases have overcome initial Section 230 defenses and are proceeding in court, often consolidated in larger multidistrict litigations.
Bergman’s clients include families affected by suicides, drug overdoses, sexual grooming, and eating disorders linked to social media use.
Expert legal opinion is divided on the applicability of product liability theories to digital platforms and the feasibility of managing online harms through litigation.
Bergman emphasizes his clients seek systemic change in platform safety rather than financial settlements, with only one case settled to date for seven figures.
Cases like that of Selena Rodriguez exemplify the personal tragedies driving these lawsuits and the movement to press Congress for stronger youth online protections.

Last summer in Manhattan, Matthew Bergman, the founder of the Social Media Victims Law Center (SMVLC), sat outside a courthouse preparing to represent Norma Nazario. Nazario’s 15-year-old son, Zackery, had died earlier that year while "subway surfing," a dangerous stunt, atop a moving Brooklyn-bound J train. His mother contends that the behavior was driven by social media algorithms on platforms like TikTok and Instagram; Bergman’s lawsuit asserts that these platforms "targeted, goaded and encouraged" Zackery to take such risks because of their "unreasonably dangerous design." Inside, as sunlight filtered through the courtroom windows, Bergman wrote "230" on his legal pad and circled it, a nod to the obstacle that Section 230 of the Communications Decency Act poses to holding platforms accountable.

Section 230, enacted in 1996, grants online platforms broad immunity from liability for content created by their users, making it difficult to hold them legally responsible for harms linked to that content. For years, advocates, including parents, lawyers, and mental health experts, have criticized the statute for blocking efforts to hold social media companies accountable for adverse effects on users.

In recent years, Bergman has emerged as a leading attorney representing families whose children have suffered serious consequences tied to social media use. His clients include parents of minors who died by suicide or drug overdose, children allegedly groomed and sexually exploited by online predators, as well as youths who developed severe anorexia influenced by online content. Last week, the SMVLC filed seven lawsuits against OpenAI, three involving individuals reportedly urged toward suicide by interactions with ChatGPT. (OpenAI did not respond to requests for comment.)

Bergman’s strategic innovation in these cases centers on sidestepping Section 230 protections by framing social media companies as manufacturers of defective products rather than mere hosts of user-generated content. He pursues claims of negligence in platform design and allegations that companies have misrepresented the safety of their offerings. This product-liability approach has laid the foundation for thousands of cases Bergman has lodged in state and federal courts nationwide, suing major platforms like Instagram, TikTok, and Snapchat. Meta, TikTok, and Snap declined to comment.

Product liability lawsuits against social media are not unprecedented; similar cases against Myspace and Grindr, in 2009 and 2017 respectively, were dismissed on Section 230 grounds, and the ultimate success of Bergman’s method remains uncertain. Nonetheless, he has become a prominent figure and rallying point for families seeking justice, with more than 4,000 clients. Through these cases, Bergman is helping turn a complex legal battle into a broader movement for digital safety reform.

Previn Warren, co-lead counsel in a multidistrict federal case involving over 1,300 of Bergman’s clients, notes Bergman’s effectiveness in elevating awareness and uniting affected families into a formidable political lobbying force. Approximately 1,500 of Bergman’s suits have survived initial legal challenges related to Section 230 and are advancing within multi-plaintiff actions in various courts.

Experts observe that while Section 230 was once thought to be an almost insurmountable obstacle, recent cases indicate vulnerabilities in its protections. University of Virginia law professor Danielle Citron remarks that there is now a visible "chink in that armor." However, critics like Eric Goldman of Santa Clara University caution that applying product liability concepts to intangible digital platforms may overextend legal theory, as these platforms cannot feasibly prevent all online harms and extending liability might render risks legally unmanageable.

Bergman has achieved a seven-figure settlement in only one social media case to date. He emphasizes, though, that his clients prioritize systemic safety reforms over financial compensation, aiming to compel tech companies to change design practices and safeguard children more effectively. "Kids are dying every day," Bergman asserts. "This is not simply a question of seeking recompense for past wrongs. This is a moral crusade to stop the killing."

Outside the courtroom, Bergman’s surroundings reflect his eclectic interests: a pre-war Upper East Side residence adorned with Picasso prints and ancient artifacts. His legal career spans decades and includes extensive asbestos litigation on behalf of severely ill clients, for whom he recovered over $1 billion. Drawing a parallel, Bergman argues that social media companies, like asbestos manufacturers, are aware of the harm their products cause yet conceal the evidence.

He established the SMVLC in 2021 after sensing a shift in judicial attitudes toward Section 230, prompted by the Ninth Circuit’s ruling in Lemmon v. Snap and by Facebook whistleblower Frances Haugen’s congressional testimony on risks to youth mental health. A serious car accident that same year deepened his resolve. Laura Marquez-Garrett, a Harvard-trained corporate litigator, joined him early on, taking a pay cut because she believed in the mission of representing affected families and pursuing justice both in court and in public.

One poignant case involved Selena Rodriguez, a Connecticut girl who became addicted to Snapchat at age nine. According to her mother, Selena grew violent when separated from her phone and sought out WiFi access outdoors. She was exposed to sexual predation on the platform, became suicidal, and died of an overdose at 11. Her mother contacted the SMVLC soon after, becoming one of its first clients to file suit against Snap, Meta, and TikTok, and has lobbied for safer policies such as the Kids Online Safety Act. Bergman notes that when Meta CEO Mark Zuckerberg publicly apologized to families for social media’s harms during a congressional hearing, it was his clients he was addressing.

Selena’s case is part of an ongoing federal multidistrict litigation aimed at holding tech companies accountable and pushing changes that would reduce teenage addiction and other online risks. Bergman challenges the industry’s effective exemption from normal product liability standards: every other company, he argues, must answer for defective designs, so why should social media be an exception?

For Selena’s mother, involvement in the litigation helped channel her grief constructively, redirecting blame toward the companies rather than her family. She and other SMVLC clients, who typically are not seeking monetary damages, stress that their focus is accountability and preventing future tragedies.

Risks
  • The broad immunity granted by Section 230 continues to pose a significant legal barrier against holding social media platforms liable for user-generated content.
  • Judicial precedent is uncertain on whether product-liability theory can be successfully applied to intangible digital platforms.
  • Critics argue that imposing liability on social media companies for all online harms may create unmanageable legal risks and unintended consequences.
  • Companies have consistently denied wrongdoing and declined to comment on ongoing litigation, signaling potentially protracted legal battles.
  • The outcome of these multidistrict litigations remains unresolved, adding unpredictability to the prospects of reform through courts.
  • Some experts remain skeptical about whether legal strategies alone can effect substantial changes in the social media business model.
  • There is a possibility of prolonged litigation delaying any remedial actions that could protect vulnerable youth in the short term.
  • The complexity and novelty of these cases could lead to inconsistent rulings across jurisdictions affecting legal clarity and enforcement.