Last summer in Manhattan, Matthew Bergman, founder of the Social Media Victims Law Center (SMVLC), sat outside a courthouse preparing to represent Norma Nazario. Nazario’s 15-year-old son Zackery had died earlier that year while "subway surfing," a dangerous stunt, on a moving Brooklyn-bound J train. His mother contends that social media algorithms on platforms like TikTok and Instagram influenced that behavior; Bergman’s lawsuit asserts that the platforms "targeted, goaded and encouraged" Zackery to take such risks because of their "unreasonably dangerous design." Inside, as sunlight filtered through the courtroom windows, Bergman wrote "230" on his legal pad and circled it, a reminder of the obstacle that Section 230 of the Communications Decency Act poses to holding platforms accountable.
Section 230, enacted in 1996, grants online platforms broad immunity from liability for content created by their users, making it difficult to hold them legally responsible for harms tied to that content. For years, parents, lawyers, and mental health experts have criticized the statute for blocking efforts to hold social media companies accountable for harm to their users.
In recent years, Bergman has emerged as a leading attorney representing families whose children have suffered serious consequences tied to social media use. His clients include parents of minors who died by suicide or drug overdose, children allegedly groomed and sexually exploited by online predators, and youths who developed severe anorexia after exposure to online content. Last week, the SMVLC filed seven lawsuits against OpenAI, three of them involving individuals reportedly urged toward suicide through interactions with ChatGPT. (OpenAI did not respond to requests for comment.)
Bergman’s strategic innovation in these cases centers on sidestepping Section 230 protections by framing social media companies as manufacturers of defective products rather than mere hosts of user-generated content. He pursues claims of negligence in platform design and allegations that companies have misrepresented the safety of their offerings. This product-liability approach has laid the foundation for thousands of cases Bergman has lodged in state and federal courts nationwide, suing major platforms like Instagram, TikTok, and Snapchat. (Meta, TikTok, and Snap declined to comment.)
Product liability lawsuits against social media companies are not unprecedented: similar cases against Myspace in 2009 and Grindr in 2017 were dismissed on Section 230 grounds, and the success of Bergman’s approach remains uncertain. Nonetheless, he has become a prominent figure and rallying point for families seeking justice, now representing more than 4,000 clients. Through these cases, Bergman is helping turn a complex legal battle into a broader movement for digital safety reform.
Previn Warren, co-lead counsel in a multidistrict federal case involving over 1,300 of Bergman’s clients, credits Bergman with raising awareness and uniting affected families into a formidable political lobbying force. Approximately 1,500 of Bergman’s suits have survived initial Section 230 challenges and are advancing as multi-plaintiff actions in various courts.
Experts observe that while Section 230 was once thought to be an almost insurmountable obstacle, recent cases indicate vulnerabilities in its protections. University of Virginia law professor Danielle Citron remarks that there is now a visible "chink in that armor." Critics like Eric Goldman of Santa Clara University, however, caution that applying product liability concepts to intangible digital platforms stretches the legal theory too far: platforms cannot feasibly prevent every online harm, and extending liability this way could expose them to unmanageable legal risk.
Bergman has achieved a seven-figure settlement in only one social media case to date. He emphasizes, though, that his clients prioritize systemic safety reforms over financial compensation, aiming to compel tech companies to change design practices and safeguard children more effectively. "Kids are dying every day," Bergman asserts. "This is not simply a question of seeking recompense for past wrongs. This is a moral crusade to stop the killing."
Outside the courtroom, Bergman’s surroundings reflect his eclectic interests: a pre-war Upper East Side residence adorned with Picasso prints and ancient artifacts. His legal career spans decades, including years spent litigating asbestos cases on behalf of severely ill clients, for whom he recovered over $1 billion. He draws a parallel: in his view, social media companies are similarly aware of the harm their products cause yet conceal the evidence.
He established the SMVLC in 2021 after sensing shifts in judicial views on Section 230 following a notable Ninth Circuit ruling in Lemmon v. Snap and Facebook whistleblower Frances Haugen’s congressional testimony on youth mental health risks. A serious car accident that same year heightened his resolve. Laura Marquez-Garrett, a Harvard-trained corporate litigator, joined him early on, accepting a pay cut because she believed strongly in the mission of representing affected families and pursuing justice both in court and in public.
One poignant case involved Selena Rodriguez, a Connecticut girl who became addicted to Snapchat at age nine. According to her mother, Selena grew violent when separated from her phone and would go looking for WiFi outside the home. She was exposed to sexual predation on the platform, became suicidal, and died of an overdose at 11. Her mother contacted the SMVLC soon after, becoming one of its first clients to file suits against Snap, Meta, and TikTok, and has lobbied for safer policies like the Kids Online Safety Act. Bergman notes that when Meta CEO Mark Zuckerberg publicly apologized for social media’s harms to families, it was his clients he was addressing.
Selena’s case is part of ongoing federal multidistrict litigation aimed at holding tech companies accountable and prompting changes that reduce teenage addiction and online risks. Bergman challenges the industry’s exemption from normal product liability standards, arguing that every other company must answer for defective designs and questioning why social media should be an exception.
For Selena’s mother, involvement in the litigation helped channel her grief constructively, redirecting blame toward the companies rather than her family. She and other SMVLC clients, who typically do not seek monetary damages, underscore their focus on accountability and on preventing future tragedies.