In a significant legal development, social media giants Meta, TikTok, and Google-owned YouTube are set to face a court trial in Los Angeles over the impact of their platforms on adolescent mental health. The case marks the first time these companies will have to defend in court allegations that their products contain features intentionally designed to addict young users, causing serious psychological harm.
The lawsuit was filed by a 19-year-old identified as KGM, alongside her mother, Karen Glenn, and accuses the social media firms of deliberately engineering addictive user experiences. Specifically, the complaint asserts that these addictive features caused a deterioration in KGM's mental health, including self-harm and suicidal ideation. Snap Inc., also named in the original suit, recently settled on undisclosed terms and is no longer a defendant in this trial.
The concerns voiced by parents, health professionals, advocacy groups, tech-industry whistleblowers, and affected youth have persisted for years. They cite social media as a catalyst for compulsive scrolling, exposure to cyberbullying, disrupted sleep, and consumption of harmful content. Although technology executives have at times testified before Congress and issued public apologies, particularly in response to tragedies linked to platform use, little has been done to impose regulatory constraints in the United States.
KGM's lawsuit seeks an unspecified amount in damages, but its significance extends further. The trial's verdict could influence the resolution of approximately 1,500 other personal injury lawsuits claiming similar mental health harms from products made by Meta, Snap, TikTok, and YouTube. The proceedings, expected to last several weeks, will include testimony from senior executives of the companies involved.
In recent years, these companies have introduced various safety mechanisms, including parental controls and revised policies intended to protect minors. Nonetheless, they continue to contest mounting legal claims, including parallel actions filed by school districts and state attorneys general. Adverse judgments could compel the corporations to pay billions of dollars in damages and to significantly alter how their platforms work.
Sarah Gardner, CEO of the Heat Initiative—a nonprofit committed to protecting children online—characterized the upcoming trial as a critical step in achieving accountability for technology firms following years of perceived neglect. She compared the current litigation to historical tobacco-related lawsuits, emphasizing that families nationwide will hear directly from tech executives about the purposeful design choices promoting youth addiction.
Details of the KGM Case
The lawsuit claims that the platforms were built with features that drive compulsive use despite the companies' awareness of the risks to young audiences. KGM reportedly began using social media at age ten, circumventing her parents' efforts to restrict access with third-party software. The complaint adds that the platforms made it easy for children to bypass parental consent mechanisms.
According to the lawsuit, Instagram, TikTok, and Snapchat employed "addictive design" elements and frequent notifications that encouraged compulsive use, coinciding with a decline in KGM's mental health. The platforms also allegedly connected KGM with strangers, including potentially predatory adults, through friend-recommendation features on Snapchat and Instagram.
The complaint further argues that content algorithms on Instagram and TikTok steered KGM toward depressive material and harmful social comparisons that affected her body image. It also cites instances of bullying and sextortion on Instagram, in which KGM was targeted until friends and family intervened by submitting reports to Meta, which acted only after a two-week delay.
The lawsuit asserts that the defendants' deliberate design and operational choices inflicted severe emotional and mental distress, including dependency, anxiety, depression, self-injury, and body dysmorphia, harming both KGM and her family. The case serves as a bellwether within a broader coordinated litigation encompassing hundreds of similar complaints.
Corporate Responses and Current Safety Measures
In 2024, then-US Surgeon General Vivek Murthy publicly urged lawmakers to require tobacco-style warning labels on social media platforms, citing a mental health crisis among youth. Reinforcing those concerns, Pew Research Center data show that nearly half of American teenagers believe social media has a predominantly negative effect on people their age.
Despite this, executives from these tech companies have dismissed allegations of harm, arguing that the available research remains inconclusive and emphasizing the entertainment and social connection their platforms provide. The companies also invoke Section 230 of the Communications Decency Act, which shields platforms from liability for user-generated content.
Judge Carolyn Kuhl of the Los Angeles Superior Court, who is presiding over these cases, has indicated that jurors should evaluate whether platform design features, such as infinite feeds, contribute to mental health harms separately from the content itself. Snap has emphasized that Snapchat's design, which centers on the camera rather than a traditional feed and lacks public like counts, differentiates it from other social media and reduces the risk of social comparison.
Each company has outlined various youth safety features: parental controls, message warnings to prevent sextortion, and removal protocols for inappropriate content. Meta has implemented "teen accounts" offering default privacy settings and content limitations, alongside AI tools to detect underage users.
A Meta spokesperson directed inquiries to a dedicated website addressing youth mental health litigation, asserting that the lawsuits misrepresent the company's ongoing commitment to teen safety. Meta claims it has responded to parents’ concerns and implemented numerous protections to ensure safe online experiences for young users.
YouTube, through spokesperson José Castañeda, categorically denied the allegations, highlighting its collaboration with youth, mental health, and parenting experts to develop age-appropriate experiences and robust parental controls. Its policies restrict access to violent or sexual content and use AI to identify underage users, and the platform recently introduced limits on short-form video feed scrolling for teenagers.
TikTok did not provide comment but has introduced default privacy protections for minors, disabled notifications during late-night hours, and launched a guided meditation feature aimed at reducing teens' time in the app.
Despite these efforts, families and child safety advocates remain concerned that the protections are inadequate. The forthcoming trial will be a critical test of those concerns and of how much responsibility tech companies bear for youth mental health.