BY COMFORT OGBONNA
Meta Platforms and YouTube deliberately designed their products to addict children, a lawyer told jurors in California on Monday, as a closely watched trial began that could reshape how Big Tech companies are held responsible for their platforms' effects on young users. The case is widely seen as a major test of whether social media and video-sharing companies can be held legally liable for the way their apps are designed, rather than for the content posted by users.
The lawsuit was brought by a 20-year-old woman identified in court as Kaley G.M., who is suing Meta Platforms, the parent company of Facebook and Instagram, and Google, which owns YouTube. Kaley alleges that she became hooked on social media at a young age due to intentionally addictive features built into the platforms, and that prolonged exposure worsened her mental health.
Addressing jurors, Kaley’s lawyer, Mark Lanier, argued that internal company documents show the platforms were engineered to exploit how children’s brains develop. He said the companies knowingly built systems designed to keep young users engaged for as long as possible, despite being aware of the potential harm. According to Lanier, these designs were not accidental but the result of deliberate choices aimed at maximizing user attention and, ultimately, profits.
Meta pushed back strongly against those claims. In his opening statement, Meta attorney Paul Schmidt pointed to Kaley’s personal history, telling jurors that her medical records show a background of verbal and physical abuse and a difficult family environment following her parents’ divorce when she was three years old. Schmidt questioned whether removing Instagram from her life alone would have fundamentally changed her outcome, arguing that other factors played a far greater role in her struggles.
YouTube’s legal team is expected to deliver its opening statement on Tuesday. Both Meta and Google have denied the allegations, maintaining that their platforms are not responsible for Kaley’s mental health issues and that they have taken steps to improve safety, particularly for younger users.
The stakes of the case are high. A verdict against the companies could make it easier for similar lawsuits to move forward in state courts and potentially weaken the technology industry’s long-standing legal defenses in the United States. Meta, Google, TikTok, and Snap currently face thousands of related lawsuits in California alone, many of them accusing the platforms of being harmful by design.
Meta Platforms CEO Mark Zuckerberg is expected to testify during the trial, which is anticipated to run into March. TikTok and Snap settled with Kaley before the case went to trial, leaving Meta and Google as the sole remaining defendants. Kaley herself is also expected to take the stand, where she will describe how she believes the apps fueled her depression and suicidal thoughts.
Her legal team plans to argue that the companies were negligent in designing their platforms, that they failed to adequately warn users and parents about the risks, and that their products were a substantial factor in her injuries. If jurors agree, they will then decide whether to award damages for pain and suffering and whether punitive damages are warranted.
Meta and Google are expected to defend themselves by highlighting safety features aimed at young users, emphasizing parental controls, and arguing that they cannot be held responsible for harm caused by content uploaded by others. Los Angeles Superior Court Judge Carolyn Kuhl, who is overseeing the case, has instructed jurors that the companies cannot be held liable for recommending third-party content, but only for their own design and operation of the platforms.
U.S. law, most notably Section 230 of the Communications Decency Act, has historically shielded internet companies from liability for user-generated content. If jurors in this case effectively sidestep those protections by finding the companies liable for design-related harm, it could open the door to a new wave of litigation claiming social media platforms are inherently dangerous because of how they function, not what they host.
The trial comes amid a wider legal and political backlash against social media companies. In federal court, more than 2,300 similar lawsuits have been filed by parents, school districts, and state attorneys general. A judge overseeing those cases is currently weighing how far liability protections extend, with the first federal trial potentially beginning as early as June.
Legal pressure is also mounting at the state level. On Monday, a jury in Santa Fe, New Mexico, heard opening statements in a case accusing Meta of profiting from its platforms while exposing children and teenagers to sexual exploitation and harming their mental health. An attorney for the New Mexico attorney general told jurors that Meta misrepresented the safety of its platforms while privately knowing about the risks. Meta’s lawyers countered that the company has made extensive efforts to protect users and has consistently warned about harmful content.
Beyond the United States, governments are increasingly moving to curb children's access to social media. Australia and Spain have moved to bar users under 16 from such platforms, and several other countries are considering similar restrictions, reflecting growing global concern over the impact of social media on youth mental health.