A closely watched trial over alleged social media addiction begins Tuesday in California. Senior executives from major technology companies are expected to testify. The case could redefine how courts view digital platform responsibility.
The plaintiff is a 19-year-old woman identified only as KGM. She claims the platforms' algorithms caused her addiction and damaged her mental health. She says their designs encouraged compulsive use throughout her teenage years.
The defendants are Meta, which owns Instagram and Facebook; ByteDance, which owns TikTok; and Google, the parent of YouTube. Snapchat reached a settlement with the plaintiff last week. The remaining companies now face trial.
The case will proceed at Los Angeles Superior Court. Legal experts see it as the first of many similar lawsuits. These cases could weaken a long-standing legal defence used by technology firms.
Judges and jurors assess platform design choices
The companies argue the evidence fails to prove they are responsible for the plaintiff's depression or eating disorders. They deny any direct link between their products and the alleged harms.
The decision to move forward signals a broader legal shift. Courts increasingly consider claims that digital products encourage addictive behaviour. Pressure on the technology sector continues to rise.
For decades, companies relied on Section 230 of the Communications Decency Act. Congress passed the law in 1996 to shield platforms from liability over user content.
This lawsuit turns on a different issue. It focuses on algorithms, notifications, and engagement features. These design choices shape how users interact with social media platforms.
KGM’s lawyer, Matthew Bergman, described the trial as a legal milestone. He said a jury will directly evaluate social media company conduct.
He said many young people around the world suffer similar harm. He accused companies of prioritising profits over children’s wellbeing.
Legal stakes rise for technology companies
Eric Goldman, a law professor at Santa Clara University, warned the risks are significant. He said court losses could threaten the companies’ long-term future.
He also highlighted difficulties for plaintiffs. Courts rarely connect psychological harm directly to content publishers.
Still, he said, these lawsuits open new legal ground. Existing laws never anticipated claims centred on digital product design.
Internal records and top executives under scrutiny
Jurors will hear extensive testimony during the trial. They will also examine internal company documents.
Mary Graw Leary, a law professor at Catholic University of America, expects major disclosures. She said companies may reveal information long hidden from public view.
Meta previously said it introduced dozens of safety tools for teenagers. Some researchers question the effectiveness of those measures.
The companies plan to argue third-party users caused the alleged harm. They deny their designs directly injured young people.
Meta chief executive Mark Zuckerberg is scheduled to testify early in the trial. His appearance is one of the most anticipated moments of the proceedings.
In 2024, Zuckerberg told US senators that scientific studies had shown no proven causal link between social media use and worse youth mental health.
During that hearing, he apologised to victims and their families. Lawmakers questioned him during emotional exchanges.
Worldwide pressure on social media firms grows
Mary Anne Franks, a law professor at George Washington University, questioned the strategy of putting executives on the witness stand. She said technology leaders often struggle under intense questioning.
She added companies hoped to avoid putting top executives on the stand. Public testimony carries serious reputational risks.
The trial comes as global scrutiny continues to intensify. Families, school districts, and prosecutors increasingly challenge social media practices.
Last year, dozens of US states sued Meta. They accused the company of misleading the public about platform risks.
Australia has banned social media use for children under 16. The UK signalled in January it may adopt similar measures.
Franks said society has reached a turning point. She argued governments no longer grant the technology industry automatic deference.
