By Dawn Chmielewski, Courtney Rozen and Jody Godoy
LOS ANGELES, March 25 (Reuters) – Alphabet’s Google and Meta were found liable on Wednesday for designing platforms that are dangerous for teens, in a landmark verdict that could force tech firms to rethink how they defend themselves against safety claims.
The verdict could mark a turning point in the global backlash against social media platforms' perceived mental health harms to kids and teens, more than two decades after the emergence of social media.
Punitive damages for the companies will be decided next. The jury may consider whether Google or Meta’s products caused the plaintiff physical harm or whether the companies disregarded the health of other users, Judge Carolyn Kuhl said in court.
The case involves a 20-year-old woman, known in court only by her initials, who said she became addicted to Google’s YouTube and Meta’s Instagram at a young age because of their attention-grabbing design. The jury found Google and Meta were negligent in the design of both apps and failed to warn about their dangers.
“Today’s verdict is a referendum — from a jury, to an entire industry — that accountability has arrived,” the plaintiff’s lead counsel said in a statement.
Shares of Meta were up nearly 1%, while Alphabet shares were slightly lower; both were little changed after the verdict.
Meta disagrees with the verdict and its lawyers are “evaluating our legal options,” a company spokesperson said. Google plans to appeal, said company spokesperson José Castañeda.
The plaintiff in the Los Angeles proceeding focused on platform design rather than content, making it harder for the companies to avert liability.
Snap and TikTok were also defendants in the trial. Both settled with the plaintiff before it began. Terms of the agreements were not disclosed.
MOUNTING CRITICISM
Large technology companies in the U.S. have faced mounting criticism in the last decade over child and teen safety. The debate has now shifted to courts and state governments. The U.S. Congress has declined to pass comprehensive legislation regulating social media.
At least 20 states enacted laws last year on social media usage and children, according to the nonpartisan National Conference of State Legislatures, an organization that tracks state laws.
The legislation includes bills that regulate the use of cellphones in schools and require users to verify their ages to open a social media account. NetChoice, a trade association backed by tech companies such as Meta and Google, is seeking to invalidate age verification requirements in court.
A separate social media addiction case brought by several states and school districts against technology companies is expected to go to trial this summer in federal court in Oakland, California.
Another state trial is slated to begin in Los Angeles in July, said Matthew Bergman, one of the attorneys leading the cases for the plaintiffs. It will involve Instagram, YouTube, TikTok and Snapchat.
Separately, a New Mexico jury on Tuesday found Meta violated state law in a lawsuit brought by the state’s attorney general, who accused the company of misleading users about the safety of Facebook, Instagram and WhatsApp and of enabling child sexual exploitation on those platforms.
TRIAL ARGUMENTS
At trial, the plaintiff’s lawyers sought to show Meta and Google intentionally targeted kids and made decisions that put profit over safety. Meta’s attorneys emphasized the plaintiff’s difficult home life as a child as the cause of her mental health struggles, while YouTube argued her usage of the streaming platform was minimal.
Jurors saw internal documents revealing how Meta and Google sought to attract younger users, and heard executives, including Meta CEO Mark Zuckerberg, take the stand last month to defend company decisions.
When asked about Meta’s decision to lift a temporary ban on beauty filters that some inside Meta warned could be harmful to teen girls, Zuckerberg said he decided to let users express themselves.
“I felt like the evidence wasn’t clear enough to support limiting people’s expression,” he said.
How free speech and content moderation factored into the companies’ decisions is likely to play a part in any appeal.
(Reporting by Dawn Chmielewski in Los Angeles, Courtney Rozen and Jody Godoy in Washington; Additional reporting by Katie Paul; Editing by Nia Williams, Rod Nickel)