A landmark jury verdict went against social media companies in a teen mental health case.

A California court found Instagram and YouTube liable for a teenager's mental health crisis. As a father of a toddler, this one hit home.

Published:
Perspective
Author: Robin Anil, CTO and parent writing on children’s mental health and technology
Clinical Reviewer: Emora Health Clinical Team, Therapists & Clinical Reviewers

Key Takeaways

  • A jury concluded that platform design was a “substantial factor” in a young woman’s depression, anxiety, and body dysmorphia. That carries a different weight than a research paper.
  • These products were designed to be hard to put down. As a parent, if you’ve felt guilty handing your kid a tablet, know that the difficulty of pulling them away is by design. That’s on the companies, not on you.
  • I ask my fellow technologists to make children’s wellbeing a central pillar of their product designs.

My son is two. He loves Ms. Rachel, Blippi, and Sesame Street on YouTube Kids. He dances to the songs, repeats the words, and lights up when his favorite characters come on screen. It’s one of those things that makes you smile as a parent.

But we’ve also seen what happens on days when watch time creeps too high. The behavior shifts. He’s crankier, less patient, harder to redirect. We’ve started blocking channels and telling him that “Blippi went on vacation” just to wean him back. Sorry buddy, you’ll figure that one out eventually.

I share this because I think it’s important context for what I’m about to say. I’m the CTO of Emora Health. I build technology for children’s mental health for a living. And I’m still navigating the same screen-time struggles as every other parent out there.

This week, a jury in California made headlines that hit close to home.

What Happened

On March 25, 2026, a Los Angeles jury found Meta (Instagram) and Google (YouTube) negligent for designing platforms that contributed to a young woman’s depression, anxiety, and body dysmorphia. The plaintiff, Kaley, started using YouTube at age 6 and Instagram at 11. She is now 20.

The jury found “malice, oppression, or fraud” and awarded $6 million in damages: $3 million compensatory and $3 million punitive. Meta was held responsible for 70% of the total. Snapchat and TikTok had already settled before the trial began.

This is the first time a jury has held social media companies liable for the mental health impact of their platform design. Over 2,000 similar lawsuits are pending across the country, with a federal trial expected this summer.

This Verdict Landed Differently for Me

I’ve read the research. I follow the studies. But a jury examining the evidence and saying “yes, this caused harm” carries a different weight than a published paper. Courts are adversarial. Both sides present their best case. And twelve people looked at the evidence and concluded that the way these platforms were designed was a “substantial factor” in a young woman’s mental health deterioration.

The plaintiff started on YouTube at age 6. My son is already on YouTube Kids, and he’s two. That timeline is personal. I know YouTube Kids is not YouTube. I know the content he watches. Ms. Rachel teaching phonics, Elmo learning to share. That is not the algorithmic feed a teenager encounters. But the business model is the same: capture attention, hold it as long as possible, and feed it to an algorithm that optimizes for engagement.

The fact that I have to block channels and invent stories about why they vanished tells me something. Even at the toddler level, these products are designed to be sticky. And if I’m already managing that at two, I worry a lot about what the next decade looks like.

They Were Found Liable for How They Built It

This is the part that stays with me as someone who builds technology. The plaintiffs didn’t win by arguing about any specific piece of content. They won by focusing on design choices: recommendation algorithms, autoplay, infinite scroll, and notification systems. The argument was that these features were intentionally engineered to be addictive, especially for young users.

Internal Meta documents presented at trial included strategies like “If we wanna win big with teens, we must bring them in as tweens.” That’s not a product team responding to user demand. That’s a growth strategy targeting children.

I want to be clear: I’m not anti-tech. I built tech for the better part of two decades. I built substantial parts of Google Maps and Ads, built a reservation platform at Tock, and scaled an analytics platform at a hedge fund. I even used Claude to help tune this article. But this verdict draws a line that I think matters enormously. There is a difference between technology designed to hold a child’s attention and technology designed to support a child’s wellbeing. The features you choose to build, the metrics you optimize for, the way you handle a user who can’t stop scrolling. These are design choices. And when your users include children, every one of those choices carries clinical weight.

The Jury Said What Clinicians Have Been Saying

Depression, anxiety, and body dysmorphia are exactly the presentations that clinicians working with young people see every day. Many have been pointing to social media as an accelerant for years. This verdict gives legal weight to what has, until now, been clinical observation.

Jonathan Haidt documents this extensively in The Anxious Generation. When children's social lives moved onto smartphones and social media in the early 2010s, rates of depression and anxiety among young people surged. It's not a coincidence. It's a pattern with data behind it.

The numbers back this up. A recent JAMA Pediatrics study found that pediatric behavioral health spending nearly doubled as a share of total child health spending between 2011 and 2022, rising from 22% to 40%. Families paid $2.9 billion out of pocket for children’s mental health services in 2022. Families with a child receiving behavioral health services were 40% more likely to face extreme financial burden.

This crisis is not abstract. It’s in the therapy session and the family budget. What this verdict says is that at least part of it has structural, design-driven causes, and that the people who built those systems knew it.

What I Think About as a Parent After This Verdict

This verdict is validating. But it doesn’t change what’s on my son’s screen tomorrow. He’ll still ask for the “Blippi excavator” video in the morning. I think there’s something in this verdict that every parent should hear: the mental health crisis among young people has structural, design-driven causes.

It is not just “too much screen time.” It is not a parenting failure. These platforms were designed to be hard to put down. A jury confirmed that. If you’ve ever felt guilty about handing your kid a tablet to get through a car ride or a long afternoon, know that the difficulty of pulling them away is by design. That’s on the companies, not on you.

Right now, the best tools I have as a parent are blocking channels and making up stories about why they vanished. There is also a growing movement of parents delaying access to a phone. These shouldn’t be the state of the art. My dad gave my brother and me unfettered access to a Windows PC and the Internet in the 90s. I am forever grateful for that. I also recognize that technology and the internet are different now.

I don’t want to spend energy as a parent to counter the addictive design in today’s internet. I need platforms and tools to be designed with children’s wellbeing as the actual goal, not engagement. We need technology that supports the therapeutic relationship, not one that competes with it.

Haidt called this verdict the start of "a new era in the fight to protect children from online harms." I hope he's right. But I don't want to wait for more verdicts to build better. I ask my fellow technologists to make children's wellbeing a central pillar of their product designs.

The fact that a jury held these companies accountable means the conversation is shifting. And for those of us building technology in children’s mental health, it’s a reminder that how we build matters as much as what we build.

Litigation Is Not the Answer. Building Better Is.

Julia Angwin wrote in the New York Times that this could be Big Tech's "Big Tobacco moment," and that the solution is to raise the cost of bad behavior until companies are forced to change. I understand the impulse. People have been genuinely harmed by these platforms, and they deserve to be heard. But I don't believe negative reinforcement is the path forward. Anyone who has ever tried saying no to a child knows that. It doesn't change the behavior. It doesn't fix what's broken.

The tobacco analogy is seductive but incomplete. Cigarettes had one use case: you smoked them. Social media is woven into how we communicate, learn, and connect. My son learns words from Ms. Rachel. My family stays close across time zones because of video calls and shared photos. The goal should not be to make these platforms so expensive to operate that they disappear. The goal should be to make them safe enough that a parent doesn't have to lie to a two-year-old about where Blippi went.

Angwin argues that profit-motivated companies won't voluntarily make less money, so courts must force them. I've spent my career in technology. I know she's partly right. But I also know that the best products I've worked on were built by people who cared about the people using them. Litigation is slow. Appeals take years. Kids are growing up now. My son is growing up now. I'd rather see my fellow technologists lead this change than be dragged into it by a courtroom.

The answer is not making social media too expensive to exist. The answer is making children's wellbeing too important to ignore.

Frequently Asked Questions

Did this verdict prove that social media causes mental health problems?

The jury found that social media platform design was a substantial factor in one young woman’s mental health deterioration. It’s an important legal milestone, but it’s specific to this case. What it does confirm is that the way these platforms are built, the algorithms, the autoplay, the engagement loops, all of it can contribute to real psychological harm. The research has been pointing in this direction for years, and now a jury has agreed.

Should my teenager quit social media?

That’s a personal decision that depends on your child’s age, maturity, and how they’re using these platforms. What I’d encourage is having an open conversation with your child about how these platforms work. Understanding that the difficulty of putting the phone down is by design and not a personal weakness can be a genuinely helpful reframing for young people. If you’re concerned about your child’s mental health, a conversation with their pediatrician or a mental health professional is always a good starting point.

What can parents do right now?

Supervise their screen time and watch for behavioral changes on heavy-use days. Use the built-in parental controls (they’re imperfect, but they help). Talk openly with your kids about why these apps are hard to put down. And don’t carry guilt for the moments when screens are part of your day. The goal isn’t perfection. It’s awareness.

What happens next?

This is the first jury verdict holding tech companies liable for their platform design’s impact on youth mental health. With over 2,000 similar lawsuits pending and a federal trial expected this summer, the pressure on social media companies to redesign their products for safety is mounting. Both Meta and Google have said they will appeal, so this is far from settled. But the precedent that platform design choices can create legal liability is now real.
