Meta Liable: Jury Holds Instagram and YouTube Accountable for Teen Harm

A California jury just made history. After nine days of deliberations, they ruled that Meta and Google are liable for deliberately addicting a young woman to their platforms — starting when she was just six years old. It's the first time a jury has treated social media apps as defective products designed to exploit children's developing brains, and held the companies legally responsible for the harm.

The award: $6 million. And it could reshape everything — for the 2,000-plus similar lawsuits waiting in the wings, for how these companies operate, and for every parent who's ever watched their kid disappear into a phone screen and wondered if they were the only one who saw something was wrong.

You weren't. And now a jury of your peers is saying the same thing.


What the Jury Actually Decided

The plaintiff — identified in court documents only as KGM, and referred to as "Kaley" by her legal team — first used YouTube at age six and Instagram at age nine. She grew up and, as an adult, took Meta and Google to court. She said their platforms were deliberately engineered to be addictive, and that the companies knew what they were doing to kids like her.

The jury agreed. On all counts.

Bottom Line: Meta bears 70% of the responsibility for Kaley's depression and anxiety. YouTube bears 30%. The jury awarded $3 million in compensatory damages and another $3 million in punitive damages — money meant specifically to punish companies for especially harmful behavior.

Both companies say they'll appeal.


The Internal Documents That Changed Everything

Here's what made this trial different from every other tech accountability fight. KGM's legal team didn't just argue that the apps were harmful — they showed the jury what was happening inside Meta's own walls.

Internal documents. Executive communications. Proof that the people running these companies knew exactly what they were building.

One document from 2018 summed it up in a single sentence: "If we wanna win big with teens, we must bring them in as tweens."

That's not an accident. That's a strategy. Meta was tracking how to hook kids before they were even teenagers — because their own internal research showed that 11-year-olds were four times as likely to keep returning to Instagram as to competing apps. By 2015, an estimated 30% of American 10-to-12-year-olds were already on a platform that wasn't supposed to let them in until they were 13.

Zuckerberg took the stand during the trial and denied that Instagram targets kids. The jury saw the documents and decided otherwise.

Bottom Line: This wasn't a case about bad parenting or screen-time habits. The evidence showed a deliberate corporate decision to target children — and executives who knew it and did it anyway.


Why This Verdict Is Bigger Than $6 Million

Six million dollars is nothing to Meta. The company made more than $164 billion in revenue in 2024 alone. They could write that check from the sofa cushions.

But that's not why this matters.

It matters because Kaley's case is the first of more than 2,000 similar lawsuits to go to trial — and her verdict could shape how all of them play out. School districts. Parents. Teens who've been through eating disorders, depression, self-harm. Cases brought by families who lost children entirely.

They've all been waiting.

And now they have something they didn't have before: a jury that looked at the evidence, heard from Zuckerberg himself, reviewed the internal documents, and said — yes. These companies built a defective product. And they should be held responsible for what it did.

Shelby Knox, director of online safety campaigns at the nonprofit ParentsTogether, put it plainly after the verdict came down. "For years, families have been told this was a parenting issue," Knox said, "but the jury saw the truth: these companies made deliberate decisions to prioritize growth and profit over kids' safety."

Bottom Line: This verdict doesn't just award one woman damages. It sends a message to every tech company that has ever made money off children's suffering: your internal documents can be evidence. Your executives can be cross-examined. And a jury of ordinary people can hold you accountable.


What Comes Next — And Why Congress Needs to Move

The companies will appeal. That's already guaranteed. And the appeals process could drag this out for years.

But the pressure doesn't stop in the courtroom.

More than 2,000 pending lawsuits are watching what happens with Kaley's case. Parents and school districts across the country have spent years arguing that these platforms should be treated like manufacturers of defective products — and now they have a verdict backing them up.

And there's a bigger fight: federal legislation. Right now, there is no comprehensive federal law protecting kids from these platforms' most predatory design features. The platforms have largely written their own rules, and those rules have been built around engagement metrics, not child welfare.

Congress has talked about this for years. They've held hearings. They've called tech CEOs to testify. And then — almost nothing happens.

This verdict should change the calculus. When juries start awarding millions in punitive damages and when 2,000 more cases are lined up behind this one, suddenly the cost of doing nothing gets very, very real.

Bottom Line: Litigation is doing the work that Congress has refused to do. But what we actually need is federal law — strong, enforceable rules that protect kids online, not just more settlements that get factored into the cost of doing business.


FAQ: What You Need to Know About the Social Media Addiction Trial

Did Meta and YouTube actually know their apps were hurting kids? Yes — that's what the jury concluded. Internal documents shown at trial revealed that Meta executives tracked engagement rates for children as young as 10 and 11, set goals to increase the time young users spent on Instagram, and wrote strategy documents about "bringing in" tweens. The jury found the companies were aware of the harm and failed to protect their youngest users.

What does it mean to call a social media app a "defective product"? It's a legal argument that reframes how we think about tech harm. Instead of saying "you chose to use this app," plaintiffs argue that the app was engineered with dangerous design features — like endless scroll, variable-reward notifications, and algorithms that push increasingly extreme content — that make it unreasonably harmful, especially for developing brains. This verdict is the first time a jury has agreed with that framing.

How much did Meta and YouTube have to pay? The jury awarded $6 million total: $3 million in compensatory damages (to make Kaley whole) and $3 million in punitive damages (to punish the companies). Meta is responsible for 70% of the damages, Google for 30%.

Will this verdict affect other lawsuits? Probably, yes. This trial is connected to more than 2,000 other pending lawsuits from parents and school districts. The outcome won't automatically determine those cases, but it gives other plaintiffs a roadmap — and a precedent that juries are willing to hold these companies accountable.

Can I sue Meta or YouTube if social media harmed my child? Potentially. You should speak with a personal injury attorney who specializes in social media litigation. Many firms are actively taking these cases. The landscape has shifted significantly after this verdict.


The Bottom Line: The Jury Saw What We Already Knew

For years, parents have been gaslit. Told they were overreacting. Told it was their job to manage their kids' screen time. Told that the platforms were neutral tools and any harm was a matter of personal responsibility.

But that's not what the internal documents said. And it's not what the jury decided.

Meta built Instagram to hook tweens. YouTube built algorithms designed to keep six-year-olds watching. These companies made deliberate choices — and they made those choices knowing what they would do to developing brains.

Now, finally, there's accountability. It's not enough. One verdict and $6 million is nowhere near enough. But it's a start. And 2,000 more cases are right behind it.

Share this story with every parent you know. Talk to your kids about what these platforms are built to do. And tell your representatives in Congress that it's way past time to pass real, enforceable protections for children online.

We've been patient long enough.