
Medieval Institutions, Shattered.

  • Writer: Brad Sorte
  • Mar 26
  • 3 min read

The legal framework that protected Big Tech for 30 years just collapsed.


By Brad Sorte, MSW, MBA


This is Part 2. If you haven't read They Knew, start there.



In just 48 hours, the legal shield protecting Big Tech didn't crack. It shattered.

Between Tuesday and Wednesday this week (March 24-25, 2026), two separate juries on opposite sides of the country did something no jury had ever done before. They reviewed the evidence, weighed Silicon Valley's internal research against its public statements, and found the companies liable.


The first blow: $375 million in New Mexico. A jury found Meta liable for violations of the state's Unfair Practices Act. They ruled that Meta concealed what it knew about child sexual exploitation, hid evidence of harm to children's mental health, and engaged in what the jury called "unconscionable" trade practices.


The second blow: $6 million in Los Angeles. On Wednesday, a Los Angeles jury returned a landmark verdict in the KGM case. They found both Meta and YouTube liable for negligence.


For the first time, a jury ruled that features like infinite scroll and autoplay were not just annoying; they were defective by design.

Words used by the juries

"Unconscionable. Negligent. Malicious." These words have specific legal meanings.

They also happen to be exactly the right ones.


What the juries actually found

This wasn't a finding that Meta or Google made a mistake. It wasn't a finding that they moved too slowly.

The juries found that these companies knew. That they concealed what they knew. And that they deliberately exploited children's vulnerabilities to drive engagement. In the LA trial, the jury went a step further, finding that the companies acted with "malice, oppression, or fraud."


The state's attorneys in New Mexico put it plainly:

"The output is meant to be engagement and time spent for kids."

Not connection. Not community. Engagement. The metric that feeds the algorithm and keeps a 13-year-old on the platform for another 47 minutes when she should be asleep.


Why this week changed everything

There have been settlements before. Regulatory fines. Consent decrees.

This week is different for three reasons.

  1. The product is the problem. The LA verdict is the first to hold tech giants responsible not for the content they host, but for the code they wrote. By focusing on addictive design, lawyers successfully bypassed Section 230 protections that have shielded these companies for thirty years.


  2. Two losses in 48 hours shatter the industry's image of invincibility. It proves that whether it's a consumer protection case in New Mexico or a personal injury case in LA, juries are no longer buying the neutral-tool defense.


  3. The LA case is a bellwether for over 1,600 pending lawsuits from families and school districts. This verdict provides a roadmap for every one of them.


What this means for families

I want to be direct with the families who read this:

The verdicts do not fix the harm already done. Your child's anxiety, sleep disruption, or body dysmorphia did not disappear this week. A damages award doesn't recover what was taken.


What it does is confirm the instinct many of you had:

  • Something seemed wrong because it was designed to be wrong.

  • The platforms were not the neutral tools their defense claimed they were.

  • They were designed to hold attention at the expense of well-being, without your child's best interests anywhere in the architecture.


The accountability era has accelerated

This week is a threshold.


The industry operated for three decades under a legal framework that largely shielded it from the consequences of its products. That framework is being dismantled, one jury at a time.

What comes next is a second phase in New Mexico on May 4, where a judge will decide on public nuisance claims that could force Meta to fundamentally change how its platforms operate. California's attorney general has an August trial scheduled. More than 40 state attorneys general have similar suits pending.

The argument from Silicon Valley that parents simply needed better digital literacy just became significantly harder to make. $381 million says otherwise.


Brad Sorte is co-founder of YES Family Consulting. He spent more than a decade as President and CEO of Caron Treatment Centers and works at the intersection of behavioral health, family systems, and emerging technology. If your family is navigating the impact of social media on a young person's mental health or recovery, YES can help. consultyes.com


Sources:

  • Meta and YouTube found liable in landmark social media addiction trial — CBS News (March 25, 2026)

  • Jury awards $6M in first-of-its-kind California addiction case — Los Angeles Times (March 25, 2026)

  • New Mexico Jury Orders Meta to Pay $375 Million — New York Times (March 24, 2026)
