They Knew: What Meta's Internal Research Actually Shows About Your Children
- Brad Sorte

- Feb 25
- 6 min read
Updated: Feb 26
31 internal studies. Two executives on the stand. And a database that changes the argument.
KEY TAKEAWAYS
- This week, Mark Zuckerberg and Adam Mosseri testified under oath in Los Angeles — the first time either has defended their products before a jury.
- In January 2026, Jonathan Haidt's team at NYU Stern launched metasinternalresearch.org, compiling all 31 known internal Meta studies on youth mental health in one public database.
- Meta's own research found that 13% of teens aged 13-15 receive unwanted sexual advances on Instagram every week.
- One in three teen girls who struggle with body image said Instagram made it worse — Meta's internal slide said so in 2019.
- Meta ran a deactivation experiment showing that stopping social media use caused depression, anxiety, and loneliness to drop — then kept the findings internal.
- In 2024, Zuckerberg told the Senate under oath that science had not found a causal link between social media and youth mental health harm. His own researchers had already found one.

This week, for the first time in history, the people who built Instagram are being forced to defend what they built before a jury.
Adam Mosseri, head of Instagram, took the stand in Los Angeles Superior Court on February 11.
Under questioning, he testified that he did not consider 16 hours of daily Instagram use by a teenager to be addiction. He called it "problematic use."
When asked about internal documents showing Instagram had set a goal of growing average teen engagement to 46 minutes per day by 2026, he acknowledged the targets existed.
Mark Zuckerberg followed on February 18. It was his first time testifying about child safety before a jury. When the plaintiff's attorney asked whether a company that knows it does more harm than good should reexamine its values, Zuckerberg answered: "I think that's probably right, yes."
The case centers on a 20-year-old woman identified in court as Kaley, who began using Instagram at age nine. She alleges the platform's design contributed to anxiety, depression, and body dysmorphia. The outcome could affect more than 1,500 similar lawsuits.
The database that makes the trial legible
In January 2026, Jonathan Haidt's Tech and Society Lab at NYU Stern launched metasinternalresearch.org.
The site compiles all 31 known internal Meta studies on youth mental health — sourced through whistleblowers and Attorney General discovery — in one place for the first time.
I've read through it. What it contains is not advocacy. It is a record of what Meta's own researchers found, what they reported internally, and in several cases, how leadership responded.
"Instagram hosts the largest-scale sexual harassment of teens to have ever happened."

Meta's Bad Experiences and Encounters Framework (BEEF) survey, conducted in 2021, found that 13% of Instagram users aged 13 to 15 receive unwanted sexual advances on the platform every week. Eight percent of the same age group encounter self-harm or suicide content weekly.
The researcher who oversaw the survey, Arturo Béjar, briefed Meta executives on the findings before testifying before Congress in 2023. His characterization of what the data showed: "Instagram hosts the largest-scale sexual harassment of teens to have ever happened." These characterizations come from internal testimony and research summaries, not from Meta’s formal public disclosures.
A 2019 internal presentation summarized research on teen girls in a single line: "We make body image issues worse for one in three teen girls."
More than 40% of Instagram users who reported feeling unattractive said the feeling started on the app.
Project Mercury, a deactivation study Meta ran in partnership with Nielsen, randomly assigned users to stop using Facebook and Instagram for one week. The researchers found that those who stopped reported lower depression, lower anxiety, and less social comparison. Meta's own team concluded: "The Nielsen study does show causal impact on social comparison."
After the results came in, one Meta employee asked internally: "If the results are bad and we don't publish and they leak, is it going to look like tobacco companies doing research and knowing cigs were bad and then keeping that info to themselves?"
The results have leaked.
The trial is happening now.
What Zuckerberg said under oath in 2024
On January 31, 2024, Zuckerberg testified before the U.S. Senate. Under oath, he said: "Mental health is a complex issue, and the existing body of scientific work has not shown a causal link between using social media and young people having worse mental health outcomes."
At the time he delivered that sentence, Meta had conducted at least 31 internal studies examining exactly that question. Some used experimental designs — the highest standard of scientific evidence, the kind that establishes causation. Those experiments found that when users stopped using Instagram and Facebook, their depression, anxiety, and loneliness went down.
Meta's own researchers had found the causal link Zuckerberg told the Senate didn't exist.
What I recognize in the pattern
I've worked in behavioral health for nearly two decades, watching addiction cycles from the inside.
When I read through the internal documents at metasinternalresearch.org, I recognize the sequence immediately:
1. A profitable product causes harm to a meaningful percentage of its users.
2. Internal research documents the harm.
3. Leadership receives the findings, calculates what to do with them, and the research stays internal.
4. Public statements emphasize complexity and the need for more study.
5. The product continues to optimize for engagement.
This is the tobacco playbook.
It is the opioid playbook.
Historically, the gap between internal corporate knowledge and public accountability has taken decades to close.
What distinguishes Meta from earlier industries like tobacco is the granularity of the data available to it. Tobacco companies had population-level statistics: mortality rates, addiction prevalence, and long-term health outcomes.
Meta, by contrast, collected behavioral log data at an extraordinary scale. Internal systems tracked what users viewed, how long they paused, what content they engaged with, and which features increased time on the platform. Internal research summaries indicate that company analysts examined how certain types of content correlated with negative social comparison, anxiety, and other mental health concerns among some youth populations.
Internal researchers identified patterns consistent with harm for particular groups of children and adolescents. Publicly, however, the company has emphasized that the broader body of research shows mixed outcomes, including reports of connection and social benefit for some teens.
The dispute is not whether every adolescent is harmed. It is whether internal findings about identifiable risk patterns were adequately addressed, disclosed, or acted upon.
Much of this record comes from internal documents surfaced by whistleblowers and later obtained through state investigations. The archive compiled at metasinternalresearch.org provides the most comprehensive public summary of those internal materials currently available.
The question of benefit
The picture that emerges from Meta's internal research is not uniformly negative. Some studies found that many adolescents report feeling more connected, supported, and seen through social media use. These findings are real. Meta has cited them in its public responses, and they should not be dismissed.
The relevant question is not whether social media produces some benefit for some users.
The relevant question is: what standard should we apply to products used daily by hundreds of millions of children? A pharmaceutical company that found its drug caused serious harm in 10 to 30 percent of pediatric patients would not be permitted to continue marketing it by pointing to the patients who weren't harmed. The standard for products used by children is different, and should be.
Meta's internal researchers understood this. Some of them documented it. Their findings went into presentations, memos, and eventually into the discovery materials for state Attorney General lawsuits working their way through the courts right now.
The families we work with
The families we work with at YES are downstream of all of this. Adolescent girls with body image disorders who can trace the onset of their symptoms to their use of Instagram. Boys whose capacity for sustained attention has been so thoroughly conditioned by algorithmic feeds that anything slower feels like deprivation. Parents trying to understand how an app that seemed like a free social tool became the dominant psychological fact of their child's life.
These families are not statistics. They are the people on the other side of the knowledge that was generated, briefed to executives, discussed in internal meetings, and then managed as a narrative problem rather than a clinical one.
Conclusion
Internal research and external academic studies alike describe a complex picture. Some adolescents report increased connection and belonging through social platforms, while others experience measurable increases in anxiety, depression, and body image distress. The dispute at trial centers not on whether every teen is harmed, but on what the company knew about risk, and how it chose to respond.
The tobacco companies knew. The opioid manufacturers knew. The social media companies know.
The trial in Los Angeles is ongoing.
The database is public.
For any parent who wants to understand what the companies actually knew — and when — metasinternalresearch.org is where to start.
~Brad Sorte, MSW, MBA

Sources
metasinternalresearch.org — Tech and Society Lab, NYU Stern (launched January 13, 2026; last updated February 18, 2026)
Béjar, A. (2023, November 7). Testimony before the U.S. Senate Judiciary Committee. judiciary.senate.gov
Wells, G., Horwitz, J., & Seetharaman, D. (2021, September 14). "Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show." Wall Street Journal.
School Districts v. Social Media (2025, November 21). Lieff Cabraser Heimann & Bernstein. lieffcabraser.com
NBC News (2026, February 18). "Mark Zuckerberg grilled about underage Instagram users, social media addiction during landmark trial."
ABC News (2026, February 18). "Mark Zuckerberg takes the stand in landmark trial over social media addiction claims."