Meta’s Child Sex-Trafficking Problem
A newly unsealed court filing shows the tech giant connected children to predators on a scale much larger than previously understood
Note (Dec. 5, 2025): This essay now includes a revised estimate related to child sex trafficking in the U.S., based on updated data and sourcing. For a full explanation of the change, please see the author’s note at the bottom of the essay.
We’ve long known that Facebook and Instagram are major hubs for sex and labor trafficking. In 2021, internal documents released by whistleblower Frances Haugen revealed, among other things, that Apple threatened to pull Facebook (now Meta) apps from the App Store because they were being used to trade and sell maids in the Middle East.
The company was able to assuage Apple’s concerns, insisting it was working on the problem and had appropriate policies in place. In general, Meta has maintained that it does not allow human trafficking on its platforms. But a newly unsealed court filing contains shocking details that undermine that claim.
The court filing shows that Meta leadership knowingly chose to expose teen users—particularly young women and girls—en masse to human traffickers and other child predators in order to protect its bottom line. These are quite possibly the most appalling revelations yet to emerge about the tech giant, and they come at a moment when Congress is considering a flurry of new bills to regulate Meta and other social media providers. They deserve the ear of every parent and every citizen in America.
The Hunt is On
The recently unsealed filing — part of a lawsuit brought against several social media platforms by school districts around the country — outlines in detail how Meta maneuvered aggressively from 2015 onward to maximize teen engagement despite major known safety risks.
Unless otherwise stated, all text in bold below directly quotes internal Meta documents unearthed during legal discovery. Unbolded quotations are from the filing as written by the plaintiffs.
The filing alleges that Meta’s leadership realized in 2015 that “the company was losing its youngest users — and treated that decline as an existential threat.” In response, and “at Zuckerberg’s direction, employees undertook a ‘lockdown sprint’ to launch Facebook Live in early 2016 as ‘the beachhead we need to expand into other use cases in videos and teens,’” with Zuckerberg himself “warning that notifying adults ‘will probably ruin the product from the start’ and instructing that the company ‘be very good about not notifying parents/teachers’” (pp. 15-16).
The launch of TikTok in 2018 led to a similar reaction. Meta considered the app “‘an existential threat’” and rushed to release Instagram Reels in response, despite knowing it would be doing so without adequate safety restrictions (p. 16).
Coincident with its years-long push to recapture the youth market, the company embarked on a no-holds-barred campaign to “embed Instagram and Facebook directly into school communities.” Among other things, this involved developing new technical capabilities to determine when teen users were at school, to infer which school they attended, and to target push notifications to students at specific schools in what it called “‘school blasts’” (pp. 17-18).
The filing also reveals that Meta paid the National PTA and Scholastic six-figure sums to conduct outreach on its behalf, a choice motivated by the perception, in the company’s own words, that these organizations could “‘get [their] materials into the hands of parents, grandparents, and educators at scale.’” It also details how the company presented Orwellian “‘safety roadshows’ at high schools across the country,” and that it “recruited and paid 13-17 year-old ‘teen tastemakers to act as [their] plug at … high schools’” in key markets (pp. 18-20).
Tinder for Pedophiles
If pushing questionably safe products on children and teens nationwide wasn’t bad enough, previously unknown internal research cited in the filing reveals that Meta did so despite the near certainty that this would mean putting large volumes of young people in harm’s way.
In particular, Meta’s recommendation algorithms put what the company internally calls “IIC violators”—IIC standing for inappropriate interactions with children—in direct contact with millions of minors. At one point in 2023, the Instagram feature “Accounts You May Follow” recommended “‘nearly 2 million minors’” to “adult groomers” in the previous three months. Twenty-two percent of those recommendations, in turn, “‘resulted in a follow request.’” The filing also discusses an internal 2022 audit, which found that Accounts You May Follow “recommended 1.4 million potential IIC violators to teenage users in a single day” (p. 53).
How do we know Meta could have stopped them? Because its own researchers recommended in 2019 that teen Instagram accounts be defaulted to private mode so that young users would not receive (among other things) “‘[u]nwanted messages that were sexual in nature’” (p. 55). But instead of implementing these changes the moment they understood children were in danger, Meta leadership asked its growth team what the “‘growth and engagement impacts’” of the recommended protections would be. Finding that the falloff in engagement would be too steep (“‘this will likely smash engagement, DAP, MAP, etc.’”; DAP and MAP are Meta’s terms for daily and monthly active people), Meta decided to put the issue off due to concerns over its bottom line.
The same question arose again a year later—still Meta did not act. By this point, researchers within the company had put together a more detailed proposal for defaulting teens to private accounts, which they called “Smart Defaults” (p. 55). As before, the growth team remained skeptical, determining that “a true private-by-default would result in a loss of 1.5 million monthly active teens a year” (p. 56). In response, Meta leadership decided to formally shelve private-by-default accounts, even as they acknowledged that “‘[a]ctors take advantage of our tools on Instagram to find and inappropriately engage with children,’” and that “placing teens into a default-private setting would have eliminated 5.4 million unwanted interactions a day over Instagram direct message” (p. 57).
Finally, due to increasing pressure from virtually every team within the company—with the crucial exception, again, of the growth team—Meta launched a watered-down version of the private-by-default feature in March 2021. It applied only to new users under the age of 16, although it did prevent adults from DMing minors who didn’t follow them.
In practice, the change made little difference. Child predators could easily circumvent it by claiming to be minors via Instagram’s voluntary age-identification settings. In July 2021, after the new feature launched, Meta found in an internally conducted survey that 13% of 13- to 15-year-olds had received “unwanted sexual advances on Instagram in the past seven days” (p. 58).
One would think that at a minimum Meta would deal swiftly with the most dangerous among this group — those looking to traffic minors for sex on its platforms. But no, and it’s here that things escalate from wildly irresponsible to truly depraved.
Perhaps the most stunning revelation from the November 21 filing is that the company, amid the extraordinary volume of contact it facilitated between children and predators, maintained a 17x strike policy for accounts engaged in “the ‘trafficking of humans for sex.’” That is according to former Instagram Head of Safety and Wellbeing Vaishnavi Jayakumar (and was also confirmed by internal documentation). As she explained in her deposition, “‘that means you could incur 16 violations for prostitution and sexual solicitation, and upon the 17th violation, your account would be suspended.’” “’By any measure across the industry,’” she continued, putting it mildly, “‘[that] is a very high strike threshold’” (pp. 61-62).
Jayakumar’s testimony accords with previous reporting by The Guardian, which, back in 2023, had already unearthed details like these surrounding Meta’s high tolerance for child sex trafficking on its platforms:
We talked to six other moderators who worked for companies that Meta subcontracted between 2016 and 2022. All made similar claims to Walker. Their efforts to flag and escalate possible child trafficking on Meta platforms often went nowhere, they said. “On one post I reviewed, there was a picture of this girl that looked about 12, wearing the smallest lingerie you could imagine,” said one former moderator. “It listed prices for different things explicitly, like, a blowjob is this much. It was obvious that it was trafficking,” she told us. She claims that her supervisor later told her no further action had been taken in this case.
In summary, from at least 2015 onward, Meta was knowingly exposing teen users to millions of interactions with suspected child predators in order to protect its bottom line—including sex traffickers who had been reported as many as 16 times. Meta leadership made a conscious decision to permit conditions on its platforms that it knew meant increasing traffickers’ access to minors.
Calculating the Cost
Nor did the decision’s consequences stop at inappropriate messages. What even some of Meta’s harshest critics may not know is that, over the last six years, Facebook and Instagram have been implicated as recruiting platforms in numerous federal sex trafficking cases — including those involving minors.
According to the Human Trafficking Institute’s 2023 Federal Human Trafficking Report, from 2019 to 2023 a remarkable ~23% of all sex trafficking victims in federal court cases whose recruitment locations could be identified were recruited on Facebook and Instagram (p. 63).1 On top of that, of those victims from 2019-2023 whose ages could be identified, just over half were minors (p. 39).
Those are just the cases caught and tried by federal authorities. While the true number of such incidents is notoriously hard to discern, several recent studies have attempted to estimate the “dark figure” of total sex trafficking victims by state, county, or city. Specifically, since 2015, this figure has been estimated for Florida, Texas, Sacramento County, CA, and Greater New Orleans. Two of these studies, those on Florida (in 2024) and Sacramento County (from 2015-2020), also collected data about time of victimization, making it possible to roughly estimate the number of victims per year, rather than only those victimized at any point in their lives. The average yearly sex trafficking prevalence reported across both studies was ~.95 victims per 1,000 people per year. While it is important to note that trafficking prevalence might vary substantially across other states and counties, applying the prevalence rate from what little evidence we have to the country as a whole suggests a nationwide victim count of ~323,100 people per year.2
If we in turn assume that victims are recruited in the same year they experience other trafficking-related abuses, and that, as in federal court cases from 2019-2023, roughly half of victims nationwide are minors and roughly 23% are recruited on Facebook and Instagram, that implies that ~37,150 minors — almost all young women and girls — are recruited per year by sex traffickers on Meta platforms.
To be clear: this is an extrapolation from the small sample of victims whose recruitment location was identified in federal court. Recruitment via Facebook and Instagram may be overrepresented in this group; it may also be underrepresented. For conservatism’s sake, let us assume it is overrepresented, and cut the 23% figure to 10%; cut it further, to 5%, to account for any remaining methodological ambiguities. That would still leave ~8,075 minors recruited by sex traffickers on Meta platforms every single year. And that is only in the United States.
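For readers who want to check the arithmetic, here is a minimal back-of-envelope sketch, written in Python purely for illustration. Every input is taken from this essay and its footnotes; nothing new is assumed.

```python
# Back-of-envelope reproduction of the estimates above.
# All inputs come from this essay and its footnotes.

us_population = 340_100_000   # U.S. population, 2024
prevalence = 0.00095          # ~.95 victims per 1,000 people per year (footnote 2)
minor_share = 0.5             # ~half of victims with known ages were minors
meta_share = 0.23             # ~23% recruited on Facebook/Instagram (215/949; footnote 1)

victims_per_year = us_population * prevalence                  # ~323,095
minors_via_meta = victims_per_year * minor_share * meta_share  # ~37,156
conservative = victims_per_year * minor_share * 0.05           # ~8,077 (share cut to 5%)

print(f"U.S. victims per year:          ~{victims_per_year:,.0f}")
print(f"Minors recruited via Meta/year: ~{minors_via_meta:,.0f}")
print(f"Conservative variant:           ~{conservative:,.0f}")
```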
While this figure is only a crude estimate, we can be confident that, whatever the true value is, it would be lower had Meta not consistently chosen to prioritize its profit margins over children’s safety. The company knew for years that minors were interacting with child predators on its platforms. It also knew that some of them were being trafficked. Still, it chose to accept this state of affairs in return for higher engagement. Meta leadership, including Mark Zuckerberg and Instagram head Adam Mosseri, made decisions that placed company profits above the safety of — conservatively — thousands of young women and girls. Legislators and parents must never forget that. It is a moral imperative that we hold Meta and its leaders accountable.
Too Little, Too Late
These are still only some of the disturbing new insights contained in the recent filing. We also learned that Meta buried internal research that found evidence of a causal relationship between the use of its platforms and mental illness (p. 27), that it refused to automatically delete content identified with 100% certainty as child pornography (p. 72), and still more that would take another whole article to cover in depth.
Perhaps angling to get ahead of future revelations like those in the late November filing, Meta in September 2024 launched “Instagram Teen Accounts,” which are set to private by default, along with other new protections.
While this is of course a welcome change, there is a small wrinkle: most of the new safety features don’t work. An independent analysis by former Meta engineer Arturo Béjar and Cybersecurity for Democracy, conducted from March through July 2025, found that of the 47 (of 53 total) new Teen Accounts safety features the researchers were able to test, 64% were either ineffective, meaning they could be trivially circumvented, or simply no longer available at all. Critically, that includes features meant to limit contact between teens and adults. Among other things, the report found that teens were still encouraged to follow adults they didn’t know, and that once they did, those adults could message them.
Since Meta’s renewed teen-engagement push in 2015, even the most conservative estimates imply some tens of thousands of minors have been recruited by traffickers on Meta’s platforms, a tragedy the tech giant had every chance to prevent, and still hasn’t substantively corrected. That they repeatedly chose not to, despite knowing the risks — that they were doing “safety roadshows” at schools, that they were recruiting “teen tastemakers” to hawk their products while all of this was going on — deserves to go down as one of the greatest acts of corporate wrongdoing in modern history, one belonging in the same camp as those of corporate tobacco producers, opioid sellers, chemical polluters, and climate deniers.
Meta has created a vast digital machine for putting the most defenseless among us in danger. What’s worse, that machine has worked its way into our homes, into our pockets, into our children’s bedrooms. But now that we know what Meta is, and that its actions are worse than we could have imagined, there can no longer be any excuse for inaction. Children should not be allowed on Meta platforms. So long as they are, thousands more will be delivered into the arms of traffickers.
Author’s note: after this piece was published, a number of readers got in touch to raise methodological complaints about this study, by the University of South Florida Trafficking in Persons Lab, which attempts to estimate the total prevalence of sex-trafficked persons in the state of Florida. While I do not agree with all of these complaints, after following up on them I agree that the study has substantially overestimated the prevalence of trafficking victimization in the state. I have recalculated the study’s topline result to reflect this fact (see footnote 2 for details) and have corrected the section in which it is cited. I have also made more explicit that these numbers are extremely tentative. Trafficking prevalence estimation is a nascent field, and while I believe it is well worth engaging with to give readers a sense of the scale of child trafficking in the United States (in particular as it relates to Meta), accuracy is paramount when discussing anything as important as children’s safety. I sincerely apologize for any confusion the inflated prevalence estimate in the original version of this essay may have caused.
The corrected estimate in no way diminishes the core thesis: that Meta knowingly exposed millions of young users to interactions with predators in order to protect its bottom line.
1. This figure was calculated by dividing the number of victims recruited on Facebook and Instagram from 2019-2023 (215) by the total number of victims for whom a recruitment location is known over the same period; the latter are listed by location in the page’s final paragraph and sum to 949. 215/949 ≈ .226.
2. This figure and the preceding average were calculated as follows, taking each study in turn. Details on how the inflated prevalence estimate from the previous version of this essay was corrected can also be found below.
Florida
Topline result was ~200,000 sex-trafficking victims in Florida in 2024.
This estimate is almost certainly inflated for two reasons:
1) It is based on a trafficking screener (the QYIT/RAFT) previously validated in one study of homeless youth and one study of ER patients, and then administered to the general (adult) population of Florida. Because trafficking is far more prevalent among these two groups (particularly the former) than in the general population, even a screener that accurately flags victims among them will yield mostly false positives, and thus overestimate trafficking, when applied to the population at large.
2) In both validation studies, the screener had a high false-positive rate, but instead of correcting for this, the authors of the Florida study treat all positives (specifically, scores of ≥2 on the screener) as true ones.
I correct for the overestimate as follows:
The rate of true positives, a.k.a. the ‘positive predictive value’ (PPV), of the screener in the ER study—whose participants are more similar to the general population than those in the homeless study—was 10% (it was 57% in the homeless-youth study). In other words, of those with a score of ≥2 on the screener, 10% were actually trafficking victims.
Multiplying the ~200,000 victim figure by the PPV from the ER validation study in order to correct for false positives yields ~20,000 yearly victims.
This correction is far from perfect. For one thing, it is held back by the fact that the study’s authors do not explain how their calculation of 2024 sex-trafficking victimization is derived from their initial estimate of lifetime trafficking victims statewide. Nonetheless, it represents a substantial improvement over their initial estimate, enough of one, in my view, to make the revised figure usable with appropriate caveats.
20,000 / 23,372,200 (Florida’s population in 2024) ≈ .0009, or ~.9 victims per 1,000 people.
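As a quick sketch of that correction (again in Python, for illustration only; the inputs are the study’s topline figure and the ER study’s PPV as described above):

```python
# PPV correction applied to the Florida study's topline estimate.
raw_estimate = 200_000    # study's ~2024 topline: screener positives statewide
ppv = 0.10                # positive predictive value from the ER validation study

corrected = raw_estimate * ppv          # ~20,000 true victims per year
florida_pop_2024 = 23_372_200
rate_per_1000 = corrected / florida_pop_2024 * 1000   # ~0.86, i.e. ~.9 per 1,000

print(f"Corrected Florida victims/year: ~{corrected:,.0f}")
print(f"Rate per 1,000 residents:       ~{rate_per_1000:.2f}")
```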
Note (Dec. 10, 2025): Since making the above correction, I have heard from the study’s main author (Dr. Joan Reid of USF), who described the method for calculating the topline ~200K victims number. In short, this figure is the proportion of survey participants who gave a positive answer to a more elaborate version of question 3 of the QYIT/RAFT (which, roughly, asks whether the individual has received anything of value in exchange for sex, either below or above the age of 18) multiplied by, in turn, the adult population of Florida, the proportion of positive respondents who claimed to be trafficked within the previous 2 years, and finally by one half (to estimate the figure for a single year). While this represents a more targeted method than that employing the proportion of respondents answering positively to ≥ 2 questions on the QYIT/RAFT, it hasn’t been independently validated on smaller samples (unlike the latter method, whose true positive rate we know from the aforementioned ER study). That makes it difficult to assess how much of the ~200K figure may represent false positives. As a result, and given the sensitive nature of these figures, I have opted not to modify my initial correction, even though my communications with Dr. Reid suggest it is excessively conservative.
Sacramento County, CA
This study uses a multiple systems estimation (MSE) method based on aggregating administrative data on known trafficking victims. It is therefore much more likely to under- rather than overestimate prevalence.
Topline result was ~13,079 sex-trafficking victims in Sacramento County from 2015-2020.
This averages to ~2,179 victims per year.
2,179 / 1,540,000 (average population of Sacramento County, 2015-2020) ≈ .001, or ~1 victim per 1,000 people.
National estimate:
~.0009 and ~.001 average to ~.00095.
.00095 × 340,100,000 (U.S. population in 2024) ≈ 323,100 victims per year in the U.S.
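And the Sacramento County figures and combined national estimate, in the same sketch form:

```python
# Sacramento County average and the combined national estimate.
sac_victims = 13_079      # MSE estimate of victims, 2015-2020
years = 6                 # 2015 through 2020, inclusive
sac_per_year = sac_victims / years      # ~2,180 victims per year

sac_avg_pop = 1_540_000   # average county population, 2015-2020
sac_rate = sac_per_year / sac_avg_pop   # ≈ .0014, taken as ~.001 above

national_rate = (0.0009 + 0.001) / 2    # average of the Florida and Sacramento rates
us_pop_2024 = 340_100_000
print(f"National estimate: ~{national_rate * us_pop_2024:,.0f} victims per year")
# ~323,095, rounded to ~323,100 in the text
```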