74 Comments
Brian Lenney

They knew. They always knew.

Remember when I told you Meta was funding "parent advocacy" groups to blame Apple and Google for problems Meta created? Remember when I explained how the Digital Childhood Alliance was taking Meta's money while refusing to hold the actual source of harm accountable? (https://open.substack.com/pub/jessicareedkraus/p/how-meta-funded-mom-groups-teach)

Here's your receipt.

Parents, you've been played. The groups claiming to advocate for your children while taking Meta's money have been running interference for one of the most sophisticated child exploitation operations in human history. They wanted you focused on "app store accountability" while Meta turned your daughters into products for predators.

Big Tech is failing our children, and now we know Meta knew this was happening for years. What we need is real platform accountability combined with parents who refuse to hand unlimited smartphone access to children who aren't ready for it.

Stop asking app stores to fix Meta's problems.

Demand that Meta fix Meta's problems.

Better yet, take the damn phone away until these companies prove they can protect children instead of serving them up to predators for profit.

Robert Hunter

Social media can't be fixed; it's profit-driven and will always maximize profits at your expense. Purely mind control of the worst kind.

mathew

Kids shouldn't have smartphones or social media.

They shouldn't get either until 18.

Rhymes With "Brass Seagull"

Well, the genie is out of the bottle now. Harm reduction makes far more sense than prohibition.

mathew

It is very possible to take smartphones or tablets away from kids.

Our kids used to have tablets. Now they don't.

Christina Dinur

I agree with 18+ for social media and smartphones. I hope this becomes the norm.

We also took away the tablets after previously being fairly lenient on screens. It's been an incredibly positive change in our household.

EyesOpen

Thank you for investigating and reporting on this important issue. Meta has gone dark and dangerous. Perhaps it is time to consider shutting it down if it cannot protect children and vulnerable adults.

Robert Hunter

It's a mind toxin by nature and it can't be fixed.

Rhymes With "Brass Seagull"

Indeed, the time for half-measures was ten years ago. We need to do a safety recall ASAP and QUARANTINE these platforms until they can be made safer for everyone. Then lock up Zuckerberg and all of his associates and sycophantic lackeys.

Christina Dinur

"The filing also reveals that Meta paid the National PTA and Scholastic six-figure sums to conduct outreach on its behalf, a choice motivated by the perception, in the company’s own words, that these organizations could ‘get [their] materials into the hands of parents, grandparents, and educators at scale.’”

To me this is the most upsetting part: that our nation's oldest "child advocacy" org (National PTA) continues to take money from Meta, promote Meta products to kids and teens, and actively help whitewash Meta's reputation in the public eye. Last spring, Smartphone Free Childhood US sent an open letter to National PTA leadership, signed by hundreds of PTA members, medical professionals, education experts, parents, and advocates, urging them to end this sponsorship. PTA was not open to the idea at that time, but I can only hope that as more awful news comes out about Meta, they will finally do the right thing and end this disgraceful partnership.

Brooke

What this article describes is disturbing — but I think we’re still missing the deeper diagnosis. We keep treating symptoms as if they are causes, and so we’re surprised when the same crises appear again in new forms, on new platforms, under new corporate logos.

The uncomfortable truth is this: problems like these don’t emerge from “bad platforms” alone. They emerge from the society we’ve built — one increasingly defined by fear, fragmentation, and a breakdown of intergenerational connection. And layered over all of this is an economic system that rewards disconnection. Our dominant institutions — especially large tech and media companies — have a single overriding mandate: **increase shareholder value at all costs**. Social, environmental, and community wellbeing enter the equation only when legislation forces them to, and even then those requirements are resisted, litigated, or minimised.

When you build a civilisation where corporate success depends on maximising engagement, extracting attention, and keeping people isolated but online, it shouldn’t surprise us that the social fabric frays.

Forty years ago, the streets of most suburbs were full of children. Kids rode bikes in packs, played cricket in cul-de-sacs, and moved between houses without anyone tracking their steps. Adults talked across fences, neighbours were extended family, and intergenerational warmth was a normal part of daily life. Now our streets are empty. Children are kept indoors, supervised, scheduled, and shielded. Adults, terrified of moral panic or community judgment, keep their distance.

We frame this as “safety,” but the result is isolation for everyone — children and adults alike.

A void like that doesn’t remain empty. It gets filled by whatever can reach through the cracks: algorithms, parasocial contact, corporate-mediated social life, and a digital world optimised for attention rather than belonging. That includes harmful actors, but it also includes the simple and heartbreaking truth that millions of people now try to meet basic emotional needs — for connection, closeness, visibility — online, because they have nowhere else to put them.

And then we blame the platforms for showing us the world *we created*.

The real crisis here is not just technological. It’s sociological. It’s civilisational. We built a society where parents are terrified, children are hidden, neighbours barely know each other, and any intergenerational warmth is treated with suspicion. We cannot atomise a population and then be shocked when predatory behaviour, despair, addiction, or extremism find fertile ground.

If we genuinely want to address the problems outlined in the article, we have to rebuild the social structures that once made communities resilient.

That means:

* **reviving communal living**, where food, tools, gardens, and everyday labour are shared

* **restoring intergenerational contact**, the foundation of social learning and emotional development

* **designing suburbs and public spaces for people, not cars**

* **reducing economic pressures that keep adults overworked and children indoors**

* **breaking the capitalist feedback loop** that profits from fear, isolation, and endless growth at any cost

* **moving from punitive systems to community-centred social management**

These changes aren’t utopian; they’re simply the opposite of the conditions that produce the crises we keep reading about. As long as we sustain a civilisation that maximises disconnection, maximises fear, and minimises community, the same problems will keep resurfacing no matter what regulations or detection tools Silicon Valley deploys.

If we want to stop the harms described, we have to stop pretending this is just a tech problem.

It’s a society problem.

And until we rebuild the social world around children — and around each other — we will continue treating symptoms while the underlying causes grow deeper roots.

I’ve written similar discussions around age verification, pornography, moral panic, and the broader civilisational forces driving these crises.

See my growing work at *The System Is Broken*:

**[https://thesystemisbroken.substack.com](https://thesystemisbroken.substack.com)**

Rhymes With "Brass Seagull"

Alas, such fundamental changes to society at large will likely take many generations to accomplish, and will have to happen organically rather than be forced from the top down. Until then, in an echo of the late James Q. Wilson, non-root causes can and should be seen as at LEAST as important as root causes in the short to medium term. That's not to say I support broad-stroke bans or mandatory age verification (for the record, I vehemently oppose both). But we certainly need MUCH better guardrails on Big Tech than we have now for the time being IMHO.

Stosh Wychulus

The Zucker revealed who he was long ago when he posted pictures of freshman girls at Harvard online to be rated on their appearance. He is one of many high-functioning sociopathic oligarchs who are infecting the country.

Bala Subramanian

All ratings lead to bias and infringement of freedom. Isn’t that why, in the Bible, the creation is declared to be “good” and nothing else?

Christine Paquette

Of course they knew. When there is no conscience, no accountability, and no oversight, evil will do what evil will do: prey and profit.

Rhymes With "Brass Seagull"

Indeed, Meta is literally run by cold-blooded psychopaths, sociopaths, and malignant narcissists. All the way to the very top.

Helen Morris

On 10 December (one week from now), Australian law bans access to social media, YouTube, etc. for children under 16, with proof-of-age mechanisms and so on. It won't be perfectly foolproof, but at least it's a courageous national government initiative doing what we elect our government to do: care for its children.

Rhymes With "Brass Seagull"

I still think that such a thing throws out the proverbial baby with the bathwater, and will do more harm than good on balance.

Steve K

I know that for every action there are unintended consequences. I am seriously interested: what harm will occur?

MH

Mark Zuckerberg and his growth team are truly disgusting. Everyone should close their accounts, permanently. The crazy thing is he has three young daughters of his own, but I guess as long as it doesn't affect his kids, it's all systems go.

Conni Jespersen

Thank you, thank you for posting about this. You are truly needed in this social media era.

Frank Dee

It’s all for the love of money and I’m afraid it will always be.

Rhymes With "Brass Seagull"

Indeed. The love of money is the root of all evil.

Robc

It's Meta, Discord, Roblox -- time for the Feds to start subpoenaing all of their executives' tech devices and bank accounts and see if they track to traffickers or CSAM rings.

Roblox: https://hindenburgresearch.com/roblox/

Digital Hygiene Coach

It's curious Australia didn't go after Discord and Roblox...

Meghan

Folks, please contact the National PTA and ask why they are partnering with a sex trafficking facilitation company. Then head over to their Instagram, where they have yet another call for us parents to organize community online safety meetings, a.k.a. paid ads sponsored by Meta. Their endorsement of Meta is beyond outrageous.

Sustainability on the Inside

In Australia we are days away from a national ban on social media access by children under 16. I had reservations about it, given how important it is for social connection when kids are shy, but reading this I am now 100% supportive!

Rhymes With "Brass Seagull"

I still think this ban will do far more harm than good on balance.

Zandra Pretorius Gotteberg

Perhaps it is time for judges in sex trafficking and child abuse cases to also demand reparations to the victims when the crimes were connected via social media accounts. Holding social media platforms accountable for partaking in or allowing criminal activity with minors might be more effective than hoping for responsible citizenship from these companies.

Digital Hygiene Coach

And for the EU to stop letting Big Tech co-author, dilute, and delay child protection policies.

adam k kocinski

https://www.samharris.org/podcasts/making-sense-episodes/213-worst-epidemic

A link to an interview with Gabriel Dance, a New York Times investigative journalist who in 2019 published a series of articles on child sexual abuse material and its availability on platforms like Instagram and Facebook.

Chris

I don’t understand why they can’t be classed as publishers and made liable for what’s sent on their network. Yes, it would probably destroy their businesses but so what? They seem to do more harm than good.

Here in the UK, if I were caught with child pornography on my phone, I would probably face criminal charges. Facebook etc. have such material on their networks. What's the difference?

Rhymes With "Brass Seagull"

Even in the USA, we have had FOSTA/SESTA on the books since 2018. That alone should have been enough to stop them, if it had worked (and been enforced) as intended.