60 Comments
author

Thank you for the comments and feedback. I wanted to address some of them:

1. As Greg Baer wrote, love your kids unconditionally and make it as safe as possible to tell you anything.

2. If you can, don’t give your kids social media until they are 16. Besides the addiction and other issues well documented here, if you look at the table of harms you’ll see how tragically likely it is that they will have a harmful experience. I waited until my daughter was 14. Had I known then what I know now, I would have waited until she was much older, at least 16 or 17.

3. I’m not recommending content review of messages, or hiring anyone to review message reports. The proposal, detailed in the other notes, is a button to flag a conversation as an unwanted advance. The content does not matter.


Love the idea of a flag button


The flag button you suggest does sound like a good idea to me.


Had you waited though, she would most likely have gone on social media without your knowledge and perhaps even led a double life. Except then she would have had no parents to confide in to process anything harmful or questionable she would have encountered either way. So please be very careful what you wish for.


I worked in IT for many years as an engineer. Banning pornographic content hasn't been a technical knowledge problem for at least 20 years. It has been a political will problem, like this is.

So change the incentives.

Fines and bad PR won't do it. Whistleblowers won't do it. Throw some Silicon Valley tech CEOs in prison for 30 days (a US attorney can find a charge). Indict a major venture capital firm for facilitating sexual trafficking of minors. There will be AI-driven scanning of all uploaded content implemented within 2 weeks, and electronic age verification and cordoning off of minors from adult spaces within 2 more.

I greatly appreciate this former manager's courage for speaking out. But his solutions are woefully inadequate.


I'm an engineer working with AI. Second this. Most solutions proposed by Arturo don't even need computer vision, merely natural language processing for filtering out harassing messages.
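To make the "merely natural language processing" claim concrete: a toy sketch of the keyword-level filtering such a system might start from. The patterns and function names here are invented for illustration; a production system would rely on a trained text classifier with far broader coverage.

```python
import re

# Invented example patterns for illustration only; a real system would
# use a trained classifier, not a hand-written list.
FLAGGED_PATTERNS = [
    r"\bsend (me )?(a )?(pic|photo)s?\b",
    r"\bhow old are you\b",
    r"\bdon'?t tell your parents\b",
]

def is_suspect_message(text: str) -> bool:
    """Flag a message if it matches any known grooming-style pattern."""
    lowered = text.lower()
    return any(re.search(pattern, lowered) for pattern in FLAGGED_PATTERNS)
```

Even this crude approach catches the most blatant phrasing; the point is that the baseline technology is simple and decades old.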


> Banning pornographic content hasn't been a technical knowledge problem for at least 20 years. It has been a political will problem, like this is.

Are you sure? It was far less than 20 years ago when Facebook was publicly humiliated because one of its AI filters censored a highly provocative picture of a bunch of apples. (Maybe they were just too curvaceous?)


Yes. I'm sure. I barely remember the event you describe, but I am 100% certain of this. When I was working in Silicon Valley developing the first generation of public, Internet-based applications, we were already brainstorming strategies then. It can be done by brute force (lots of eyes on staff). It can be done by crowdsourcing (à la CAPTCHA). It can be done by AI. We were fooling around with automated pornographic-image detectors in 1999, long before "deep learning" systems became common.

In this particular case, the larger issue comes down to design. Given real age verification, you can fence off the adult and teen worlds from each other. Then patrolling the few boundaries between them is very manageable.


Only rarely do I find these words erupting from my lips: Are you kidding? Really? The brilliant solution to the now-recognized epidemic of unwanted sexual solicitation or exposure to a minor is to create a way to report every instance to the social media platform? Again, are you kidding? And more government involvement? Do the math. Instagram alone would require tens of thousands of full-time monitors to adequately respond to every report. And respond how? They would be required to provide counselors for every victim and perpetrator? Really?

No, this is all silly. It’s past absurd. The responsibility here is with parents. Period. Jon Haidt has talked about the need for more and earlier independence, free play, and responsibility in the real world. That is a partial solution which presumes—unreasonably—that parents are capable of providing all that. I repeat: parents are the solution, but they have to be prepared to give their children the unconditional love—love without disappointment and irritation—that children must have before they are emotionally free to experience joy and to express who they are.

So what are the real solutions, simply expressed?

1. No phones or social media at all—no unobserved use of the Internet at all—for any minor child living at home. The kids don’t need it.

2. Teach parents how to unconditionally love their children, which very few parents can do. No indictment here, they just don’t know how. Controlling children or enabling them is not loving them. And parents don’t have to figure out unconditional love and guidance on their own. Just go to the free and agenda-free websites RealLove.com and RealLoveParents.com. I have nothing to sell. But I do offer thirty—30—years of intense experience with teaching parents and children all over the world. It’s love they need, not social media or phones or indulgence or entertainment. It’s loving and teaching, and it works.

May 6·edited May 6

Just don't allow anyone under 16 on Instagram, and fine the company $15,000 each time they let someone in, the same way you stop underage drinking. These companies allow it because it gets them engagement. Or something similar.

Maybe have all computers and phones carry an identifier saying, by default, that the user is under 16 (or whatever age), and give adults an extra identifier for their computers, phones, and accounts: something that can be checked. To get adult access, an adult would have to request it, with ID. Have the social media companies check for it, refuse those who are under age, and pay consequences if they nonetheless try to sell their engagement to a person whose device has not been identified as an adult's.
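Purely as a sketch of the checking logic proposed here: devices default to "not adult-verified," and the platform owes a fine whenever it admits an unverified one. The $15,000 figure comes from the comment above; the function and variable names are invented for illustration.

```python
# Sketch of the proposed rule, not any real platform's policy:
# devices default to unverified, and admitting one costs the platform.
FINE_PER_VIOLATION = 15_000  # dollar figure suggested in the comment

def fine_owed(adult_verified: bool, platform_admitted: bool) -> int:
    """Return the fine the platform owes for one signup decision."""
    if platform_admitted and not adult_verified:
        return FINE_PER_VIOLATION
    return 0
```

The design choice is that liability attaches to the platform's decision to admit, not to the child or the parent, which is what makes the incentive enforceable against a small number of companies.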

Parents cannot stand over their kids 24/7; I don't understand how you think they are going to stop kids from following suggestions fine-tuned to get engagement. Their homework is online. Will you watch all your children while they do their homework for several hours a night to make sure they don't switch screens? Some sites even have a button to push so you can hide what you're looking at from your parents.


> Do the math. Instagram alone would require tens of thousands of full-time monitors to adequately respond to every report. And respond how?

Remember Bernie Sanders' controversial remark that "too big to fail" is too big to exist?

Allow me to posit a slightly less controversial standard, that really ought to be regarded as simple common sense, but probably won't be: Too big to *succeed* is too big to exist.


No, the responsibility lies with society, collectively keeping pedophiles from getting in touch with children.


Though I applaud the whistleblower for his courage, his solutions are those of a leftie ignorant of economics.

Just sue the hell out of the company for every instance of abuse left unaddressed. That should do it. Make punitive damages prohibitively high.

Just to be sure, allow lawsuits against the government officials responsible for public safety.

Put that sheriff from FL in charge of a tech-monitoring unit.


It won't work. These companies rake in too much money.

Throw a few tech CEOs in prison for a few weeks for facilitating child trafficking...

Get a US attorney to build a "ham sandwich" indictment against the Meta executive team...

That will get changes quickly.

May 6·edited May 6

What about Taliban justice? Would a few public stonings do the trick? 😁


Agreed, remove the problem at the root - which is the addictive blue-lit black box of wireless radiation called a "phone"


Throwing it all on parents is unfair. Parents have a right to government policies that will help them raise functional kids. This is literally the first job of every human society: produce, raise, and acculturate the next generation to be capable of doing the same. Whatever happened to Hillary's "it takes a village to raise a child" idea?

It's parents' job, yes. But it's the state's job not to tie both their hands behind their backs. Treat smartphones and social media like cigarettes in law.


Your quixotic and arbitrary "solution" is unrealistic and violates their civil and human rights. Especially after 18, when they are no longer "minors". They would be "adult children living at home", unless you plan on throwing them out on their 18th birthday.


No phones at all until they leave home as adults. That takes care of phones. Homework on a screen? Easy. They use screens only in a common room, with the screen visible to everyone. You can't stop everything, to be sure, but all of the above will stop everything that can be stopped. No school, government, or social media platform regulation will work.


The vast majority of ADULTS probably shouldn't have smartphones. It's not just kids. These things are addictive and distracting. Most human beings cannot handle having a pipe to all the world's information (including every seedy, dark alley) in their pocket 24x7.

I had a smartphone for many years. I finally cut the data plan way down (100 MB/month) so I could use it for quick lookups but not for surfing. Three years ago, after watching Citizenfour, the documentary about Edward Snowden, I bought a non-Android flip phone when my old phone died.


Indeed, those who want to ban them for teens and young adults should lead by example and ditch their own phones first, lest they be flaming hypocrites.

May 6·edited May 6

Parents play a clear role. Social media companies play a clear role. Of course, Congress plays a clear role. And.... all 3 have, largely, abdicated their role.

I wonder why the phone manufacturers and providers escape judgment in this situation? Those phones are bought, and are designed with chips and capabilities that can be limited at the point of design and sale.

Personally.... a law to prevent the sale of a 'smart'/social-media-enabled phone to be used by anyone under a certain age seems reasonable. Simple flip phones only until of age. BIG fine for parents, retailers, etc. otherwise. BIG fine for social media companies that allow these coded devices on their networks.

Would take a lot of pressure off parents. "Blame your congressman, sweetheart." :)


What age exactly? How big a fine? And what should be the penalty for the young people themselves, or perhaps those adults or slightly older peers who temporarily furnish (lend) phones to them? Or has the Overton window not shifted that far yet?

May 7·edited May 7

As you know (you type it often), extremes across the Overton window are ‘Unthinkable.’ One extreme is ignoring the clear evidence and doing nothing.

My point was that much (not all) of this might be dealt with by legislation aimed at the source (manufacturers through age coding and retailers). I suggested a "big" fine, the amount of which is above my pay grade. $20,000 per violation? In reality, no number is high enough to solve the whole problem; a fine (traffic, assault, drugs, etc) merely helps to deter the more honest and moral of us.

As far as the kids....

The old joke goes, the farmer hired an elephant to guard his peanuts, but the elephant ate all the peanuts. After all, that's what elephants do. So don't get mad at the elephant, get mad at the farmer who hired the elephant.

My focus is on the farmers (parents, retailers, manufacturers, social media....).

Plus.... the kids are already fully paying for their obsession with social media and smartphones. They will be paying throughout their time on this flat planet. But if you/others think they should pay a fine, then so be it. Personally.....

I find dealing with the limited number of manufacturers and retailers a better, more policeable solution.


I personally don't think the kids themselves should pay any penalty, but rather I was anticipating someone else in the near future advocating something like that eventually, per the...wait for it...Overton window (yet again).

Thanks for bringing up the elephant and peanuts joke, by the way 😊


“No social media before 16”? Why not, “no church before 16,” “no Scouts before 16,” and “no families before 16”?

Yes, that’s sarcastic, to point out that the “protect kids” outrage against Facebook, Instagram, books, etc., fulminates against sexual solicitations, messages, and images on screens or in libraries, while ignoring REAL, substantiated sexual abuses and rapes of children and teenagers by churches, schools, youth programs, and (especially) families.

The strangely muted lack of concern over real violence against youth has been sublimated by the massive crusade to “protect kids” from virtual “messages.”

No one should receive unwanted sexual messaging, but that’s nothing compared to actually BEING violently abused and/or raped.

The Administration on Children and Families substantiates 60,000 sexual abuses and rapes of children and youth by parents and caretakers every year – 1,100 a week, a fraction of what actually occurs. The huge Catholic Church admits 5% of its personnel are abusers, and other churches also have scandals. The Boy Scouts estimates many thousands of its leaders were abusers.

Schools and sports programs, up to universities and the Olympics, have abuse epidemics. Even in sensational news stories, far, far more reports of real sexual abuses of children and youth are tied to schools than to very rare cases involving social media. In each case – like Facebook – real-world institutions’ leaders ignored and stonewalled efforts to investigate their abuses.

Social media, school, dating, and real-world victimizations are strongly linked and rooted in parent-adult-inflicted abuses – a serious, ignored issue. The CDC’s 2021 ABES survey shows that the 55% of teenagers who report emotional and/or violent abuses by parents and household grownups are 4 times more likely also to be bullied online, 5 times more likely to suffer dating violence, 7 times more likely to be raped, and 7 times more likely to suffer school violence.

Yes, online sexual harassments and unwanted messaging should be stopped. But the real story is how skilled teenagers are at protecting themselves from real-world violent and sexual victimizations amid the millions of virtual solicitations surveys estimate they receive.

Teens don’t need to be banned from social media or subjected to more parental controls to “protect” them; they need social media access to protect themselves from the vastly more damaging real-world violence grownups inflict that we are not even beginning to protect them from.


And don't for a minute think they will stop at 16. Once the Overton window shifts in their favored direction, it will become 18, then 21, then God only knows how high. All based on the modern-day phrenology of pop neuroscience. See the comments on this very article, for example.


Well-said as usual, Mike! You always know how to cut through the crap and see the forest for the trees. Thanks 😊


As a child of the 90s, the fact that people are still surprised by pedophiles sexually harassing children on the Internet is surprising to me. Remember Omegle?

Come on guys, have you not used the Internet yourself? Pedophiles have been sexually harassing children on the Internet for at least 20 years now. They just recently discovered Instagram where children congregate, as any pedophile with a brain cell would. Why is everyone so surprised by this?

In terms of solutions, some very simple natural language processing and age verification would do. Social media companies aren't doing them because, I suspect, facilitating communication between pedophiles and children is profitable for them.


> Within the space of a typical week, 1 in 8 adolescents aged 13 to 15 years old experience an unwanted sexual advance on Instagram.

Assuming that the advances are distributed completely randomly, (which is probably not in fact true, but provides a decent starting point for some simple heuristics,) basic mathematics suggests that over the course of 8 weeks, about 65% of young teens receive such solicitations, and in a year, 99.9%.
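The compounding above can be checked in a few lines, under the same (admittedly simplifying) assumption of independent, identically distributed weeks at p = 1/8:

```python
# Probability of at least one unwanted advance in n weeks,
# assuming independent weeks with p = 1/8 per week.
P_WEEK = 1 / 8

def p_at_least_once(n_weeks: int) -> float:
    return 1 - (1 - P_WEEK) ** n_weeks

# 8 weeks -> ~0.656 (about 65%); 52 weeks -> ~0.999 (99.9%)
```

Since real advances likely cluster on particular accounts rather than being spread randomly, the true year-long figure would be lower than 99.9%, but the order of magnitude is still alarming.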


When parents at my homeschool coop ask me (the resident nerd) when their kids should get a smartphone, my answer is always the same: "whatever age you think they are ready for hardcore pornography."


Lol, I'm going to use this answer.


Parents consistently talk about what happens to their children online as if they aren't the ones who allowed them online in the first place. Letting a kid use the internet by him/herself is the problem, not the fact that there are bad people out there. The latter has always been true; the innovation here is parents letting their kids do whatever because they don't understand what the internet is and they have been brainwashed into thinking that they are keeping their kids "safe" by keeping them at home.

The obvious truth is there is no way to make social media "safe" for young people -- but even that realization does not go far enough. The missing piece here is that we no longer even ask the question "what is social media good for?" There is nothing of genuine value in it -- and for most people, it even flunks the modern-day substitute for the questions about virtue, "does it feel good?"

There are no solutions -- only trade-offs. Meaning, the way to make Instagram safe is to delete Instagram.


Thank you for sharing.

I share your concern about protecting free speech, but I think the word “censorship” needs to be more clearly defined.

There’s direct censorship - as in government dictating what we can or can’t read, hear, or see.

When it’s good it protects children, who lack educated judgment, from being exploited or taken advantage of. When it’s bad, it prevents all adults, who are legally responsible for their personal decisions, from reading, hearing or seeing information.

But then there’s indirect censorship by businesses that control information and can reach many people at once. When it’s good, it prevents malicious content from being spread, because we all know that’s what people share the most and the fastest. When it’s bad, it amplifies the malicious content so that the countering information doesn’t get through. To maximize profits, social media algorithms are designed to advantage content that people will click on. Research shows that negative content is more likely to be clicked. So aren’t social media algorithms indirectly censoring and violating the free speech of those trying to defend themselves or their ideas from negative attack?


There should be different standards for the speech happening between pedophiles and minors. Just like there are different standards for adult porn and sexually exploitative material of children.


Yes. The same standards as legacy media which Social Media is exempted from.



And of course schools have all the work online, so unless you are standing over your child's shoulder and watching them nonstop, it is very hard to know when they are looking at content like this.

They can have lots of windows open and switch back and forth very quickly.


> “I have often observed Meta CEO Mark Zuckerberg and his managers try to change the conversation to the things they measure. If the problems identified are not problems that the company’s systems are designed to detect and measure,”

Many in Congress and on the Supreme Court lack the real-world experience to understand that they could design policy to motivate better behavior. For example, if social media companies were liable for damages, they would minimize their risk by enforcing age restrictions in their terms of use, designing their algorithms to filter malicious content, and collecting the data to prove it in court.


Why not remove the problem at the root: take away the phone and let children use a desktop instead? This minimizes the casino "scroll" effect, along with the magnetic field from the phone that can put children in a trance by stimulating the right TPJ region of the brain:

https://romanshapoval.substack.com/i/142911438/magnetic-fields-change-our-morals


The root of the problem is pedophiles, so why don't we remove the pedophiles from our society? Do you remove all children from grocery stores because sometimes there are also pedophiles shopping for groceries?


A little *snip snip* goes a long way, if you catch my drift 😉


I hear you, but you can't eat your phone, and a cell phone destroys, harms, and maims, not like a bag of groceries, but like a pedo. Wi-Fi baby monitors are a prime example of opening ourselves up to pedos. https://romanshapoval.substack.com/p/babymonitor


That's really reaching IMHO.


Thanks for your post, Arturo. It was as frightening as it was educational. As I read the part about the responsibilities of SM companies, it reminded me of Molly Russell's father's recent article in the Guardian, where he argues that banning phones from schools may cause more harm than good. Personally, I'm not so sure. He believes, however, that the real problem lies with SM companies who need to get their house in order to better protect our children - one of the points you made. I was curious about your thoughts on banning phones in schools. Personally, I think it's a complex issue that needs to be tackled from several angles. Social norms on SM platforms, delaying access until high school, phones in lockers during the school day, regular learning experiences around digital literacy, parent workshops, and, most importantly, the fostering of trusting relationships between children and their parents that you spoke of.


Not to mention the carcinogenic wireless radiation emitted by these devices:

https://romanshapoval.substack.com/p/techmyth


The harms just keep multiplying. I hope we can get a handle on all of this soon.


How about we declare a state of emergency and "quarantine" all social media for "just two weeks"?
