Congress Must Pass KOSA by Christmas. The Only Obstacle Seems to Be Meta’s Influence
Now that Elon and X have addressed the last free speech concerns, Congress should pass the bill
Imagine the following situation: There is a toy that kills dozens or hundreds of children every year1 and harms millions more.2 The toymaker makes the toy available for free to any child who can reach the Internet, so parents cannot stop their children from playing with the toy unless they tightly monitor and control their children’s access to the Internet, even at school. Because parents feel so powerless,3 the toy becomes their greatest fear,4 and the vast majority5 want the government to compel the toymaker to remove a few of the toy’s most dangerous features. Year after year, the government does nothing.
After more than a decade of mounting evidence of harm and rising parental concerns, Congress finally acts in a stunningly bipartisan fashion to craft a modest bill that would remove a few (just a few) of the toy’s most dangerous features, without restricting children’s continued access to the toy. The bill is designed and modified carefully over several years and many hearings to take into account every conceivable objection from the left and from the right. It passes the Senate by a landslide vote of 91 to 3 and is then sent to the House of Representatives, where it also enjoys strong bipartisan support. And then, after all that work and all that support, the House leadership kills the bill without giving any believable justification.
This situation would be a travesty of democracy and common sense, and yet it is exactly what is happening with social media and the Kids Online Safety Act (KOSA), which now has only one week left to be enacted. Speaker Mike Johnson said that he is killing the bill because he still has free speech concerns, but as we’ll show, this objection is not grounded in reality.
Industrial Scale Harm Committed by Snap, ByteDance, and Meta
For documentation of our claim that social media is as harmful as the hypothetical toy described above, you can see the footnotes and our many essays on After Babel. We’ll also add a few more facts here to establish that we are not exaggerating the harm or the need for greater protection of children and adolescents.
We know from the briefs filed by many attorneys general, which reveal internal communications from social media companies, that Snap receives 10,000 reports of sextortion per month. As reported in State of New Mexico v. Snap Inc.,
Snap was specifically aware, but failed to warn children and parents, of ‘rampant’ and ‘massive’ sextortion on its platform–a problem so grave that it drives children facing merciless and relentless blackmail demands or disclosure of intimate images to their families and friends to suicide. Snap trust and safety employees acknowledged the ‘psychological impact’ of sextortion on its victims ‘especially when those victims are minor.’ By November 2022, Snap employees were discussing 10,000 user reports of sextortion each month, while acknowledging that these reports ‘likely represent a small fraction of this abuse’ given the shame and other barriers to reporting.6
The shame and fear inflicted on each of these young victims is a huge cost in itself, and that cost climbs far higher when we consider that some of the victims then end their own lives. An FBI analysis investigated 12,600 cases of online sextortion reported between October 2021 and March 2023 and identified 20 deaths by suicide directly linked to these cases.7 Those teens would still be alive if not for a product that, by design, easily connects children with adult strangers, via disappearing photos, with few safeguards. In addition, the 2023 Federal Human Trafficking Report identified Snapchat as the leading recruitment platform for sex trafficking victims.
Snapchat has also become a major marketplace for illicit drugs, compounding the teen opioid crisis. Snapchat’s internal researchers found that at least 700,000 Snapchatters are exposed to drug content every single day, “in the areas that we scanned” — and that “some teens have even died as a result of buying drugs that they found through Snapchat.”8
We know from the work of those attorneys general, and from investigative journalists, that TikTok is also harming children at an industrial scale, and its executives are well aware of it. We are fortunate that one of the briefs (Kentucky v. TikTok) was posted to the internet with many sections redacted, but redacted improperly, so that the hidden text could be copied out by anyone from behind the black bars. (We have done so in this PDF file, which shows all of the hidden text.)
Let’s examine a few quotations from executives at ByteDance, which owns TikTok. One executive explained that “The product in itself has baked into it compulsive use.”9 Another stated, “The reason kids watch TikTok is because the algo[rithm] is really good. . . . But I think we need to be cognizant of what it might mean for other opportunities. And when I say other opportunities, I literally mean sleep, and eating, and moving around the room, and looking at somebody in the eyes.”10 Internal documents show that they know that teens are especially susceptible to compulsive use, with one internal report stating that minor users are “particularly sensitive to reinforcement in the form of social award,” have “minimal ability to self-regulate effectively,” and “do not have executive function to control their screen time.”11
Internal research from TikTok also reveals that the company has failed to prevent highly inappropriate content from reaching users. For example, its researchers found that 35.71% of “Normalization of Pedophilia” content, 33.33% of “Minor Sexual Solicitation” content, 39.13% of “Minor Physical Abuse” content, 30.36% of “Leading Minors Off Platform” content, 50% of “Glorification of Minor Sexual Assault” content, and 100% of “Fetishizing Minors” content was missed by TikTok’s content moderation process.12
We know from many whistleblowers and journalists that Meta is also harming children at an industrial scale, primarily through Instagram. Arturo Bejar, a whistleblower and former Senior Engineer at Instagram, revealed internal research showing that on Instagram, about 20% of 13- to 15-year-olds say they were the target of bullying in the past seven days. Bejar also revealed that 13% of 13- to 15-year-olds said that they had received unwanted sexual advances… in the past seven days. Bejar describes these findings like this: “Instagram hosts the largest-scale sexual harassment of teens to have ever happened.”
Other internal research, brought out by whistleblower Frances Haugen, found that in or around 2021, 6% of teen girls in the U.S. and 13% of teen girls in the U.K. “traced their desire to self-harm/commit suicide to Instagram.” One in eight Instagram users told Meta’s researchers that they thought the platform made thoughts of suicide or self-injury worse. Not to mention the widely known findings that Instagram “makes body image issues worse for one in three teen girls,” and that one in five teens say that Instagram makes them feel worse about themselves.13
There is an active debate among researchers about whether social media use, in general, is a cause (versus a mere correlate) of the rising levels of internalizing disorders (e.g., anxiety and depression) at the “population level” (meaning: those hockey-stick graphs of rising anxiety and depression in the early 2010s that we showed in Chapter 1 of The Anxious Generation). But when we look at the individual level, as Surgeon General Vivek Murthy has urged, the direct harm to specific children cannot be denied. Not one of the hundreds of parents whose sons and daughters died by suicide within a few days of being sextorted has mistaken correlation for causation.
It is the heroic parents of these dead and injured children who are the driving force behind KOSA.
KOSA is Not a Threat to Free Speech
KOSA is a straightforward bill that puts in place a few protections for kids online (the bill defines minors as those under age 17). Its key features include: 1) setting the strongest privacy settings for kids by default, 2) restricting addictive product features and personalized recommendation algorithms for minors, and 3) mandating that companies remove design features for minors that are known to contribute to suicide, eating disorders, substance abuse, and sexual exploitation, or that advertise certain products that are illegal for minors (e.g., tobacco and alcohol).
This third protection has been the most contested. Opponents have raised concerns that the government will use the “duty of care” as a way to censor political or ideological content that a particular administration does not like. But this fear is unfounded: KOSA does not limit free speech, and it does not regulate content.
Here is the provision in the bill that specifically outlines this point:
Nothing in subsection (a) shall be construed to require a covered platform to prevent or preclude any minor from (A) deliberately and independently searching for, or specifically requesting, content; or (B) accessing resources and information regarding the prevention or mitigation of the harms described in subsection (a).14
In other words: KOSA places absolutely no restrictions on what any person can say on a social media platform—even if they want to say horrific things that promote eating disorders or suicide. KOSA also places no restrictions on what any child can search for—even if they are interested in finding horrific stuff about eating disorders or suicide. KOSA merely says, for the first time, that the platforms bear some responsibility for the content-neutral design choices they make. There are therefore no implications for free speech.15
But some opponents of regulation continued to say that KOSA could, conceivably, down the road, maybe pose a threat to free speech. That’s why Elon Musk recently got involved. Musk is among the most ardent free speech advocates in the world. He owns a platform that will be covered by KOSA. He hired a CEO (Linda Yaccarino) who had been a critic of social media, and who seems genuinely to care about children’s safety. In late November, Yaccarino and others from X engaged in negotiations with the two Senate co-sponsors, Richard Blumenthal and Marsha Blackburn, to add language that specifically says that KOSA cannot be used to censor anyone or to expose the companies to liability for what anyone posts. The new provision states:
Nothing in this section shall be construed to allow a government entity to enforce subsection (a) based upon the viewpoint of users expressed by or through any speech, expression, or information protected by the First Amendment to the Constitution of the United States.
Yaccarino then posted on X her report on the negotiations, which is worth quoting in full:
At X, protecting our children is our top priority. As I’ve always said, freedom of speech and safety can and must coexist. And as a mother, it’s personal.
When X testified before the Senate Judiciary Committee last January, we committed to working with Congress on child safety legislation. We’ve heard the pleas of parents and youth advocates who seek sensible guardrails across online platforms, and the Kids Online Safety Act (KOSA) addresses that need.
After working with the bill authors, I’m proud to share that we’ve made progress to further protect freedom of speech while maintaining safety for minors online. Thank you to @MarshaBlackburn and @SenBlumenthal for your leadership, dedication and collaboration on this issue and landmark legislation.
We urge Congress and the House to pass the Kids Online Safety Act this year.
Musk immediately posted his approval of Yaccarino’s post, adding, “Protecting kids should always be priority #1.”
With Musk and Yaccarino now backing the revised version of KOSA, many other prominent Republicans, including Donald Trump Jr. and Sarah Huckabee Sanders, have been standing up for KOSA and calling for its passage. As Senators Blackburn and Blumenthal put it, “These changes should eliminate once and for all the false narrative that this bill would be weaponized by unelected bureaucrats to censor Americans.”
Speaker Johnson’s expressed concerns therefore no longer apply to KOSA. Thanks to Musk and X, the bill’s free speech protections are as explicit as language can make them.
We cannot know Speaker Johnson’s real reasons for blocking KOSA, but it may not be a coincidence that Meta, together with ByteDance, spent more than $200,000 a day on lobbying in the first half of 2024 to block KOSA. Meta also recently announced that it will spend ten billion dollars to build an AI facility in Louisiana, the home state of both Speaker Johnson and House Majority Leader Steve Scalise.
Pass KOSA by Christmas
Indisputable harm is happening to children at an industrial scale, reaching literally millions of them. KOSA is a bipartisan bill that would begin to address those harms. Elon Musk and Linda Yaccarino stepped in to enshrine free speech protections in explicit language in the bill. Many tech companies now support KOSA. So did 91 Senators. So do leading Republicans and Democrats. So do most parents.
It’s time to pass KOSA. The clock is about to run out. The objections have been addressed. Pass the bill now.
See the FBI’s recent analysis showing a significant rise in online sextortion, primarily targeting teen boys. From October 2021 to March 2023, there were 12,600 reported victims, with 20 deaths by suicide directly linked to these cases. A recently unredacted Snapchat brief revealed that the company knew there were far more cases of sextortion than that (around 10,000 every month) happening on Snapchat alone.
Innumerable statistics show this level of harm from social media. From Instagram alone, it has been reported that 32% of teen girls feel worse about their bodies due to Instagram (see our collaborative review document Industrial Scale Harms of Social Media Platforms, section 1.5.1); 66% of teen girls and 40% of teen boys experience negative social comparison on social media (section 1.6.1); 21.8% of 13- to 15-year-olds said they were targets of bullying on Instagram in a seven-day period, and 13% of 13- to 15-year-olds said they received unwanted sexual advances on Instagram in the same period (sections 1.7.1 and 1.2.1); half of Instagram users of all ages, and more than 50% of teenagers, had a negative experience on Instagram in a seven-day period (section 1.9.1); and 70% of U.S. teenage girls see content associated with more negative appearance comparison (NAC), per Meta’s own research (section 1.5.5).
See the Surgeon General’s advisory on parental stress and his call for warning labels on social media platforms, referencing the powerlessness that many parents feel today around their children’s use of digital technologies.
A 2023 C.S. Mott Poll of 2,100 U.S. parents found that parents’ top two child health concerns were overuse of devices/screen time (67%) and social media (66%).
87% of Republicans and 88% of Democrats favor KOSA, according to August 2024 polling. In fact, the poll found that three out of four voters (76%) are more likely to vote for a U.S. representative who supports KOSA.
See New Mexico Brief, page 4, para 7.
Additional note (from New Mexico v. Snap, pages 23-24, para 67): In January 2024, the Network Contagion Research Institute (NCRI) published its Threat Intelligence Report, which warned of an “exponential increase in sextortion cases targeting minors and youth on social media platforms over the past 18 months.” During this period, the FBI reported a 1,000% increase in financial sextortion incidents, while NCMEC reported a 7,200% increase in financial sextortion targeting children from 2021 to 2022.
See New Mexico Brief, page 94, para 229.
See our PDF of the Kentucky Brief, page 8, para 19.
See our PDF of the New Mexico Brief, page 8, para 19.
See our PDF of the Kentucky Brief, page 41, para 126.
See our PDF of the Kentucky Brief, page 106, para 341.
See Slide 21 of Meta’s internal research slidedeck, “Teen Mental Health Deep Dive.”
Subsection (a) is the “prevention of harm to minors” subsection under the Duty of Care.
We note that Jon Haidt has a long record of promoting free speech and free inquiry. He is a co-author of The Coddling of the American Mind and is a co-founder of Heterodox Academy, a community of over 7,000 professors who advocate for open inquiry, viewpoint diversity, and constructive disagreement on university campuses. He is also a signer of the Westminster Declaration.
One risk is that Section 9 of the bill (reproduced below) looks like a Trojan horse for requiring everyone (adults and children) to verify their identity before using social media and other Internet services, as I don’t see how verification of age can be done without requiring everyone to identify themselves a priori. While I can envision ways to do this while maintaining anonymity, I am deeply skeptical that a study conducted by the government would err on the side of user privacy over government access. If ID is required to access social media, this would of course be a great boon to authoritarian regimes everywhere, as the ability to dissent anonymously à la Thomas Paine would be largely eliminated if people’s real-world ID were tied to every post. Any thoughts on this aspect?
SEC. 9. AGE VERIFICATION STUDY AND REPORT.
(a) Study.—The Director of the National Institute of Standards and Technology, in coordination with the Federal Communications Commission, Federal Trade Commission, and the Secretary of Commerce, shall conduct a study evaluating the most technologically feasible methods and options for developing systems to verify age at the device or operating system level.
(b) Contents.—Such study shall consider—
(1) the benefits of creating a device or operating system level age verification system;
(2) what information may need to be collected to create this type of age verification system;
(3) the accuracy of such systems and their impact or steps to improve accessibility, including for individuals with disabilities;
(4) how such a system or systems could verify age while mitigating risks to user privacy and data security and safeguarding minors' personal data, emphasizing minimizing the amount of data collected and processed by covered platforms and age verification providers for such a system; and
(5) the technical feasibility, including the need for potential hardware and software changes, including for devices currently in commerce and owned by consumers.
(c) Report.—Not later than 1 year after the date of enactment of this Act, the agencies described in subsection (a) shall submit a report containing the results of the study conducted under such subsection to the Committee on Commerce, Science, and Transportation of the Senate and the Committee on Energy and Commerce of the House of Representatives.
I learned my lesson with the Patriot Act.
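On the anonymity question raised above: it is at least technically possible to separate proving an age bracket from proving an identity. Below is a minimal sketch, in Python, of what a device-level attestation could look like under one set of assumptions: a hypothetical trusted attester (say, the OS vendor) signs a bare “not a minor” claim, and a platform verifies the signature without ever learning who the user is. Everything here (the attester, the claim format, the flow) is our illustration, not anything specified in Section 9 of the bill.

```python
# A minimal sketch of an anonymity-preserving age attestation.
# Hypothetical design, not from KOSA or any vendor: a trusted attester
# signs a claim containing an age assertion and nothing identifying.

import json
import secrets
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# The attester's keypair. In practice, platforms would obtain the public key
# out of band; we generate one here so the example is self-contained.
attester_key = Ed25519PrivateKey.generate()
attester_pub = attester_key.public_key()


def issue_attestation() -> tuple[bytes, bytes]:
    """Device side: obtain a signed claim that carries no identity."""
    claim = {
        "not_minor": True,               # the only substantive assertion
        "nonce": secrets.token_hex(16),  # fresh randomness, so tokens are unlinkable
        "issued_at": int(time.time()),   # lets verifiers reject stale tokens
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    return payload, attester_key.sign(payload)


def platform_accepts(payload: bytes, signature: bytes, max_age_s: int = 300) -> bool:
    """Platform side: verify signature and freshness; learn nothing else."""
    try:
        attester_pub.verify(signature, payload)
    except InvalidSignature:
        return False
    claim = json.loads(payload)
    fresh = time.time() - claim["issued_at"] <= max_age_s
    return claim.get("not_minor") is True and fresh


payload, sig = issue_attestation()
print(platform_accepts(payload, sig))  # True; the platform never sees an identity
```

Even in this toy version, the trust problem is moved rather than solved: the attester must have verified the user’s age once, and a careless design (for example, a per-user signing key) would make tokens linkable to a person. That is why the commenter’s skepticism about how the Section 9 study would balance privacy against verification is worth taking seriously.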