Snapchat is Harming Children at an Industrial Scale
In their own words, we see that Snap Inc’s design choices expose millions of kids to harm
Introduction
On October 1, 2024, investigative journalist Jeff Horwitz reported a startling statistic from an internal email quoted in a court case against Snap Inc., the company that owns Snapchat. The email noted that the company receives around 10,000 reports of sextortion each month—and that figure is likely “only a fraction of the total abuse occurring on the platform.”
This statistic prompted us to investigate what else Snap Inc. knows or believes about the impact of its product on users, particularly teens. (We estimate that roughly 13 million American 13- to 17-year-olds use Snapchat.) Over the past several months, we have examined multiple court cases filed against Snap Inc., many involving severe or fatal harm that was (allegedly) facilitated by Snapchat’s features. From 2022 through 2025, as part of the Multidistrict Litigation (MDL) and Judicial Council Coordinated Proceedings (JCCP) against social media defendants, more than 600 such lawsuits specifically named Snap Inc. as a defendant. In addition, state attorneys general from Nevada and New Mexico have brought significant cases against the company—two cases that we will draw heavily from in this post.
Following the format of our previous post about the “industrial scale harms” attributed to TikTok, this piece presents dozens of quotations from internal reports, studies, memos, conversations, and public statements in which Snap executives, employees, and consultants acknowledge and discuss the harms that Snapchat causes to many minors who use their platform. We group these findings into five key clusters of harms:
Addictive, Compulsive, and Problematic Use
Drugs and Guns
Child Sexual Abuse Material (CSAM), Sextortion, and In-person Sexual Predation and Assault
Cyberbullying
Knowledge of Harm and Underage Use, and Lack of Action
As with TikTok, we show that company insiders were aware of multiple widespread and serious harms, and in many cases did not act promptly or make substantial changes. As Snap’s director of security engineering said regarding Android users who are selling drugs or child sexual abuse material on Snap:
“That’s fine it’s been broken for ten years we can tolerate tonight.”2
With regard to sextortion on the platform, one employee had complained in a private channel:
“God I’m so pissed that were over-run by this sextortion shit right now. We’ve twiddled our thumbs and wrung our hands all f…ing year.”3
The briefs allege that the company is also aware of rampant underage use and of the ineffectiveness of its age-gating process. Snap executives have admitted that Snapchat’s age verification system
“[i]s effectively useless in stopping underage users from signing up to the Snapchat app.”4
Although the evidence below is all publicly available, no one has compiled and combined direct quotations from company insiders and internal reports across multiple alleged harms. We think this compilation gives vital information to parents, who might want some insight into the business practices of a company that hosts their children’s social lives, owns much of their attention, and influences their social development.
At the start of each section, we highlight a real-life example—drawn from relevant court documents—illustrating the specific harm in question. Each child’s story offers a human perspective on the broader statistics and quotations that reveal the far-reaching harms discussed within the company.
1. Our Conversations with Snap Inc.
While working on this post we had four conversations with Snap’s leaders and employees during which we asked them about some of the harms that appear in the various briefs. In three of those meetings, we asked specifically about the claim, taken from an internal Snap email that was quoted in the New Mexico brief, that Snap gets 10,000 reports of sextortion each month. We did not get any rebuttals or explanations of that claim.
Snap’s Trust and Safety team made a point that we think is valid and important for readers to keep in mind: the briefs we are drawing from present the allegations of one side in litigation, and there is often another side. Some of the quotations may have been misinterpreted or taken out of context. Snap’s Trust and Safety team pointed us to a motion Snap made to have the New Mexico case dismissed. We read that brief and found that it contested only a few of the many claims made in the New Mexico brief. Two of these claims had been in our list of quotations, so we cut one from the post below, and we added a comment to the other.
Snap’s Trust and Safety team also shared various measures they take to mitigate harm to children and teenagers. The Trust and Safety team said that child safety is their top priority and they told us that they proactively remove significant amounts of harmful content. In Snap’s motion to dismiss they state that Snap has “doubled the size of its Trust and Safety team and tripled the size of its Law Enforcement Operations team since 2020” which has “improved Snap’s ability to act quickly when Snapchat users report harassment or improper sexual content on the platform.”
We have no doubt that Snap removes large quantities of harmful content from its platform, and that it is trying to remove even more. However, it is difficult to assess whether Snap is solving 5% or 75% or 99% of the problem, since its metrics focus on the number of pieces of content removed rather than the percentage of Snapchat users who experience harm on the platform. Even if Snap were to remove a billion pieces of drug- or sex-related content each year, many teen users may still encounter such content every day. Any teen who wants to buy drugs may still find it easy to locate dealers, as has happened in many tragic cases of fentanyl poisoning, including very recent ones. Many teens may still report seeing sexual content, as there are many ways for users to be sexually explicit without violating policies.
From a parent's perspective: if you were choosing which summer camp to send your teen to, would it be reassuring to learn that a camp used to remove 100 sharks a month from its coastal swimming area, but now they remove 500 a month? Probably not reassuring at all. As a parent, you’d much prefer a camp that put its resources into prevention—such as by putting an effective shark barrier around the swimming area—rather than one that focused on catching sharks more quickly after campers report seeing their fins.
We also had the opportunity to discuss many of our observations about Snapchat’s features with both their leadership and their trust and safety teams. We appreciated their willingness to engage with us. We suggested to them some design changes that we believe would make the platform less addictive and less harmful:
Remove the Quick Add feature5 for minors, which is one of the main ways that adult predators and drug dealers get access to teens.
Remove the streaks feature,6 which causes many teens to send photos to each other compulsively, needlessly increasing their time on the app.
Remove beauty filters for minors.
Remove engagement-based algorithms, at the very least, for underage users.
Stop deleting posts on Snap’s own servers. The fact that Snap does not store the content of conversations (beyond a limited period) is helpful to drug dealers, sextortionists, and others with criminal intent, but it does not improve the experience of most children to know that, even if something goes terribly wrong, their conversations cannot be recovered by law enforcement officers.
Do a lot more to remove underage users. Snap is widely used in middle schools. (A 2021 survey by Common Sense Media reported that 13% of 8- to 12-year-olds said they had “ever used” Snapchat. We therefore estimate that in 2021 in the U.S., about 2.7 million children ages 8 to 12 had used Snapchat; see the arithmetic sketch just below.)
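For transparency, here is the back-of-the-envelope arithmetic behind that 2.7 million estimate. The population figure of roughly 20.8 million U.S. children ages 8 to 12 is our own assumption (about 4 million children per birth-year cohort across five cohorts), not a number reported by Common Sense Media:

0.13 × ~20.8 million U.S. children ages 8–12 ≈ 2.7 million children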
One broader request we made was to collaborate on publicly accessible user experience research that could help quantify and reduce the harms we describe below. It would be important to know precisely what percentage of kids receive unwanted advances on the platform, or know how to access drugs there, which are very different questions from the ones Snap generally answers publicly. Such assessments have been called for by company whistleblowers and public health experts.
We remain hopeful that Snap will act on some of these requests. Nonetheless, we decided to move forward with this post because whatever safety improvements have been made in recent years, and whatever improvements Snap says it will make in the future, we believe it remains essential for the public to understand the dangers associated with Snapchat, as expressed by Snap’s own employees and consultants, who are quoted in numerous court documents that have emerged in recent years.
While the company has clearly made efforts to address some of these concerns, transparency about its past actions and insiders’ beliefs about the platform’s impact is critical for those of us trying to understand: how did we get here? What on earth happened in the early 2010s such that there is now an international youth mental health crisis, increasing evidence of attentional fragmentation and declining functional intelligence, and countless cases of severe harm—from fentanyl-laced drugs bought on Snapchat to suicide after sextortion that began on Snap’s platform?
The quotations we give below indicate that the harms that occur on Snapchat (as with TikTok and Instagram) are so vast that even a highly dedicated trust and safety team that removes hundreds of millions of pieces of harmful content cannot prevent millions of children from being exposed to serious harms on their platform. This is why design changes are so urgently needed. Better content moderation is not enough.
Here’s a simple way to determine whether an online platform is safe for kids: Does it connect children to anonymous unverified adult strangers? If so, then a great variety of harms are likely to ensue, and parents should be wary of letting their children use that platform until the company makes very substantial design changes.
2. What We Have From Snap
We draw primarily from Attorney General Raul Torrez of New Mexico, who released a 165-page, narrowly redacted complaint against Snap Inc. on October 2, 2024. We also draw on internal evidence from the State of Nevada’s complaint, brought by Attorney General Ford and Co., as well as the second amended complaint in the civil action Neville et al. v. Snap.
We have created an annotated version of each of the three briefs (New Mexico, Nevada, and Neville et al.)7 so that you can see each selection we chose in the context of the rest of the brief.
There are also a variety of smaller individual and class-action lawsuits that address how specific features of Snapchat caused specific harms to individual minor users. Though we do not rely on these for internal quotes from the company, they offer the testimony of families who have been harmed by the platform.8
In the rest of this post, we organize the evidence of harm that is currently available to us, taken directly from employees and leaders within Snap Inc. and from reports that they commissioned.
3. The Harms They Believe or Know They Are Causing
[[Note that in this section, text in bold consists of direct quotations from company employees, consultants, executives, and internal memos. Text not in bold consists of direct quotations copied from the indicated portion of the indicated AG brief, which sets up the relevant quotation from company insiders. Italicized text in double brackets contains annotations from us — Jon and Zach. Note that we include brief summaries of real-life examples at the beginning of each harm cluster. These are written by Zach and Jon, drawing from quotes in the court documents. For each harm, we draw from the various briefs discussed above.]]
Harm Cluster 1: Addictive, Compulsive, and Problematic Use
[[According to the briefs, Snap Inc. designed its platform to maximize engagement and time spent by minors—thus driving problematic social media use. This is done through features such as push notifications, Snapstreaks, Snap Stories,9 and others.]]
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *.
[[In each harm cluster, we begin with the story of a child featured in a lawsuit filed against Snap. We share their experiences through direct quotes and, at times, by summarizing key details from the legal briefs. For ease of readability, we do not include italicizations or brackets for these stories.]]
Real life example: The following facts are alleged in the public complaint Neville v. Snap P. 117-124, PARA 481-517: Jack McCarthy got his first phone at 12 years old, and opened a Snapchat account without his parents’ knowledge or consent. Jack’s use of Snapchat “coincided with a steady decline in his mental health.” Jack became “locked into Snap’s social media product, as intended, causing him to feel like he couldn’t sleep without it.” When his parents tried to limit his access to Snapchat, Jack “became agitated… He would become visibly panicked and irrational, willing to do and say anything to get his device back.” Although Jack’s sleep and anxiety worsened, he would claim his “insomnia” would be made worse if his phone was not at his side. After years of Jack’s declining mental health and increasing dependence on the Snap platform, Jack obtained drugs through an anonymous dealer on Snapchat who had added him through Snap’s Quick Add and mapping features. Jack was found dead on his family’s kitchen floor on the morning of September 25, 2021.
On March 22, 2025, we corresponded with Jack’s mother via email. She explained that “Jack died from fentanyl poisoning not an overdose. Jack took one pill which unbeknownst to him contained fentanyl… Enough fentanyl to kill four people. Jack never stood a chance.”
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *.
Internal Evidence
New Mexico (NM)
NM P. 111, PARA 273
In January 2017, an internal email titled, “Snap streak distribution first look” highlighted several comments from Snap employees concerning Snapstreaks:
“Wow, we should add more addicting features like this.”
“Think it would be interesting to investigate how healthy Snapstreak sessions are for users… If I open Snapchat, take a photo of the ceiling to keep my streak going and don’t engage with the rest of the app, is that the type of behavior we want to encourage? Alternatively, if we find that streaks are addictive or a gateway to already deep engagement with other parts of Snapchat, then it would be something positive for “healthy” long term retention and engagement with the product.”
“70% of our DAU visit the app everyday, but only 22% have streaks going.”
“Most streakers are our core demographic.”
“We should answer at the highest level, whether streaks are a by-product of high engagement or a driver of it. My hunch is that it starts off being the former, but eventually becomes the latter - and we should figure out when that magical transition point occurs.”
NM P. 112, PARA 276
A December 2018 presentation titled “Understanding the Consumer and Snapchat Discover” outlined findings from online surveys and focus groups. The summary noted, “Streaks have become pressure filled…” and included data on users’ fear of missing out (FOMO):
“As the true digital natives, Gen Zs see their mobile devices as an extension of themselves, and while this allows constant access, it also creates constant pressure. There is never a break from the very real FOMO that exists.”
“Respondents in groups and via social media diaries expressed that if they’re not constantly checking social media they felt they were “missing” things (content, communications from friends, news, etc.).”
“45% of Snapchat Users 13-17 use Snapchat “almost constantly”
“41% of Snapchat Users 13-17 use Youtube “almost constantly”
“34% of Snapchat Users 13-17 use Instagram “almost constantly”
NM P. 113, PARA 278: In October 2019, a presentation acknowledged that “Streaks make it impossible to unplug for even a day” and that “Maintaining Streaks and keeping up with conversations… causes pressure,” which, heightened by notifications, can be stressful: (Fig. 42)
Nevada (NV)
NV P. 24, PARA 63
Disruptive use of Snapchat in the classroom was no surprise to Defendants. In the first post on Snapchat’s website, Defendants stated it was “thrilled” with the disruptions:
“[t]o get a better sense of how people were using Snapchat and what we could do to make it better, we reached out to some of our users. We were thrilled to hear that most of them were high school students who were using Snapchat as a new way to pass notes in class—behind-the-back photos of teachers and funny faces were sent back and forth throughout the day” [[You can still find this quotation on Snap’s website. The quotation continues like this: “Server data supported this and we saw peaks of activity during the school day and dips on the weekends.”]]
NV P. 51-52, PARA 157
As one example, in 2018, Defendants conducted internal research on SnapStreaks, which found that over a third of its users reported that keeping a Snap Streak alive was “extremely” or “very important,” and users further reported that the stress level they experience in keeping Streaks alive was “large” and even “intolerable.”
NV P. 52, PARA 158
Similarly, additional internal research demonstrates that Snapchat users are more compulsive in their use of the platform, engaging with it “right when I wake up,” “before work/school,” “during work/school,” “after work/school,” “on vacations,” and “when I’m with others[.]”
Harm Cluster 2: Drugs and Guns
[[According to the briefs, there is widespread exposure to violent and drug-related content on Snap. This content is often viewed via ‘Spotlight’11 and ‘Discover’, and is exacerbated by features such as ‘Quick-Add’.]]
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *.
Real life example: The following facts are alleged in the public complaint Neville v. Snap P. 80-86, PARA 299-322: It is believed that Alexander Neville started “using Snapchat sometime just prior to starting 8th grade.”
“As a proximate result of Snap’s products and features, i.e. push notifications, user recommendations, interface and operational extended use designs, rewards and gamification features, etc.– Alexander began suffering from severe mental health harms, including, but not limited to, social media compulsion, sleep deprivation, increased anxiety, and depression.”
“Snap also began directing and recommending drug advertisements to Alexander and connecting him to Snapchat Drug Dealers via its recommendations and mapping and location features.”
“[H]e received multiple Quick Add requests… Among the strangers to whom Snap connected Alexander were nearby Snapchat dealers –persons Alex did not know in real life.”
An anonymous dealer, “AJ Smokxy” sold Alex a pill that was “100% fentanyl.” Alex Neville was pronounced dead from fentanyl poisoning via that pill on the morning of June 23, 2020 at 14 years old.
“AJ Smokxy’s account remained active for roughly a year after Alexander’s death…”
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *.
Internal Evidence
2.1 Drug Exposure and Sales
New Mexico (NM)
NM P. 93-95, PARA 227-230
PARA 227: Internal documents show that Snap was aware that its platform was being used to market and sell illicit drugs. After an October 2019 news article described Snapchat's popularity with drug dealers, Snap’s Communications Director complained internally that while the company was “pushing back fiercely on the claim that illegal content was particularly bad on Snapchat… from what we can see, drug dealing—both buying and selling– has increased significantly.” She noted that dealers use Stories, which are recommended through Snap’s Discovery feed or set to allow communication with “Everyone” to “amass a huge amount of subscribers” with a “lack of repercussions”. While an account may be deleted if it is reported, “it is not necessarily device blocked, meaning accounts pop right back up. Nor is there any threat to the account being reported by law enforcement,” which complains about the “difficulty of apprehending bad actors on our platform.”
PARA 228: Indeed, later that year, meeting notes confirm Snap's recognition that “some bad actors prefer to transact on Snapchat given the ephemerality of communications on our platform.”
PARA 229: Snap employees also circulated media reports that dealers were finding buyers through its Quick Add feature and that, “per our analysis, on average at least ~700k Snapchatters are exposed to drug content daily in the areas we scanned.” A presentation by the security firm Crisp advised Snap, in a slide headed “Enabling Easy Access to Illegal Substances,” that: “It takes under a minute to use Snapchat to be in position to purchase illegal and harmful substances.” Still in 2022, another firm warned that Snap's features promoted the sale of drugs, warning that not only does Quick Add connect buyers and sellers of drugs, but that Snap's algorithm then “suggests users with similar names and profile types” and that “[a]rtificial intelligence is trained to link these similar accounts together” and that “adding drug or porn accounts leads to more suggested drug and porn accounts.”
PARA 230: In June 2020, Snap received a list of concerns from the Daniel Spargo-Mabbs (DSM) Foundation, a drug and alcohol education charity, regarding the availability of drugs on Snapchat. DSM noted, “It is far too easy to find accounts openly selling illegal drugs on Snapchat.” They further stated, “Snapchat is over-reliant on users reporting drug-related content, despite recognizing low levels of reporting by users.” In preparation for an August 2020 meeting with the founder of DSM to address concerns raised about drug dealing on Snapchat, an internal memo laid out Snap's approach to the meeting and draft responses. In part, Snap noted, “We apply different steps against illegal activity to different elements of the platform, some of which we do not publicize to prevent circumvention of those steps. The public side of Snapchat - our Discover platform - is curated and pre-moderated, which prevents opportunity for this kind of activity. When it comes to users' private communications - their Snaps, Chats and Stories - users do have a justifiable expectation that these aren't being monitored or scanned (just as is the case with iMessage, SMS, Whatsapp or private phone calls), and that's not something that we do. So we do rely on user reporting to alert us to illegal activity in this area....”
NM P. 95, PARA 232
In response to rampant drug trafficking on its platform, in 2021 Snap built Abacus, a ‘more proactive’ detection and enforcement model. An internal document noted, “Since we started in May, we have reviewed 1.5 million pieces of content for drugs, deleted a million of those and deleted half a million drug sale-related accounts. These dealers had previously gone undetected, and it is 35 times the number of dealers reported by end users.” “Based on our current detection we see an average of about half a million unique users being exposed to drug related content every day…”
NM P. 100-101, PARA 241-242
PARA 241: According to an undated internal Snap presentation regarding a new safety measure, Snap acknowledged that it had a “problem” with drugs and guns on the platform.
PARA 242: The Snap presenter turned first to drugs, highlighting news articles and a tweet on his wife's feed conveying the ease with which a user could sell cocaine that had “almost a half million likes.” The presenter's notes explained that dealers are using Snapchat's “sharing mechanisms” “to reach teens on Snapchat they would never encounter in real life” and that “some teens have even died as [sic] result of buying drugs that they found through Snapchat.”
Neville v. SNAP (NvS)
NvS P. 48, PARA 172
Snap’s own disclosures further establish that Snap only enforces on a small fraction of reported drug activity, while continuously representing that Snap is taking all necessary action to protect minors on Snapchat.65
FOOTNOTE 65: (reporting that Snap enforced on 270,810 of the 775,145 drug related reports it received during this recent six-month period) [[Snap Inc, Transparency Report January 1, 2022–June 30, 2022]]
NvS P. 54, PARA 195
Current and long-standing member of Snap Safety Advisory Board, Ed Ternan, also claims to have put Snap on explicit notice of what was happening on its platform in February or March of 2021 at the latest,
And we said to them, “you have a problem. What you don’t understand is that the pills being sold on your platform, they’re fake,” and their reaction was “what do you mean?” “Well, the Percs that are being advertised on Snapchat are not Percocet, that’s one thing. These are counterfeits made of fentanyl. You need to red flag this problem. You need to make this like child sex trafficking. This is child endangerment. You need to up your game.”
NvS P. 198, PARA 941
On more than one occasion, Snap itself told parents–behind closed doors and in writing–that it was aware of the fact that its young users “in fact much of society, remain frighteningly unaware of the opioid crisis and the deadly risks posed by counterfeit pills.”
2.2 Gun Exposure and Sales
[[According to the briefs, Snapchat has served as a market for illegal gun sales, connecting buyers to sellers through their search bar,12 Quick Add, algorithmic feeds, and Snap Map features.]]
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *.
Real life example: The following facts are alleged in the public complaint NM P. 100, PARA 239: “One New Mexico case demonstrates Snapchat’s use for gun-related crimes. Fourteen-year-old Ahmed Lateef and 15-year-old Collin Romero of Albuquerque were killed in 2018. The 22-caliber gun and bullets through [sic] Snapchat. Limited Snaps provided to law enforcement suggests that the victims met the seller through Snapchat. The three perpetrators, now serving life sentences, also recorded and saved on Snapchat’s Memories videos of beating their victims as they drove across Albuquerque. Snap did not report the activity to law enforcement at the time.” (P. 100, PARA 239).
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *.
New Mexico (NM)
NM P. 101-103, PARA 243-244
PARA 243: Turning next to guns, the employee continued down the Twitter thread to a user who responded with the image (Fig. 38) of gun he found for sale on Snapchat minutes before: The presenter notes to the slide (Fig. 39) explained, “[t]hese are not BB guns or hunting rifles, they are firearms and assault rifles” and “not registered, and they're often implicated in gang violence and murders[.]” Snap relayed that there were 50 posts related to illegal gun sales per day and 9,000 views per day of these marketed weapons. The presentation also acknowledged that “[m]ost bad content is not reported on Snapchat” and that even “[r]eported content is usually viewed hundreds of times before report.”
PARA 244: In response to a June 2022 Washington Post article titled, ‘Facebook's ban on gun sales gives sellers 10 strikes before booting them,’ Snap revisited its strike policy on weapons. One Snap executive noted, “Our strike system isn't yet activated at this point, and the silver lining there is that our draft approach can be adjusted without creating any operational headaches. For consistency across our enforcement framework, my bias is for launching the strike system with three consistent tiers - zero tolerance; 3-strike violations; and 5-strike violations - so here, we'd be contemplating moving weapons into the zero-tolerance tier. I'm very sensitive to the risks of weapon sales on our platform and I'm open to stricter prohibition. But I also appreciate our platform's primary use case is very different from TikTok's - enforcement of this prohibition on Snapchat would, for example, implicate user privacy [sic] expectations in ways that I wouldn't expect to be applicable at TikTok.”
Harm Cluster 3: Child Sexual Abuse Material (CSAM), Sextortion, and In-Person Sexual Predation and Assault
[[According to the briefs, Snap employees have been aware of rampant cases of sextortion, child sexual abuse material (CSAM) and predatory behavior taking place on their platform through features such as ‘Quick Add’, ‘Snap Map’ and ‘My Eyes Only.’13]]
3.1 Child Sexual Abuse Material (CSAM)
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *.
Real life example: The following facts are alleged in the public complaint NM P. 33-41, PARA 84-94:
“The New Mexico Department of Justice’s investigation uncovered an ecosystem of sites dedicated to sharing stolen, non-consensual sexual images from Snap accounts, some of whom appear to be underage.”
… “One of these dark websites includes a comprehensive handbook” that “describes Snapchat as an ideal vehicle for sextortion because of its intimacy and the belief in privacy, based on Snap’s promises of screenshot detection and its ephemerality settings.”
… “Snapchat was, by far, the largest source of leaked videos and images. Seller accounts openly captured, circulated, and sold sexually explicit content involving children on Snapchat and were recommended to users by Snapchat’s algorithm”


* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *.
Internal Evidence
New Mexico (NM)
NM P. 51-52, PARA 108-110
PARA 108: In November 2021, Snap circulated an external report that identified specific types of harm on Snapchat with examples. These harms included: facilitating sexual exploitation and grooming of children; child predator “capping” (the capture of a webcam conversation with a child, usually with the aim of getting them to perform sexual acts or undress); bad actor advice and requests to evade Snapchat safety measures; sale of CSAM; allowing users to share and trade CSAM; known predators directing minors to Snapchat; human exploitation and prostitution.
PARA 109: For example, numerous Snapchats included details of predators finding minors as young as 8-years-old through Snapchat or obtaining or selling CSAM: (Fig. 28 [[text transcribed below]])
TEXT FROM FIG 28:
“Ffrancious69: 13 year old punk girl i found on snapchat, up for chatting and more!”
“So guys last night I was on Snapchat when I saw Lucas train post he was horny. Sadly as much as I tried I couldn’t convince him to show me his dick but we had an interesting conversation. Obviously we talked about his journey as a TBM model and he had a lot of fun shooting and dancing for the photographer. He said that not only did he do nudes but there is footage of him doing stuff with the photographers son. I would assume this means that there is footage filmed on Snapchat of him and Xavier, he said that you can buy the[m] via DM on Snapchat and this gives me hope.”
“Freddy666 (@f666cosmo): Hi @yeraltin
I have regularly talked to an 11YO boy on a cam chat his cam freezes but he was so cute to let go and I was about to get to the good parts so I add him on Snapchat with my friends fake snap. And yes I did see what I wanted he did see my girl so far so good. Now he keeps typing to me he don’t want to see more or don’t ask for it, I keep putting him off as kind I can but he told me he don’t have any friends which broke my heart and I just can’t delete him.”
“DtravisBick287: Hey guys, I am chatting with this incredibly handsome and hot boy on snapchat and he has a dropbox that he sells. He won’t give it to me for free and I cannot afford it right now. Anyone know a way I can get hold of his content. He has solo videos and photos with his brother as well.”
“(Snapchat) my 8yo boyfriend 16 Videos / 52 photos: https://xxxxxxxxxxxxxxx.com thanks”
PARA 110: Snap complained that requirements to identify grooming would be too invasive of user privacy, an especially problematic position, given Snap’s age verification failures, and “would create disproportionate admin costs.” Snap also created a view that “[i]t shouldn’t be a private operator's responsibility to determine what constitutes grooming.”
NM P. 50, PARA 105-106
PARA 105: In one internal exchange, Evan Spiegel rejected a suggestion that Snap retain images it categorized as abuse, which would enhance the platform's credibility in administering its rules, shifting the burden to young users to capture and report the content. In comments, Spiegel wrote, “Yeah, except we don’t want to be responsible for storing that stuff. Better if they screenshot and email ghostbusters to report.”
3.2 Sextortion
New Mexico (NM)
NM P. 3-4, PARA 6
PARA 6: Adult strangers can then take advantage of Snap’s algorithm, its appearance of safety and impermanence, and features like Snap Map, which allows them to find and meet these children in the real world. For years, Snap has been on notice from both external and internal sources of the dangers its platform presents for children but has nonetheless failed to stem the tide of damaging sexual material, sexual propositions, and dangerous drugs delivered to children.
NM P. 4, PARA 8
Instead of implementing additional safeguards to address the unique susceptibility of Snapchat, Snap has done the opposite. While recognizing the need to ensure that “user reports related to grooming and sextortion are not continuing to fall through the cracks” and that “no action is taken by agents” in instances where users report “being sextorted or asked for nudes (which we know is often the start of sextortion),” Snap also complained internally that identifying and protecting minors from sexually explicit content and predatory users would overburden its moderators, “create disproportionate admin costs” and should not be its responsibility. Snap employees pointed to a “case where an account had 75 different reports against it in Oct ‘21, mentioning nudes, minors and extortion, yet the account was still active.”
NM P. 7, PARA 15
Former Snap trust and safety employees complained that “they had little contact with upper management, compared to their work at other social media companies, and that there was pushback in trying to add in-app safety mechanisms because [Snap CEO] Evan Spiegel prioritized design.”
NM P. 53-54, PARA 115-118
PARA 115: Nearly a year later, in March 2023, Snap noted another “gap” in addressing sextortion on the platform. In addition to finding that many sextortion reports are “typically not associated with violating media, and therefore, were not actionable under existing policies,” an internal chat noted that “an investigation of confirmed sextortion cases involving nine distinct bad actors and 279 victims concluded that 70% of victims didn’t report their victimizations (and of the 30% that did report, there was no enforcement action by our team for the reasons noted above).”
PARA 116: Snap employees also complained about being understaffed to appropriately handle trust and safety functions.
PARA 117: That same month [after an investigation revealing the volume of sextortion on the platform], another Snap internal thread flagged that the platform was “leaving a lot on the table with CSAM sales” and advocated applying a rule that would address “thousands” of child pornography Dropbox accounts. When told that the proposed solution would have to be evaluated by “legal and privacy” and discussed at the next group meeting, the employee questioned, “I would think our legal obligations to remove CSAM from our platform at least somewhat mitigates the burden of legal review for a Rapid Rule with a very high enforcement rate.” Later that day, on the same communication channel, Snap’s director of security engineering addressed a fix to address Android users who are selling drugs or CSAM on Snap: “that’s fine it’s been broken for ten years we can tolerate tonight.” With regard to sextortion on the platform, one employee had complained in a private channel: “God I’m so pissed that were [sic] over-run by this sextortion shit right now. We’ve twiddled our thumbs and wrung our hands all f…ing year. [...] My concern is not really the ‘what’ [[publicity on Snap’s lack of action on sextortion cases]] but the ‘when.’”
[[In Snap’s motion to dismiss, they respond to parts of PARA 117 stating: “The State also claims, for instance, that Snap lacks “urgency and commitment to addressing CSAM” because employees discussed that a “proposed solution” to CSAM “would have to be evaluated by ‘legal and privacy.’” (Id. ¶ 117.) However, the State omits that in the same communication, employees expressed confidence such review would be “simple and swift” because combatting CSAM is a “#1” priority for Snap. Contrary to the State’s insinuations, Snap’s employees act exactly as a responsible corporation should when faced with these issues—i.e., they work together to prioritize the safety and wellbeing of the Snapchat community and address potential criminal activity.”]]
PARA 118: As laid out above and below, Snapchat’s dangerous design features and platform management decisions, including, but not limited to, its algorithm, have made and continue to make it easy for predators to find, connect with, and harm young victims. Some of these features include ephemeral or “disappearing” Snaps, Quick Add, and Snapmap.
NM P. 56, PARA 123
The FTC said Snapchat had also failed to put up basic safeguards, such as verifying users’ phone numbers. Some users had ended up sending “personal snaps to complete strangers”... A Snapchat representative admitted that “while we were focused on building, some things didn’t get the attention they could have.”
NM P. 75, PARA 182
NCOSE [National Center on Sexual Exploitation] stated in part, “It is vital that Snapchat takes a more proactive approach to websites or online personalities funneling audiences toward Snapchat for sexually exploitative purposes. This is especially true, given Snapchat’s own admission on July 17, 2019, where they noted, “We are concerned predators are using other, less private, apps to locate potential victims and then steer them to Snapchat or other private messaging platforms.”
NM P. 78, PARA 191
Snap assures parents that the company “ban[s] public profiles for minors and friend lists are private.” Yet, Snap fails to tell parents that unknown adults can still contact their children through private chat requests, which creates a false sense of safety. Snap compounds this failure by filtering the communications available to parents; in Snap’s Family Center “Parents can only see who their kids sent a message to - not who has sent a message to their teen.”
NM P. 52-53, PARA 111-114
PARA 111: Snap employees on an internal Slack chat regarding trust and safety goals in January 2022 discussed the fact that “by design, over 90% of account level reports are ignored today and instead we just prompt one person to block the other person.”
PARA 112: Yet even these reports were often ignored. In August 2022, a Snap employee raised concerns about the need to take steps to ensure that user reports of grooming and sextortion were not “continuing to fall through the cracks,” making clear that Snap was aware of the ongoing problem and its failure to adequately address even the dangerous, violating conduct brought to its attention:
“I am surfacing this thread regarding the guidance previously provided to our vendor agents with the hopes of better understanding the existing guidance so that we can determine how we might need to expand it to ensure that user reports related to grooming and sextortion are not continuing to fall through the cracks. This afternoon [we] discovered that a quick search for the term “nudes” in OhSnap comments surfaces a number of tasks that entered the Account Reporting - Impersonation queue… in which the users’ reports detail the user being sextorted or asked for nudes (which is often the start of sextortion), but no action is taken by the agents. While we’ll need to be mindful of how our guidance to the vendor agents influences the flow of escalations to FTEs, I do think we should revisit this to make sure we are being adequately strategic and responsive to our users’ reports.”
PARA 113: Others agreed and commented:
“I think want [sic] to add criteria for escalating suspicious accounts, but also don’t want to overwhelm FTE Specialists”
“I’m glad you raised this, as it’s something I wanted to talk to you about after reviewing a big chunk of the 350 names sent to us by NCMEC last week, most of which were sextortion accounts.”
“...current guidance meant that vendors were not raising these for further review, so I’m sure this is something we should address straight away.”
PARA 114: Snap failed to disclose this security failure to its young users and parents.
NM P. 59-60, PARA 132-134
PARA 132: Indeed Snap was well aware–and failed to inform users, parents and the public–that sextortion was a rampant, “massive” and “incredibly concerning issue” on Snapchat. In a November 2022 internal email trying to confirm data queries, a T&S [Trust and Safety] team member stated, “They indicate that we are getting around 10,000 user reports of sextortion each month. If this is correct, we have an incredibly concerning issue on our hands, in my humble opinion. It seems to me that having an accurate understanding of the magnitude of this issue is extremely important given the psychological impact of sextoriton [sic] on victims, especially when those victims are minors.”
PARA 133: A T&S Investigations employee replied:
“I think our teams understand this is a huge problem. Curious if Exec is aware just how massive and impactful the scale of the issue is. Worth noting that 10k monthly reports likely represents a small fraction of this abuse as this is an embarrassing issue that is not easy to categorize in reporting.”
PARA 134: A December 2022 draft Snap Marketing Brief titled “Sexting and Sextortion,” recognized that adults were targeting minors for “deeply pernicious and dangerous” conduct on the platform but did not want to “strik[e] fear” among its young users:
In the eyes of many, Snapchat is associated with “sexting” - and believe it’s what the app was designed for. It is undeniable that over the last 10 years, “sexting” or sending of nudes has become common behavior across many age demographics. Sexting has become a “regular behavior” amongst Generation Z, and we know it happens on Snapchat. In many (though by no means all) cases, sending what seems like run-of-the-mill sexual content can lead to disproportionate consequences and severe harms.
We believe that one of the upstream issues for many (but not all) of these harms involves young people being friended by individuals that they don’t know in real life and furthermore being able to recognize demands for sexual content, the performance of sexual acts and other suspicious activity that can lead to sexting/sextortion cases. Reporting violating content or concerning contact with/behavior by strangers is a key action that teens/Snapchatters can take when confronted with these situations.
As a platform that has significant reach and engagement with the Gen-Z community, we recognize our responsibility to ensure teens are educated and informed about the potential consequences of some of the behaviors that currently feel very normalized.
We are keen to avoid a finger-wagging tone and want the key messages to be shared in an informative and non-judgemental way. We can’t tell our audience NOT to send nudes; this approach is likely futile, “tone deaf” and unrealistic. That said, we also can’t say, ‘If you DO do it: (1) don’t have your face in the photo, (2) don’t have tattoos, piercings or other defining physical characteristics in view, etc.’ Bottom line: We cannot be seen as aiding and abetting the production of (at a minimum) child sexually exploitative material. We need to run through a very thoughtful messaging & visual storytelling exercise/session on how to best balance education without striking fear into Snapchatters. (emphasis added) [[The original quote bolds “without striking fear into Snapchatters”]]
NM P. 70, PARA 167
Additionally Snap’s internal documents also contain a “Sextortion handbook” which shows how to use Snap Maps to target a school where they can, “tap on the screen to view any snap stories that might have been shared by students who share snap stories with the ‘snap maps’ options enabled.” [[This is referencing a “sextortion handbook” that was developed by sexual predators for sextorting minors using Snapchat.]]
NM P. 50-51, PARA 107.
Snap continued to discuss–internally–evidence of ongoing child sexual exploitation on its platform. An internal email dated June 7, 2021, noted “Flagging this piece looking at the % of child sexual assaults that were facilitated by technology. Between 2007-2013 FB was the highest, then dating apps until 2017, after which Snapchat is recorded as the most used platform.” The attached article, “Jump in sexual assaults of children groomed online,” called out the prevalence of child sexual abuse on Snapchat stating, “They found a big upswing since 2015 in perpetrators using social media platforms, especially Snapchat and dating sites, to communicate with children aged between 12 and 17 before meeting and assaulting them.” The article continued, “In the early years of the study, between 2007 and 2013, three-quarters of offenders had used Facebook to communicate with child victims, but between 2014-2016 dating apps, many that children should be too young to access, started to feature. Between 2017 and 2020, Snapchat had been the platform employed by nearly half of offenders.” [[You can find this article here.]]
NM P. 62, PARA 142
In May 2021, Snap employees discussed this ongoing problem in an internal email titled “Responsible growth initiative,” stating, “We need to come up with new approaches that ringfence our most vulnerable users (minors) and make it hard for predatory users to find them via quick add, search, etc. We believe we can achieve this without meaningfully degrading the product experience for these users if we pursue new strategies in inventory generation/constraints and other techniques to more effectively silo minors from people outside their networks. This is probably the most important long-term thing we need to work on…” One employee continued, “I wish we had more metrics to frame these clearly. What does success look like if we make progress here - obviously large-scale friending spam numbers goes down but what about low-grade “creep” attacks. How does proactively playing D here help us unlock more growth?” As this email indicates, the choice to address features that introduced minors to predators was harnessed to, and would only be pursued to service of, Snapchat's growth.
NM P. 65-67, PARA 149-155
PARA 149: Thus, a February 2022 PowerPoint prepared by Snap's consultant reported that “many young people reported being added by bots on Snapchat. This seemed to be particularly pervasive issue through 'Quick Add' feature as people described being added and receiving unsolicited messages from unknown senders.”
PARA 150: It was clear to Snap that allowing minors to be recommended to users with two friends in common failed to provide meaningful protection to children. A May 2023 internal email described results from a quality assurance test session to “pressure test the friending, chat, and registration experience for minors.” The findings included:
“Minors may receive a ton of random Quick Add suggestions: In tests where someone registered as a minor with their “Contact Book” sync off, they received random Quick Add suggestions once they added 1 – 2 friends. Alternatively, if you add multiple 18+ accounts, a lot of your Quick Add suggestions are adults... Minors can communicate with adults they are not friends with through group chats: There are a number of ways minors (or anyone) can be added to group chats without being friends with people who could abuse them. [[redacted text]] These are difficult features to solve for, but we may want to consider exploring additional safeguards for minors as they pertain to group messages and invite links (e.g. callouts that they are joining a group with people they may not know/aren't friends with; warning when clicking invite links; etc).”
PARA 151: Thus, Snap recognized that restricting Quick Add to friends of friends still exposed minors to introductions to adult strangers. If one or more minors in a network fall victim to an adult groomer, that pedophile can contact everyone in that network.
PARA 152: Snap acknowledged internally that “Bad actors” would groom 2-3 friends on other platforms, such as gaming platforms, in order to jumpstart the algorithm to suggest additional minor friends.
PARA 153: Consistent with Snap's own findings, the 2023 Federal Human Trafficking Report noted that Snapchat was one of the “Top Platforms used in the recruitment of Victims 2019-2023.”
PARA 154: On January 31, 2024, Snap published the written Congressional testimony of Evan Spiegel on its Safety Blog, quoting his statement that: “Snapchat's default “Contact Me” settings are set to friends and phone contacts only for all accounts, and can't be expanded.”
[Quote by Spiegel] “We want Snapchat to be safe for everyone, and we offer extra protections for minors to help prevent unwanted contact and provide an age-appropriate experience. Snapchat's default “Contact Me” settings are set to friends and phone contacts only for all accounts, and can't be expanded. If a minor receives a friend request from someone they don't share a mutual friend with, we provide a warning before they start communicating to make sure it is someone they know. As a result, approximately 90% of friend requests received by minors on Snapchat are from someone with at least one mutual friend in common. Our goal is to make it as difficult as possible for people to be contacted by someone they don't already know.”
However, an internal survey conducted by Snap's Product Research team in August of 2022 indicated that Snap users of all age brackets can toggle their “Contact Me” settings to “Everyone.” In addition, the survey showed that a large number of users who had “Everyone” enabled to contact them were under the impression that “Everyone” only applied to “Just Friends that I Added” (24.8%), a percentage that was highest for users in the youngest age brackets (13-17 and 18-24).
PARA 155: Additionally, Snap's search term tool allows unknown adults to identify minor accounts. In January 2019, in an internal discussion regarding how to respond to a press inquiry in the United Kingdom, a Snap T&S employee admitted, “I wasn't aware that you were able to use search terms to bring up accounts. Using 'underage' just now there are accounts like 'These Girls R Underage' or 'Underage Nudes' and 'Nude Underage Girls'. I thought that you needed to know an account name in order to be able to search for accounts.” This design defect provides another means for adult predators to find and solicit minors on Snap's platform.
3.3 In-Person Sexual Predation and Assault
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *.
Real life example: The following facts are alleged in the public complaint New Mexico v. Snap, P. 33, PARA 83: An 11-year-old girl was introduced to an anonymous user under the Snapchat account “sugar_daddy4u29” “through Snapchat’s Quick Add feature.” This user (who was a 27-year-old male, identified in the complaint as Marquez) “offered her money and she agreed to meet him in person, where, feeling pressure to do something, she performed oral sex on him. The girl continued to communicate with Marquez on Snapchat and arranged to meet him again on several occasions, where he again sexually assaulted her.”
Real life example: The following facts are alleged in the public complaint Nevada v. Snap P. 49, PARA 143: “In another instance, a 25-year-old used Snap Map to hunt down and sexually assault a 16-year-old in Florida. Per a local news report, the man: [U]sed Snapchat to reach out to the girl, then, unbeknownst to the teen, track[ed] her down in real-time using the Snapchat feature called Snap Map. ‘Our victim posted a life story, and then he used Snap Map to track her down because of the meta tags that's [sic] in the photo,’ [a law enforcement official] said. Detectives said the Snap Map allowed the suspect to see the data the teen posted and know just where she posted it from. [‘]If you don’t hide your location where you make that [sic], take that photo or that posting from, they can use data that's hidden in the photo to track you down,’ [the same official] said.”
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *.
New Mexico (NM)
NM P. 68-69, PARA 158-164
PARA 158: Snapchat was long aware of the potential safety issues regarding Snap Map from direct user reports. For example, in June 2017, Snap’s T&S Team Leads discussed “Snap Map Privacy Concerns.” The first report:
“2. User wrote in stating the following:
1. I was at a party and a group of older men found us all on snapmap and they tried to come in and threatened to hurt us. We put our bitmojis on ghost but…
2. My account was hacked and the hackers could of seen my location from the snapmap and this makes me extremely worried and concerned
I don’t feel this feature is safe and even if you people were told to put themselves on ghost mode, what would the relevance of the feature be then anyway. It’s endangering people.”
PARA 159: Snap T&S employees internally voiced disbelief that there was a way “for a group of strange men to find them on Snap Map without being friends,” but responded to the user encouraging her to report any crime to law enforcement. The T&S representative also informed the user about “Only Me (Ghost Mode)”; the feature a user can affirmatively activate to prevent being “visible to anyone else on the Map.” However, they acknowledged, “even with Ghost Mode enabled, if you choose to submit a Snap to Our Story, it may show up on the heat map for Snapchatters to view.”
PARA 160: On July 16, 2017, Snap employees circulated an article which included an interview with the Chief Executive of Parent Zone, who warned of the risks to children posed by Snap Map. She noted the connection between Snap Map and fear of missing out (FOMO) and social exclusion—particularly powerful with adolescents—but also noted, “We very rarely say this, but in this instance we are saying ‘This feature is adding nothing to your life and it's a threat to your security, so turn it off.’” [[You can find this article here.]]
PARA 161: In the same article, a television show panelist noted the danger of “strangers or online acquaintances [users] have never met in real life” being able to see children's exact location. The article reported that police used a decoy account to “pinpoint where videos of an 18-month-old toddler, a two-year-old girl and teenagers drinking alcohol at parties had been made.”
PARA 162: In a November 2020 internal document, Snap acknowledged that “Previously public content (e.g., posts to the Map) could generate 'Friend Requests' from illegitimate friends (people who the account holder did not know and may not have wanted to be connected with).” Thus, Snap Map might not only disclose a user's current location but allow followers to stay in touch with that user.
PARA 163: In September 2022, Snap employees proposed additional safety controls for Snap Map and acknowledged that it was making young users even more vulnerable to predators, including from friend requests from strangers: “My only suggestion is we consider you ACCEPT a friend request from someone who appears outside your normal friend graph (e.g., - no friends in common). A lot of the predatory/abusive friending that leads to real world harms will typically happen on an inbound basis rather than an outbound basis, i.e., usually the predator is trying to add a lot of kids, rather than the other way around.”
PARA 164: Snap employees agreed, stating, “Underaged users become even more vulnerable if the predators make friends with them and see their trail on the map.”
NM P. 70, PARA 165-166
PARA 165: On June 25, 2024, Snap stated that “Snapchatters can only ever share their whereabouts with their existing Snapchat friends — there is no option to broadcast their location to the wider Snapchat community.”
PARA 166: However, an internal custodial document titled, “Snap Safety and Privacy Principles for Minors (13-15-year-olds),” stated that “geofilters” were considered a “residual risk” in terms of “expos[ing] precise location of minors beyond their opted-in friends.”
NM P. 75, PARA 183
Despite its public statement regarding its commitment to privacy, Snap knows that its privacy settings are frequently misleading, especially to young users. A survey by Snap’s Product Research team revealed that one-quarter of users thought that “enabling ‘Everyone’ to contact them applies only to ‘Just My Friends that I added’....”
Harm Cluster 4: Cyberbullying
According to the briefs, the company is aware of cyberbullying on its platform that is exacerbated by specific design features, including anonymity and the app’s ephemeral nature, seen in disappearing photos and messages. Between 2019 and 2021, Snap enabled third-party apps such as YOLO and LMK15 on its platform. These apps were often used by teens to post anonymous polls and Q&As. Note that Snap did remove such apps from its platform after the Bride v. Snap litigation.
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *.
Real life example: The following facts are alleged in the public complaint Bride v. Snap Inc. P. 18-21, PARA 66-85, P. 2 PARA 6, P. 23 PARA 89, P. 52 PARA 209: Carson Bride was 16 when he took his own life by hanging himself at his home on the morning of June 23, 2020.
“On or about July 4, 2020, it was revealed that Carson had been bullied on Defendants’ apps Snapchat, YOLO and LMK prior to his suicide. After Carson ended his life, two psychologists who provided care to Carson and his family opined that Carson’s suicide was triggered by cyberbullying.”
“Upon information and belief, from January 23, 2020 to June 22, 2020, Carson received 105 messages via YOLO… of the 105 anonymous messages Carson received via YOLO, 62 messages included content that was meant to humiliate him, often involving sexually explicit and disturbing content.”
Through Carson’s internet search history, investigators concluded that Carson made multiple attempts to reveal the identities of the bullies. “Carson relied on YOLO’s misrepresentations that it would reveal the identities of aggressors” on their platform. Snap also failed to deliver on its statements that “it would remove any third party apps that allow bullying and harassing behavior on its platform.”
“On the first screen of the user’s interface with the app, YOLO states, ‘No bullying. If you send harassing messages to our users, your identity will be revealed.’” The lack of follow-through on this statement is seen in Carson’s internet search history, which includes items such as “YOLO Identity Reveal,” and in his multiple in-app attempts to have abusers “‘S/U’ (Swipe Up) to reveal their identities,” with no results. It is further evidenced by Carson’s final attempt to find out who was sending him abusive YOLO messages: on the morning of his suicide, his last phone search was “Reveal YOLO Username Online.”
You can read an essay from Carson’s mother, Kristin, about her story and her experience dealing with Snap Inc. here.
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *.
Internal Evidence
New Mexico (NM)
NM P. 60-61, PARA 136
Snap's own research demonstrated that ephemerality was directly connected to parental concerns about the safety of their children. The July 2023 Snap Parent Perceptions Research noted, “Core Snap features – specifically ephemerality, location sharing, and streaks – are directly connected to specific parental concerns like bullying, inappropriate contact with either peers or strangers, and mental health.” In the study, parents' views on Snap's ephemeral messaging were highlighted:
“Ephemeral messaging is what parents most strongly associate Snapchat with, and in turn, this feature creates the most concern for them
“Ephemerality exacerbates parents’ worries about their inability to properly supervise their teens’ communication with friends, or even strangers.
“Parents also believe this aspect of Snapchat encourages their teens to behave without regard for possible consequences, and enables cyberbullying, contact from strangers, or inappropriate behavior such as the sending or receiving of sexual or explicit messages, images, or videos.
“While somewhat more rare, a few parents raised concerns about ephemeral messaging making Snapchat an easier platform on which to conduct the purchase and sale of illegal drugs. Topical concerns about drugs and sextortion were quite rare and infrequent.”
NM P. 126-127, PARA 317
Given Snapchat's disappearing messages and popularity with minors, the platform serves as a hub for cyberbullying and harassment — with bullies having little to no fear of consequences. In a February 2022 “In-App Reporting Research” deck, Snap's consultant found, “cyberbullying, both anonymous and from known contacts, was a commonly cited problem... Disappearing messages can embolden bullies to harass people with less fear of consequence.”
Harm Cluster 5: Underage Use and Lack of Age Verification
Snap sets a minimum age of 13, but its age verification system relies on users honestly reporting their own birthdays and is easily bypassed.
Internal Evidence
New Mexico (NM)
NM P. 3, PARA 5
Teens and preteens can easily register for unrestricted accounts because Snap lacks any meaningful mechanism to verify their ages - a child-endangering design failure that Snap has known for years. Indeed, in 2022, a Snap executive emailed: “I don’t think we can say that we actually verify….” And Snap’s platform facilitates underage use even though Snap has the capability of both determining that users are minors and providing warnings or other protections against material that is not only harmful to minors but poses substantial dangers of solicitation, trafficking, and other harms.
NM P. 18-19, PARA 61-62
PARA 61: In a March 2022 internal email thread regarding Snap’s response to U.S. & Global age verification legislation, Snap’s Senior Director of Public Policy International responded, “There’s only so many times we can say ‘safety-by-design’ or ‘we’re kind’. Politicians and regulators are looking for tangible, substantive progress/initiatives. I’m not saying we should do that because we’re told to do so, but we should be aware that our current position, having used it for so long, is wearing very thin. Age assurance, in particular, remains a real weakness.”
NM P. 116, PARA 289
“Currently this type of [suggestive] content equates to ~5% Spotlight Story views for 13-17-year-olds globally.”
Nevada (NV)
NV P. 67-68, PARA 202
…Defendants’ executives have admitted that Snapchat’s age verification “is effectively useless in stopping underage users from signing up to the Snapchat app.” Not surprisingly, underage use is widespread. As of 2021, 13% of children aged 8-12 use Snapchat.
NV P. 68, PARA 203
Snap routinely obtains actual knowledge that its Youngest Users are on Snap’s platforms without parental consent. A UK report from March 2023 supports this proposition. Ahead of Britain’s planned Online Safety Bill, TikTok and Snapchat were asked how many suspected users under the age of 13 they had removed from their platforms in a year. TikTok reported that between April 2021 and April 2022 it had blocked an average of around 180,000 suspected underage accounts in Britain alone every month (totaling around 2 million over the 12-month period). For the same period, “Snapchat had disclosed that it had removed approximately 60 accounts per month, or just over 700 total.” A source within Snapchat acknowledged that “It makes no sense that Snapchat is blocking a fraction of the number of children that TikTok is.”
Conclusion
Snapchat has been running an advertising campaign with the theme “Less social media. More Snapchat.” The implication is that Snapchat is not social media; it is just a way for close friends to keep in contact with each other, similar to texting but with more photos.
If that were truly the way nearly all of its users used it, the platform would not be particularly harmful and there would be no need for this post. But as multiple legal briefs and hundreds of quotations have shown, design choices made both long ago and more recently have turned the platform into something that shares a great deal with Instagram and TikTok. In addition, Snapchat’s unique combination of Quick Add, disappearing messages, Snap Map, and no record of the content of conversations makes the platform particularly well suited to ill-intentioned adults who want to interact with or sell things to children.
As Sarah Wynn-Williams recently said about Meta, “It didn’t have to be this way.” The same is true for Snapchat. As we have suggested in this post, Snap could fix many of these problems quickly if it made the platform less addictive to children and less inviting for criminal activity. For example, it could:
Work harder to identify underage users and remove them from the platform. At present, at least several million children under 13 in the U.S. alone have Snapchat accounts. Social media companies could age-gate in a variety of ways if they wanted to.
Remove the Quick Add feature, which is one of the main ways that adult predators and drug dealers get access to children and teens.
Remove the streaks feature, which leads many teens to send photos to each other compulsively, needlessly increasing their time on the app.
Remove beauty filters for minors.
Remove engagement-based algorithms, at the very least, for minors.
Stop deleting posts on Snap’s own servers. The fact that Snap does not store the content of conversations (beyond a limited period) is helpful to drug dealers, sextortionists, and others with criminal intent, but it does not improve the user experience for most children to know that, even if something goes terribly wrong, their conversations cannot be recovered by law enforcement.
We understand that it is very difficult to run a platform used by hundreds of millions of people in many countries. But if you operate a platform that is central to the lives of children, including millions of 10-12-year-olds, then you have a moral responsibility to make design choices for their benefit, even if those changes reduce engagement and revenue. You can’t just pull more sharks out of the water. You have to put up a shark barrier, no matter the cost.
We are grateful that Snapchat has been willing to engage with us, and we recognize that their Trust and Safety team is working hard in a difficult and ever-changing environment. We hope to continue our conversation with them, and we hope to write about the platform in the future with news of major improvements.
But in the meantime, we believe that the quotations we have presented—from Snap’s leaders, employees, and consultants—provide strong evidence for parents and legislators to take action on our second of four norms16 for rolling back the phone-based childhood: No social media before 16.
Social Media Victims Law Center is currently litigating 394 lawsuits in the JCCP and 197 in the MDL against Snap Inc. These figures are an underestimate, since they count only the cases being handled by this one law firm.
New Mexico v. Snap Brief P. 54, Para 117
New Mexico v. Snap Brief P. 54, Para 117
Nevada v. Snap Brief P. 67-68, Para 202
Quick Add: a list of potential friends, generated by Snap’s artificial intelligence, that users can add as friends with a click of the “Add” button. Once a new friend is added, that person can directly send the user disappearing messages and Snaps. Quick Add can be turned off in settings. It would be much safer to set it to off by default and require users to turn it on if they want the feature. We believe it should not even be offered to minors.
Snapstreaks occur when two users exchange at least one Snap in each of three consecutive 24-hour periods. When the streak is achieved, users earn a fire emoji next to their profile avatars, and incentives are in place to continue the streak indefinitely.
New Mexico v. SNAP: A 2024 legal brief of a lawsuit filed by State of New Mexico Attorney General Raul Torrez against Snap Inc. in the First Judicial District Court in Santa Fe County, New Mexico.
Nevada v. SNAP: A 2024 legal brief of a formal complaint and demand for jury trial filed by the Plaintiff State of Nevada, by the Office of the Attorney General, Bureau of Consumer Protection against Snap Inc. in the District Court of Clark County Nevada.
Neville v. SNAP: A 2023 brief covering the complaint and demand for jury trial class-action lawsuit initiated by the families of multiple fentanyl overdose victims as the plaintiffs in the Superior Court of Los Angeles, California against Snap Inc.
One such brief is Bride v. SNAP, a 2021 brief of the class action complaint filed by the parents of suicide victim Carson Bride, along with several other parents of teen suicide victims. The complaint demanded that Snap remove the third-party Snap Kit apps YOLO and LMK due to cyberbullying, and it accused Snap of knowingly hosting apps linked to abusive behavior.
Stories are public/friends-only photo/video posts that disappear within 24 hours.
Snapchat Discover: Snap’s curated, feed-based feature in which content is produced only by verified users. Snap uses an algorithm to determine the feed. Verified users (well-known publishers and content creators) are invited by Snapchat to have their content placed on Discover.
Spotlight: the feed where users can view stories and video posts based on Snap’s algorithm and the posts’ popularity.
Search bar: enables users to find other users, as well as specific content. Its ‘intended’ use is to find friends’ profiles by searching their names; it is also used to find specific genres of content through hashtags.
My Eyes Only: an encrypted “vault” where users can privately and permanently save Snaps and Stories that were once temporary. The content is encrypted and protected with a password (only photos and short videos under 10 seconds can be saved).
Family Center: an in-app feature whose marketed purpose is to give parents a way to monitor their child’s use of the Snapchat app.
YOLO and LMK were apps designed to allow Snapchat users to send messages anonymously.
In The Anxious Generation we advocated for four new norms that, if enacted as norms or as laws, would help parents and teens escape from multiple collective action problems: 1) No smartphones before 14, 2) No social media before 16, 3) Phone-free schools, and 4) More independence, free play, and responsibility in the real world.