Don’t Let Big Tech Hide Behind a Rainbow Flag
Guest author Lennon Torres on why social media regulation is a prerequisite for LGBTQ safety.
Introduction from Jon Haidt and Zach Rausch:
A year and a half ago, we teamed up with Lennon Torres, senior campaign manager at The Heat Initiative and LGBTQ+ advocate, to write an article for The Atlantic, “Social-Media Companies’ Worst Argument” (reposted here with no paywall). Together, we refuted the social media companies’ claims that using these platforms is net-positive for teens in historically disadvantaged communities and that regulation would do more harm than good for adolescents in these groups.
Since then, however, these claims have continued to surface as an argument against regulation. In the piece below, originally published by The Hill, Lennon draws on her own experience as a trans woman who grew up sharing her life on social media. She argues that social media companies use LGBTQ+ kids as an excuse to avoid accountability and reminds the public that, despite what the companies claim, “queer people are the ones these platforms fail first and protect last.”
Thank you to Lennon and The Hill for allowing us to share this piece directly with After Babel’s readers. We hope you’ll read it and share it widely.
– Jon & Zach
Don’t Let Big Tech Hide Behind a Rainbow Flag
With Big Tech companies recently losing two key lawsuits over the harm they do to youth — both in rulings they have promised to appeal — a false narrative has begun to re-circulate. The claim is that requirements making digital communities safer for young people will somehow undermine queer expression.
Here is my message, coming from a transgender woman who grew up with and was badly harmed by exploitative social media: Do not let Big Tech hide itself behind a rainbow flag. The truth is, queer people are the ones these platforms fail first and protect last.
Many gay, transgender and queer kids lack supportive families and affirming schools. To them, digital spaces may seem like a lifeline — a place where they can be themselves. Unfortunately, those digital spaces are often built on the same logic that once targeted kids with cigarettes: Maximize use, minimize accountability and monetize vulnerability. These platforms were designed not to empower us but to get and keep us hooked.
In the social media addiction trial that recently wrapped up in Los Angeles, plaintiff attorney Mark Lanier asked Meta whistleblower Arturo Béjar how Facebook’s leadership dealt with the issue of “addiction.” Béjar replied: “They changed the name of it” — specifically, they stopped calling it “addiction” and called it “problematic use” instead. He added, “You couldn’t talk about it.”
I joined social media at age 13, just as the iPhone became the center of adolescent life. I was attending a performing arts school after five years at a public school where I was teased for being too feminine. I turned to Instagram, Facebook, Snapchat, and YouTube — platforms that gave me access to a community I had never had. But this came with life‑threatening side effects I couldn’t yet see clearly.
Online, I found attention — first from classmates, then from strangers. When I started working professionally as a dancer, hundreds of thousands of followers watched my every move. What felt at first like affirmation quickly became the only place I thought I had value. I got so consumed with how I was being perceived that authenticity didn’t stand a chance.
At some point, it stopped mattering whether the comments were praise or cruelty — what mattered was the hit. I began refreshing comments in bathroom stalls between classes and rehearsals, scrolling before bed and learning how to curate myself for algorithms I didn’t understand. The behavior was compulsive. I didn’t know to call it “addictive design” — I just knew I couldn’t stop scrolling.
Chasing the algorithm for validation wasn’t the only risk. The real danger often arrived in my private messages. Adults I didn’t know approached me with explicit messages and nude images. I was only 13, and I did not yet understand what grooming was. I did not have the language for it — I only knew that the attention I could not find offline seemed to appear online.
I know now that the platforms and their algorithms were delivering me up to these predatory strangers, serving them my profile as engagement bait.
The Los Angeles lawsuit pointed to internal Meta documents showing that Instagram’s “Accounts You May Follow” feature actively connects predatory adults to minors: “In 2023, this tool recommended to adult groomers ‘nearly 2 million minors in the last 3 months’ — and ‘22 percent of those recommendations resulted in a follow request.’”
Employees warned leadership. Leadership declined to fix the system, maintaining a 17-strike policy for predators, including sex traffickers, before suspending the offenders’ accounts.
The architecture of these platforms placed me in the path of adults who saw opportunity in a lonely queer kid. Because queer kids come to online spaces for identity and survival, we are the ideal product: highly engaged, highly vulnerable and highly profitable.
Big Tech claims to defend queer kids’ rights by opposing regulations like requiring age-appropriate design and limits on addictive features. In reality, they are using us as a shield to avoid accountability. They weaponize our dependence on online connection to argue that any safety guardrail is “anti‑LGBTQ.” They warn lawmakers that protecting kids will erase queer expression. This is a lie, and a strategic one.
In reality, features that harm young people — endless scroll, autoplay, compulsive engagement loops, recommendation pipelines driven by surveillance data, settings that expose kids to ill-intentioned adult strangers — do not create queer communities. They create dependency. They bury our identity in algorithms optimized for outrage, objectification and profit.
Queer kids do not need online platforms that claim to celebrate us in Pride campaigns while exploiting and exposing us to harassment at disproportionate rates. We need them to prioritize our safety and mental health.
I know this because I lived it. Only after a decade of anxiety, addictive patterns, algorithmic harm, grooming, and harassment could I finally withdraw from exploitative social media. Even then, the choice felt impossible. Most of my childhood had unfolded online. The most intimate parts of my life — my gender transition, top surgery, and coming out — became content opportunities to me. That is the cruelty of these platforms: They teach you to equate visibility with safety, engagement with belonging, and exploitation with connection.
Regulation is not a threat to queer expression but a prerequisite for queer safety. It won’t solve every problem, but it will do the first and most important thing: force the companies profiting from our attention to finally take responsibility for the harm they have caused.
Reprinted with permission from The Hill.
We made a minor correction to the original piece, replacing the word “algorithm” with the more precise “‘Accounts You May Follow’ feature.”