The Tech Industry’s Playbook To Prevent Regulation
The tech industry is using an old trick: shifting responsibility entirely onto us
Intro from Zach Rausch and Jon Haidt:
In Chapters 9 and 10 of The Anxious Generation, we examine how social media companies are themselves caught in a collective action trap. Each feels compelled to introduce more addictive features and reach younger users because failing to do so risks falling behind rivals who will. It’s exactly these kinds of industry-wide dilemmas that make regulation necessary.
Legislation helps overcome the collective action trap by ensuring that all companies shift their behavior simultaneously. Yet industries often fiercely resist moving from an unregulated environment to one with these safeguards. When the government tried to regulate the tobacco industry, for example, tobacco companies funded political campaigns, employed dedicated lobbyists, spread misinformation about the health risks of tobacco, and created front groups designed to look like grassroots movements. Social media companies—particularly Meta and TikTok—are engaging in similar practices right now to avoid the passage of the Kids Online Safety Act, and it’s important for the public to recognize these patterns. (For the latest on KOSA, and how Elon Musk and Linda Yaccarino, the CEO of X, have helped get it back on the negotiating table, see here.)
To better understand the playbook some social media companies are now using to avoid meaningful regulation (a pattern that stretches back to Big Tobacco), we reached out to Gaia Bernstein, a law professor and author of Unwired: Gaining Control over Addictive Technologies. In this post, Bernstein highlights the strategy of corporate blame-shifting: diverting full responsibility onto users, including children and overwhelmed parents.
— Zach and Jon
The tech industry1—and especially social media companies—has long resisted regulation. Tech companies persistently claimed they could fix their own problems.2 For years, this strategy worked.3 But accumulating scientific evidence documenting harm to children, combined with damning whistleblower testimonies, created change. These days, few believe that Big Tech will or can regulate itself.
Since the end of the pandemic, a wave of legislative action has emerged across the country and in Congress. Hundreds of bills targeting excessive screen time and other online harms have sprouted; some have already become law. This movement includes the Kids Online Safety Act (KOSA), which passed the Senate with overwhelming bipartisan support and is now before the House. The proposed solutions are many, ranging from setting an age minimum for children's social media access to imposing duties of care on online platforms and eliminating addictive design features. But while seemingly diverse, these solutions converge into two mutually exclusive models and a hybrid option. Each alternative endorses a different idea of who should be the gatekeeper over kids' online safety.
Should tech companies, parents, or both be responsible for protecting kids online? One model is the Tech Liability Model, which holds technology companies responsible for preventing harm. For example, some laws require social networks, games, and other digital platforms to eliminate design features known to manipulate user behavior and extend screen time. The other model is the Parent Gatekeeping Model, which shifts responsibility to parents by requiring platforms to provide parents with monitoring tools and controls. For example, some laws require parental consent for minors to use social media. The third alternative is the hybrid approach, where legislatures combine the two models, giving both tech companies and parents roles.
What would the tech industry prefer? No doubt its first choice is to avoid regulation altogether. But if regulation is inevitable, it would opt for laws that place exclusive responsibility on parents. Tech companies endorse the parent as the gatekeeper.
Tech Resistance Follows an Old Playbook Drafted by the Tobacco Industry
The tech industry is in the business of innovation. Yet its response to evidence of child harm shows little creativity. Tech companies are following a well-established playbook, which originated with the tobacco industry. For decades, industries have resorted to this playbook when evidence revealed that their products harm their consumers. In my book Unwired: Gaining Control over Addictive Technologies, I looked at how industries developed this playbook, and how the tech industry applied it.
A key strategy is shifting responsibility from the industry to consumers: “You chose to use our product, so you are fully responsible for the harms.” The tech industry applied this strategy by extending responsibility to the parents of its minor users. It has so far implemented the strategy in two steps:
Declaring that kids chose to use their products, and that parents are responsible for their children’s choices.
Providing parental controls that shift responsibility for kids’ online access to parents.
These two steps then lead to a logical third: opting for parent gatekeeping laws.
Let us look into each of these steps in more detail to better understand how this plays out in the real world, drawing on historical examples.
The Playbook, Step 1: It’s Not Us, It’s You
When smokers and their families sued the tobacco industry for smoking-related devastation, including lung cancer and early death, tobacco companies quickly resorted to shifting the responsibility. For decades, smokers lost lawsuits because the tobacco industry argued that they chose to smoke and were, therefore, responsible for the consequences.4
The tech industry started out by employing the same strategy. Confronted with challenges to its addictive features, tech companies argued that users voluntarily engage with their platforms. When the Federal Trade Commission evaluated restricting loot boxes, an addictive feature5 common in video games, video game manufacturers claimed that players choose to play and spend money; no one forces them to do so.
The Playbook, Step 2: Our New Tools Will Help You Control Yourselves
These industries don't just blame users; they also provide "solutions" that reinforce their strategy of shifting responsibility. They offer products or tools that help consumers make “better choices.” In the 1950s, researchers published the first studies showing the connection between smoking and lung cancer. In response, tobacco companies introduced filtered cigarettes, marketing them as healthier options, "just what the doctor ordered." But the filters were a facade: to compensate for lost flavor, companies used stronger tobacco, maintaining nicotine and tar levels similar to those of unfiltered brands.
Once again, the tech industry followed the traditional playbook by offering "digital well-being" tools. They offer us tools like Apple Screen Time, which notifies us of how much time we spend on screens. They also let us restrict time on certain apps, though we can override these restrictions. We can set our devices to “do not disturb” or “focus times.” Instagram added a feature that reminds us to take breaks. Yet screen time continues to creep up. These tools fail because, just like the “filtered cigarette,” they are not meant to solve the problem. Their goal is to shift responsibility to us as we struggle, mostly unsuccessfully, against devices and apps that manipulatively entice us to stay on.
For children, tech companies have developed specialized “digital well-being tools” known as “parental controls.” The tech industry introduced two forms of parental controls:
Device-level controls that permit parents to limit children’s time and access on phones and computers.
Platform-specific controls allowing parents to set notifications and time limits on different platforms, most commonly social media accounts.
Once again, parental controls, while helpful to some parents, largely failed to change kids’ screen time but served the tech industry’s goal of shifting responsibility, in this case to the parents of its consumers.
The Playbook, Step 3: Parent Gatekeeping Laws
Parent gatekeeping laws represent an evolved version of this responsibility-shifting strategy. Under the Parent Gatekeeping Model, some laws require parental consent for minors to use online platforms, most commonly social media. These laws vary in their application, with age cutoffs ranging from fourteen to eighteen. Other laws mandate that technology platforms enable parental supervision, including setting time limits and scheduling breaks on children's accounts. Regardless of the specific mechanism, these laws ultimately absolve tech companies of responsibility by positioning parents as the gatekeepers.
The tech industry’s primary preference remains blocking any regulatory intervention. However, if regulation becomes inevitable, the industry's second-best strategy is to push for laws that place responsibility exclusively on parents. Such laws would provide the industry with a convenient legal shield, effectively allowing it to argue that it has been absolved of responsibility because a legally mandated intermediary – the parent – now stands between the tech product and the child.
Why Fight for KOSA?
KOSA is a hybrid law: it draws from both the Tech Liability Model and the Parent Gatekeeping Model. KOSA provides parents with tools to monitor their children online. For example, it requires online platforms to let parents set their children’s privacy preferences and view their screen time on a given platform. But KOSA does not rely on parents exclusively as the gatekeepers. The heart of KOSA is a duty of care imposed on online platforms, including social media. This means platforms must exercise "reasonable care" to ensure that their design features do not cause harm, including mental health harms, addiction, and sexual exploitation. For example, KOSA would impose liability on a social media platform that uses a design feature detrimental to kids’ mental health and fails to mitigate the problem. KOSA is a powerful law because it places responsibility where it belongs: with the tech companies.
But KOSA is currently stalled in the House, with only eight days remaining in the legislative session before it dies. Now is the time for us to act. Your voice can help push this bill over the finish line by contacting your House representative using this resource from the Anxious Generation’s website.
1. Social media companies (particularly Facebook, now Meta) and video game manufacturers represented by the Entertainment Software Association (ESA) were early resisters. See Unwired, pages 81-90.
2. See Unwired, pages 81-89.
3. See Unwired, pages 81-90.
4. See Unwired, pages 53-56.
5. See Unwired, pages 150-151.
I had a meeting with my local school district to advocate for no-phone-bell-to-bell, and I was amazed at how the admins shifted blame to parents, saying “well, don’t give your kids a phone.” While I agree with that, it’s a simplified answer that isn’t helpful when classrooms use QR codes and kids are rewarded for having a smartphone by gaining popularity among classmates. Interesting to see this is a sort of playbook.
But... but... Freedom! Liberty! Parents' Rights!
Social media regulation (and other behavioral restrictions) always runs into the brick wall of John Stuart Mill's maximal individual autonomy: "my rights only stop at your nose." Mill never wrote those exact words, but the idea is the cornerstone of modern liberal philosophy and has been the underlying theology of America for at least 100 years.
Take the tobacco regulation example used in this article. People knew that tobacco was bad for you for decades, but smoking bans never went anywhere. Why? Because smokers have an inalienable right to choose to poison their own lungs. That's Mill's Harm Principle ("the only legitimate use of coercion is to prevent physical harm to others") in action. Only when activists started pushing "secondhand smoke kills" did anti-smoking laws take off -- you can't regulate human freedom in any way unless it hurts other people.
Social media is bad: for kids, for teens, for adults, for pretty much everyone. But until we demonstrate that John's use of social media is bad for Mary, we will always run up against the "but I have a right" argument from both of them.
Of course, we could exorcise the ghost of John Stuart Mill from our society and enthrone something other than "maximal individual autonomy" as our highest good. Left-wing wokeness (à la Kendi) is an attempt to do that, as is Right-wing postliberalism (à la Deneen). However, that's a long-term undertaking. In the meantime, take a lesson from the tobacco people and figure out how to talk about "secondhand Instagram."