Why Every Country Should Set 16 (or Higher) as the Minimum Age for Social Media Accounts
Four features of strong age-limit policies for countries ready to follow Australia’s brave lead
The biggest news of 2025 regarding kids’ online safety was Australia’s new social media age-limit law, which set the minimum age for opening or maintaining a social media account at 16. The second-biggest news? As Australia’s law went into effect, a global chorus of parents, journalists, and political leaders stood up, applauded the bold move, and asked, “Can we do that, too?”
Bloomberg, in an article titled “TikTok, Instagram Ban for Australian Kids Heralds Global Curbs,” lists the countries in which such legislation has been, or soon will be, introduced.
The idea is spreading, and each nation considering such a policy should ask two important questions:
Should the age be 16, as in Australia, or should it be 15, as might become the case in France?
Should there be an option for parents to give consent for adolescents below that age?
The correct answers: 16, and no.
Here’s why:
We must protect puberty, and 15 is still puberty
I devoted an entire chapter of The Anxious Generation to puberty because it is such a crucial period of brain re-wiring and identity formation. Developmental psychologists see puberty as a “sensitive period” in which the brain is especially “plastic” or malleable based on incoming experience. The brain is changing over from the child form to the adult form, and those changes are guided by whatever a child does repeatedly. Neurons that fire together wire together, as brain researchers say.
The average adolescent in the U.S. now spends around five hours a day using social media. Their brains will strengthen whichever neurons and circuits are activated repeatedly, at the expense of neurons and circuits that are underused. This brain sculpting happens throughout childhood and continues in the prefrontal cortex until around age 25. But in the earlier part of adolescence — specifically puberty — the sculpting is more intense, and the changes are more likely to be permanent.
The age range of puberty varies across cultures and historical eras, but in modern developed nations it generally begins between ages 8 and 13 for girls, and a year or two later for boys. By almost any measure, the median boy and the median girl are still in puberty on their 15th birthday. Most are still getting taller. Their secondary sex characteristics are still changing. Large population studies of American and European teens show that the median girl reaches Tanner stage 5 (the last stage of pubertal development) between 15 and 16, while the median boy reaches stage 5 around 16 or 17, with wide variation for both sexes. The ability to self-regulate improves steadily throughout adolescence, only reaching a plateau in the mid 20s.
In other words: Half or more of all girls are still in puberty on their 15th birthday, and half or more of all boys are still in puberty on their 16th birthday. This is a major reason why 16 is a much better choice for a minimum age than 15. (Of course, 18 would be even better than 16, but we nominated 16 as the norm in The Anxious Generation because our goal was to pick the highest age that we thought could actually get enacted across many jurisdictions.)
Puberty is the period when parents should be most careful about how their children spend their time and who (or what) is influencing their developing brains and identities. Traditionally, human societies helped children make the jump from child to adult during this crucial period, with rites of passage in which trusted, non-parental adults guided them through challenges, hardships, and lessons.
But what do we do in Western nations? We generally mark the beginning of puberty by giving kids smartphones (average age in the U.S. is around 11 or 12). We then outsource their social and neural development to Instagram, TikTok, and YouTube. The results have been catastrophic for their mental health, social relationships, education, and ability to focus for more than a few minutes.
So does it matter whether the age cutoff is 15, rather than 16? Yes. Puberty is the time when social media is likely to do the most damage, and most adolescents, including the large majority of boys, are still in puberty at 15. A 2022 paper by Orben and her colleagues even found that there was a peak “developmental window of sensitivity to social media,” such that heavy use by boys at ages 14 and 15 most strongly predicted decreased life satisfaction a year later. (For girls the peak sensitivity was ages 11 through 13).
Sixteen may feel like a more obvious or natural choice for the age of “digital adulthood” in the U.S. because the minimum age for a driver’s license is 16 in most states (though it is higher in most other countries). Similarly, people in some European countries may see 15 as an obvious or natural choice because that is the local age of consent, the age at which adolescents can legally engage in sexual activity.
But the fact remains: Any nation that sets 15 as the minimum age rather than 16 will condemn its children to an extra year of brain-sculpting by social media at a time when their brains are still highly sculptable. It will also greatly increase the risk of exposure to pornography, sextortion, online cruelty, and other risky interactions with anonymous strangers at an age when teens have less ability to self-regulate or know what is safe.
Parental-consent exceptions put parents right back into the trap
Parents everywhere have heard their children invoke the mantra “but everyone else has one! I’m being left out!” in their daily struggles over smartphones, tablets, social media, video games, and other screen-based activities. And the kids are largely correct. Now that almost everyone else has one, everyone feels that they, too, have to have one. That’s a perfect example of what economists call a collective action trap, where everyone ends up doing something sub-optimal because if they were the only one to choose the better action, they’d actually end up even worse off.
The way to escape from a collective action trap is collectively. If most families only give basic phones before age 14, then no 13-year-old can say “but I’m the only one who doesn’t have an iPhone!” If most families wait until 16 before allowing their kids to open social media accounts, that would also reduce the pressure on everyone younger than that to open a social media account.
But while parents can choose the age at which their child gets a phone, no parent has full control over when their child opens social media accounts. If the child can get to the internet anywhere, including at school, she can open as many accounts as she likes as long as she’s old enough to say she’s 13.
This is why parents need help from their governments, and from the platforms (which have shown repeatedly that they will not protect children unless forced to by law). This is why the Australian law is so important: It delays the struggle over social media until the age of 16.
Any country that adds in a provision for parental consent at younger ages plunges everyone back into the collective action trap. We’re right back to “But all of my friends’ parents gave them permission!”1
Simple uniform laws are more effective than a variety of complicated ones
A third consideration is that simple rules are generally best for a complex world, as legal scholars Richard Epstein and Philip Howard have long argued. People understand and remember them. They are easier to enforce. And for social media — by its nature international and placeless — a patchwork quilt of different age and parental-permission rules means that underage kids could (and many will) use a VPN to find a country in which they can easily open a social media account.
As a bonus, a simple and widespread age limit of 16 would be much easier for social media platforms to enforce effectively. They don’t want different rules across different countries. If we make things easy for them, they’ll be more effective at enforcing the law.
As an additional bonus, large majorities of parents, and adults more broadly, say in surveys that they support laws that set an age limit for opening social media accounts. See findings from the U.S., from Australia, from the UK, France, and Germany (twice). Any politician who gets out in front of this issue will find voters from right, left, and center standing up and applauding.
At the beginning of 2025, we worked with partners to establish five principles for effective phone-free school legislation. The model provided clarity about the choices to be made, and we have been thrilled to see many countries and U.S. states adopt policies that follow these recommendations, with wide success across many jurisdictions.
So, as countries and states consider following Australia’s lead in 2026, we want to offer a similar set of features for an ideal social media age-limit policy.
Four Recommended Features of Age Limit Policies
Feature 1: Set the minimum age at 16 or higher
As discussed previously, protecting children during puberty is essential. Social media is wildly inappropriate for children and younger adolescents. One internal Instagram study found that, within a single seven-day window, 11% of 13–15 year olds reported being a target of bullying, 13% reported receiving an unwanted sexual advance, 19% reported seeing unwanted sexually explicit content, and 21% reported seeing posts that made them feel worse about themselves. In a recent Pew poll, 45% of teens said that they themselves felt they used social media too much, with many suggesting that it affects their sleep and their grades. Fifteen is too young. A 15+ rule also undercuts the power of the norm in countries that set the age at 16.
Feature 2: Do not make exceptions for parental consent
Don’t make parents’ jobs even harder by giving their kids one more thing to beg for. Setting a single, clear age minimum with no loopholes does parents a favor. Imagine if every child could plead with their parents for a driver’s license at any age. Governments routinely set minimum ages for products and activities that could harm or exploit children, such as driving a car, signing up for a credit card, or drinking alcohol; social media is no different. Keep the policy simple and uniform for everyone, and make parenting a little bit easier for us all.
Feature 3: Focus on account creation, not access to content
Setting age-limit policies based on content (what kinds of things kids see) prompts never-ending debates about what content is inappropriate for children (e.g., what counts as too sexually explicit? how violent is too violent?). It can also lead to charges of content-based or viewpoint-based censorship. This is why we recommend orienting the law not around content but around the age at which minors can sign contracts with companies in which they agree to give away personal data and expose themselves fully to the company’s addictive algorithms, without their parents’ knowledge or consent.

We think it is important to allow logged-off access to content, as Australia’s policy does. Children under 16 can still search sites such as YouTube for whatever content they want. They can easily view any video that a teacher assigns or a friend recommends. But if they do not have an account and have not signed a contract with the company, then they cannot compare the popularity of pictures of themselves, receive tailored late-night notifications, be served more and more extreme content, or be contacted by strangers via messaging. Without this inappropriate business relationship and access to the extensive data they currently collect from kids, companies will find it much harder to train algorithms and use design features to manipulate and exploit kids.
Feature 4: Define “social media” in terms of design features
While several platforms are currently the obvious targets of legislation because of the outsized role they play in kids’ lives, any definition of “social media” will inevitably invite challenges by companies who host different activities and have varied feature sets. By focusing on the design features that cause harm, we can capture video-game platforms that facilitate adult/minor solicitation and video-hosting platforms that maximize engagement through algorithms. Platforms that do not need these potentially harmful features will want to avoid the increased regulation and risk, and will therefore have an incentive to keep them out.
Conclusion
We would love to take credit for the success of phone-free school policies, and for the growing international interest in social media age-limit policies. The reality is, though, that these policies and ideas succeed because they address something that most parents, teachers, and children experience every day: the technology-facilitated manipulation of one of our most precious resources — the time and attention of our kids. Policymakers, parents, and kids themselves are fighting back. In fact, on the New York Times tech podcast Hard Fork, co-host Casey Newton offered his number-one prediction for a 2026 tech trend:
Sixteen plus becomes the new norm for social media accounts worldwide… by the end of 2026 we’ve seen at least five other democracies introduce similar rules.
Bravo to Australia, and bravo to the five (or more) countries that will turn Australia’s bold move into a new international standard. Let’s make 16+ the standard around the globe.
1. These parental consent laws are also necessarily complicated, because it is an inherently difficult task for a company to link two people together and obtain parental permission in a reliable way. These complications create a maze of workarounds children can use to make it seem as though they have parental permission when they do not.




