The Surgeon General Has New Social Media Guidelines. Minnesota Already Made Some of Them Into Law.
As Minnesota’s policies gain national attention, its recent social media law is an example worth following.
Introduction from Zach Rausch and Ravi Iyer:
Last year, Ravi Iyer—our close collaborator, a former Meta research manager, and now managing director at the USC Neely Center—was invited by Minnesota Attorney General Keith Ellison’s office to help assess emerging technology’s impact on youth, with an eye toward informing state-level social media legislation. The legislation that was introduced and ultimately passed into law, championed by MN Rep. Zack Stephenson, did not include everything that Ravi advocated for, but it marked a significant victory for state-level legislation. Social media regulation bills were also successfully passed in Maryland and New York.
In the face of the gridlock that so often grips national politics, states are uniquely positioned to serve as laboratories of democracy, testing and refining policies that can later be adopted at the national level. The Minnesota law requires specific disclosures about how these platforms’ designs facilitate unwanted contact and unwanted content for many teens. It mandates disclosure of how often users can contact strangers, how algorithms are designed, and which experiments lead to better or worse experiences. As Minnesota’s policies come under increased scrutiny, particularly with Tim Walz’s VP candidacy, we hope that this law also becomes part of the national conversation.
Below is a reprint of an important Boston Globe op-ed that Ravi published on July 16th about the bill, the lessons he learned working on it, and how it might inform further efforts to regulate social media across state, federal, and international jurisdictions.
— Zach and Ravi
In June, US Surgeon General Vivek Murthy called for social media companies to take measures to better protect young people online. These measures include adding warning labels to social media platforms, limiting addictive features like push notifications and autoplay, and sharing internal data on how their products affect young people.
The good news is we won’t have to wait long to see whether Murthy’s recommended actions will make a difference. A bill just signed into law in Minnesota independently implements many of the reforms Murthy called for.
The legislation, which was introduced by state Representative Zack Stephenson and which I helped design, will require social media platforms like Facebook, Instagram, TikTok, and Snapchat to reveal the results of their user experiments, disclose how their algorithms prioritize what users see in their feeds, explain how they treat abusive actors, and report how much time people spend on these platforms, including how often they receive notifications.
Sharing internal data is particularly valuable — something I know firsthand from my time working at Facebook (now called Meta).
In the four years I worked there, I helped produce dozens of internal reports on the company’s news feed algorithms. Despite finding evidence that optimizing for engagement often increases exposure to harassment, misinformation, and graphic content, Facebook continued to prioritize engagement over content quality as part of an effort to beat rivals like TikTok.
Stephenson and I wanted the Minnesota legislation to focus on limiting the online experiences that pose the greatest harm to children and teens. Unwanted contact, especially from strangers, is chief among them. Social media platforms make it easy for bad actors to contact people they don’t know at scale and without accountability. One in eight Instagram users under 16 years old experiences unwanted sexual advances on the platform. Earlier this year, the FBI issued a public safety alert after at least 12,600 minors were the victims of “sextortion” on a range of social media sites, messaging platforms, and video games. Motivated by sexual gratification or money, individual bad actors and organized groups targeted young people, coerced them into creating and sharing explicit images, and threatened to make the images public if the minors didn’t continue producing explicit content or pay them a fee.
In a recent report I helped write, Minnesota Attorney General Keith Ellison spotlighted harrowing testimonials received by his office. One minor reported that they “received sexually explicit photos from men who added my account. I did not need to add them back to see the image they had sent me.” Another teen blocked people bullying him on Snapchat — yet “they found ways to add me to group chats” and continue the harassment.
The new legislation in Minnesota will require social media companies to disclose what limits they place on how often users can send private messages, friend requests, and group invitations to people they don’t know, and whether the design choices they make, such as allowing an adult to contact a minor with whom they share no mutual friends or contacts, encourage unwanted contact.
In one leaked study, nearly a fifth of teens saw sexually explicit content at least once a week on Instagram. AI recommendation systems optimize for clicks, likes, and time engaged rather than the quality of content. And as we found at Facebook, sometimes the most toxic content attracts the strongest responses, fueling the algorithms that show it to more people.
The Minnesota legislation, which takes effect in July 2025, will require social media companies with more than 10,000 in-state users to reveal how their algorithms decide what appears in people’s feeds, as well as how much users’ preferences affect what they see.
This new data will allow Minnesota officials to compare how well platforms are protecting young people, and the rest of us, from potentially harmful content and interactions online.
I hope every state will consider following Minnesota’s lead. Here are some lessons for other jurisdictions considering social media legislation:
First, focus on regulating algorithmic design, not just moderating content. Strangers are able to contact minors online because social media companies choose to let them. Regulating what the stranger says is content moderation. Stopping that stranger from contacting a minor is a design choice. It’s easier to stop harassment at the source than by trying to filter all the messages people exchange. Design approaches can also reduce free speech concerns, as they aren’t based on the content itself.
Second, don’t let companies decide for themselves whether their platform is safe. There have been a lot of efforts to require platforms to assess safety but fewer initiatives to establish safety standards or criteria. For example, Europe’s Digital Services Act mandates risk assessments but doesn’t specify how to carry them out. We should be requiring social media companies to demonstrate that their products adhere to independently established definitions of safety.
Third, recognize every piece of legislation as part of a long-term battle. Some provisions we included in our original draft of the Minnesota law did not survive because of industry opposition, like requiring companies to make it harder for strangers to contact minors. But we can continue to advocate for their inclusion in future bills that might more tightly focus on banning design features like infinite scroll and aggressive notifications for minors — both steps that Murthy has recommended. I’m working with people at the state, federal, and international levels to revisit the best ideas from all our efforts and develop legislative proposals for other jurisdictions.
Kids and teens are being harmed by a race to the bottom for their attention, with toxic content and harassment as collateral damage in the pursuit of profit. Minnesota’s new law is an important step toward holding social media companies accountable and a call to action for other states.