Advocates overhaul youth online safety model bill

It’s designed to withstand tech industry lawsuits.

Advocates for youth online safety are gearing up for the fourth year in a row to push for state legislation that would require social media platforms and other online services to prioritize teenagers’ privacy and protections.

So-called Age-Appropriate Design Code laws borrow from regulations in the United Kingdom and generally require online platforms that children are likely to access to be designed with their best interests in mind.

California passed the nation’s first privacy-by-default, safety-by-design law in 2022. It is currently caught up in a legal challenge brought by the tech industry. Maryland Gov. Wes Moore (D) this year signed a retooled version of the law, which has so far not been challenged in court.

Bills were introduced but not passed this year in Hawaii, Illinois, Michigan, Minnesota, New Mexico, South Carolina and Vermont.

For 2025, the Kids Code Coalition’s model bill has undergone a major overhaul with the goal of further inoculating the policy against tech industry legal challenges.

“It is a more concerted effort to protect kids effectively and comprehensively while reducing vulnerability to challenges under the First Amendment,” said Thomas Jones, an attorney who played a key role in rewriting the model legislation.

Notably, the updated version — which the coalition says is still in discussion draft form — does not include a requirement that online services conduct a Data Protection Impact Assessment to assess their compliance with the law. But the draft would require annual independent third-party audits of platforms, with the goal of providing transparency into how they engage young users.

The discussion draft is designed to be more flexible so that lawmakers can add and subtract sections based on what they view as their state’s needs. It also places more specific requirements on companies.

“It’s a more complete list of the prescriptive protections,” Jones said.

The effort to pass Age-Appropriate Design Code laws is part of a broader trend of state lawmakers from both parties seeking to regulate social media and other online spaces with the goal of protecting kids. It’s a response to growing rates of depression, anxiety and suicide among teens.

“We’re going to see a lot of different bills across the country [in 2025], and this is one that balances empowering roles for both kids and parents — that’s always been what we’ve been about,” said Marjorie Connolly, a spokesperson for the Kids Code Coalition.

The model bill for 2025 aims to prevent compulsive use of online services by teens and to protect them from online harms, including by mandating higher privacy settings by default.

Platforms would have to provide tools that make it easy to limit the amount of time a user can interact with the service each day, restrict who can message them on the platform, and prevent others from seeing the user’s personal data or their precise geolocation.

They would also have to grant users the ability to disable design features that are not necessary for the service to function, such as infinite scroll, notifications, push alerts, in-game purchases and appearance-altering filters.

On social media sites, teens could request a chronological feed instead of an algorithmic feed. They could also opt out of receiving certain categories of content. Push notifications would be prohibited during the school day and overnight — something California and New York mandated this year.

The draft legislation would require companies to provide parents with tools to support an “age-appropriate” experience for their children, including the ability to view and change privacy and account settings for kids under 13 and restrict purchases for teens up to 18. Parents would also be able to set daily time limits for children up to age 13.

Several of the protections in the discussion draft are borrowed from social media regulations state lawmakers passed in recent years and from changes that leading platforms made after the UK regulations took effect in 2020. They also mirror some of the steps social media companies have recently taken to provide teen users with a more protected online experience.

The bill targets companies that earn more than $25 million in annual gross revenues or that trade in the data of at least 50,000 consumers or that earn at least 50% of revenues from personal data sales. State attorneys general would enforce the law. 

While not explicitly called out in the legislation, the coalition says it would also apply to emerging artificial intelligence technologies such as companion chatbots, which have been linked to a Florida teen’s suicide and are the subject of recent lawsuits.

Despite the changes to the model bill, tech industry trade group NetChoice criticized the Age-Appropriate Design Code approach and advocated for more enforcement of internet-based crimes and online safety education. 

“NetChoice will continue in 2025 to fight for meaningful solutions that will actually help kids be safer online — like properly funding law enforcement to stop predators and cybercriminals that are targeting children, pushing for digital safety educational programs, and more,” NetChoice spokesperson Krista Chavez said. “We will continue to push back against proposals that undermine data security and violate Americans’ rights to access information online and engage in free, protected speech. An unconstitutional law protects no one.”

NetChoice is leading the challenge to California’s Age-Appropriate Design Code and has won injunctions to block several other social media laws around the country on First Amendment grounds. It is currently seeking an injunction to keep a new California law barring addictive social media feeds from taking effect on Jan. 1. New York lawmakers passed a similar law this year. 

But the slew of lawsuits has not deterred state lawmakers. The sponsors of Age-Appropriate Design Code bills in Michigan, Minnesota and Vermont have already signaled they will make another attempt next year.

“Children’s online safety isn’t optional,” Vermont Rep. Monique Priestley (D) said. “Platforms must be designed with kids’ best interests at heart – fostering safe spaces and protecting them from exploitation.”

Minnesota Rep. Kristin Bahner (D), who has tried the past two years to pass an Age-Appropriate Design Code bill, said she is “all in on the bill.”

“We will be back [in 2025], and there is some good momentum building among the advocates,” Bahner said.