Disruption

AGs push for laws to regulate social media, protect kids

They’ve also filed lawsuits against companies.
Washington Attorney General Nick Brown (D) speaks during a news conference Tuesday, Jan. 21, 2025, in Seattle. (AP Photo/Lindsey Wasson)

Attorneys general in several states are spearheading legislation to regulate social media platforms to protect kids.

Officials in California, Connecticut, Minnesota, Nebraska, Nevada and Washington have introduced or recommended bills this year.

As their lawsuits against social media companies wend their way through the courts, these chief legal officers are increasingly looking to shape state legislation as well, especially after their pleas for Congress to act have gone unanswered.

“We need to give parents back some control, and families can’t afford to wait for the federal government or the tech giants to do it,” Connecticut Attorney General William Tong (D) said in a statement this week announcing bipartisan legislation to address social media addiction.

Tong’s bill would require parental permission for minors to access algorithm-fed social media feeds, heightened privacy settings for youth accounts, a one-hour daily usage limit, and a ban on overnight notifications. The measure is modeled on laws in California, New York and Utah.

California’s laws also informed legislation requested by newly sworn-in Washington State Attorney General Nick Brown (D). His bill borrows elements from two pioneering California statutes: the Age-Appropriate Design Code passed in 2022 and last year’s Protecting Our Kids from Social Media Addiction Act.

Both laws are tied up in court due to industry challenges. Brown’s bill was crafted with that in mind. For example, rather than require platforms to determine a user’s age — which the tech industry and groups including the ACLU have challenged as unconstitutional — it would instruct companies to estimate a minor’s age “with a reasonable level of certainty.”

The bill would also require online platforms to provide higher privacy and data protection to minor users, bar “addictive feeds” for teenagers, and restrict notifications to youth during the school day and overnight unless a parent allows otherwise.

“We must prioritize protecting the privacy and mental health of young Washingtonians,” Brown said in a statement. “The harms of addictive social media use are complex for young people, which is why we are working with the Legislature to ensure businesses are accountable and responsive to those impacts.”

Nevada Attorney General Aaron Ford (D), who in January 2024 sued Meta, TikTok and Snap for alleged harms to youth, introduced the Nevada Youth Online Safety Act. It would require age verification for social media platforms, bar access for minors under 13, and require parental permission for those 13 to 17 to have a social media account. Parents could revoke their consent at any time. It shares some similarities with a Florida law passed last year that is now under legal challenge.

Other elements of the Nevada bill would prohibit a platform from using a minor’s data to offer personalized content and instruct platforms to disable infinite scroll, auto-play, live streaming and “likes” for teen accounts. It would also block notifications during the school day and overnight and allow parents to expand those blackout periods to other times of day.

“Plenty of research has shown the detrimental effects of social media on our youth, and it was important to me to bring forward litigation that would address these impacts and provide parents with the tools they need to make sure their children grow up as happy and as healthy as possible,” Ford said in a statement to Pluribus News.

California Attorney General Rob Bonta (D), who last year sponsored the social media addiction bill, is backing legislation this year to require warning labels on social media. Bonta was among 42 state and territorial attorneys general who signed a letter in September urging Congress to pass a bill that would require a U.S. Surgeon General warning on algorithm-fed social media.

Minnesota Attorney General Keith Ellison (D) this month released his second annual report on the risk social media and AI pose to young people. The report called for legislators to ban “deceptive” features such as infinite scroll and auto-play, require higher privacy settings by default, and place limits on algorithms designed to keep users scrolling.

“We must continue to establish reasonable guardrails to protect young people online and prevent tech companies from sacrificing the well-being of children and teenagers just to turn a profit,” Ellison said in a statement announcing the report.

On the Republican side, Nebraska Attorney General Mike Hilgers (R) has joined with Gov. Jim Pillen (R) to back four bills this year related to youth online well-being, including Age-Appropriate Design Code legislation and a requirement that youth get their parents’ permission to have a social media account.

“One of the top priorities for our office is protecting children. And right now, some of the biggest threats to children originate online, through cell phones, social media, and child sexual abuse material,” Hilgers said in a statement last month. “I am proud to work with Governor Pillen and senators to enact laws that hold those perpetrating harm against our children accountable.”

State efforts to regulate social media to protect kids have accelerated over the past two years. The tech industry has responded by heavily lobbying in statehouses and, when laws are enacted, suing to overturn them.

Industry group NetChoice has led the litigation efforts, winning injunctions in multiple states to block kids’ online safety laws from taking effect.

The litigation is likely to continue if any of the attorney general-backed bills are passed. NetChoice has already put Connecticut on notice.

“Not only is the proposal legally questionable, but it also overlooks the crucial role algorithms play in safeguarding children online,” Amy Bos, NetChoice’s director of state and federal affairs, said in a statement. “Instead of pursuing this flawed approach, Connecticut should focus on more effective strategies, such as promoting digital literacy education and equipping parents with the tools and resources they need to better guide their children’s online experiences.”