State lawmakers are pushing to regulate the tech industry over children’s online privacy, an issue likely to gain steam in legislatures across the country as they reconvene next year.
The wave of potential youth digital privacy laws in the states is coming in the absence of federal legislation and with growing concern about how kids are tracked online and the effects of social media on their mental health.
California bolted ahead of other states this summer when lawmakers unanimously passed the California Age-Appropriate Design Code Act. Modeled after a similar law in the United Kingdom, it requires companies whose online products are likely to be accessed by kids to “consider the best interests of children” when designing their services or features.
“We are living in a time of [the] wild, wild West on this issue,” said Assemblymember Buffy Wicks (D), the primary author of the new California law. “We need more regulation around … how and what kids access online, especially our younger kids.”
Assemblymember Jordan Cunningham (R) was the measure’s joint author.
The law, which takes effect in 2024, requires companies to set higher privacy settings by default and places restrictions on the collection of personal information of users under 18 years old, including their precise geolocation.
The tech industry opposed the final bill, despite working with the sponsors on several amendments. The law has some “really good aspects” but is burdensome, said Dylan Hoffman, executive director for California and the Southwest at TechNet, an industry group whose members include Google, Meta and Snap Inc.
“The real concern and risk that this bill presents and one that … I think has the possibility of being litigated in the courts is whether this creates a chilling effect on lawful speech,” Hoffman said.
The trade group NetChoice takes an even harder line.
“The advocates may be well-intentioned,” said Jennifer Huddleston, NetChoice’s policy counsel. “[But] ultimately it should be parents and other trusted adults that are teaching young people how to safely navigate the internet and helping them make good choices.”
Still, California’s age-appropriate design concept may be used in other states. The London-based 5Rights Foundation, which played a key role in passage of California’s law, has developed model legislation that it hopes other states will take up in 2023.
“We have a lot of momentum,” said Nichole Rocha, 5Rights’ director of U.S. affairs. “This was a really significant thing that happened in California, and I think there is pretty universal and bipartisan agreement throughout the country right now that something needs to be done to protect youth online.”
Rocha said 5Rights has so far been in touch with lawmakers in New York, New Jersey and Utah.
New York state Sen. Andrew Gounardes (D) said that while his own child data privacy bill, introduced in September, is not an exact replica, he looked to California’s law for inspiration as he crafted it. Gounardes said his bill would put “commonsense guardrails” in place to better protect kids who often spend hours each day online.
Among other things, Gounardes’ bill would prohibit digital advertising aimed at kids and grant the Attorney General’s Bureau of Internet and Technology the authority to ban features such as video autoplay and push notifications.
“Every day that we delay is a day that kids are left vulnerable and they’re left exposed and they’re put in harm’s way. People have died because of social media,” Gounardes said.
In Minnesota, state Rep. Kristin Robbins (R) took a narrower approach earlier this year with a bill she said was prompted by a Wall Street Journal investigation into the social media app TikTok. The report from last December highlighted concerns among health professionals that TikTok’s algorithms were contributing to a spike in eating disorders among young girls.
“I literally couldn’t sleep for a couple nights,” Robbins said. “It was shocking to me how much these kids are driven to sites that they’re not even necessarily looking for by these algorithms.”
Robbins’ measure, which died after passing out of committee, would have prohibited social media sites with more than 1 million account holders from using algorithms to target content at users under age 18.
“As a mom I wish I was actually doing a bit more, but I just feel like we have to start with a narrow approach that really gets at the main harm, which is targeting kids under 18,” Robbins said.
Unlike California’s law, the proposals in Minnesota and New York would allow individuals to bring lawsuits against companies for alleged violations — what is known as a private right of action.
Robbins said she believes the threat of a lawsuit is the linchpin to getting companies to comply.
“I’m a rock-solid conservative Republican,” Robbins said. “I do not like class action lawsuits, I do not like regulating private companies for the most part, but this is in my mind the only tool that will and has gotten their attention to get them to stop it.”
Robbins said her bill died following a tech industry lobbying blitz. She intends to try again next year.
As momentum grows for legislation in the states, several proposals to beef up online protections for kids have languished in Congress. They include the Kids Online Safety Act introduced by U.S. Sen. Richard Blumenthal (D-Conn.) and the PROTECT Kids Act sponsored by Rep. Tim Walberg (R-Mich.).
“While Congress continues to not be able to take action here, it’s really incumbent on us at the state level to do everything we can to protect these kids,” Gounardes said.
But the state-by-state approach concerns the tech industry, which would “prefer a federal solution rather than a patchwork,” TechNet’s Hoffman said.
While state legislators may face tough opposition from tech companies, Wicks said digital privacy and the internet’s effect on kids is one issue that can unite Democrats and Republicans.
“Because it has bipartisan support, it means a version of this bill could be run in Republican-controlled legislatures too, so it’s not hinged on party ideology,” Wicks said.
A separate bill co-sponsored by Wicks and Cunningham, which did not pass, would have allowed social media companies to be held liable for implementing features that are addictive to kids.