
Calif. lawmaker aims to ban addictive social media algorithms

The bill, authored by state Sen. Nancy Skinner (D), represents a renewed attempt in the state to regulate addictive social media designs and features that target youth.
FILE – In this April 23, 2019, file photo, State Sen. Nancy Skinner, D-Berkeley, chairwoman of the Senate public safety committee, displays a copy of Democratic state Sen. Anna Caballero’s police-backed law enforcement training bill during a hearing at the Capitol in Sacramento, Calif. (AP Photo/Rich Pedroncelli, File)

Social media algorithms that encourage youth to buy fentanyl, lose weight, participate in “choking challenges” or engage in self-harm would be prohibited under a proposed law in California.

The measure, authored by state Sen. Nancy Skinner (D), represents a renewed attempt in California to regulate addictive social media designs and features that target youth after a similar, but narrower, bill died last year.

The effort comes as lawmakers in multiple states consider new internet protections and restrictions for younger users amid growing concern about data privacy and the effects of social media on teenagers’ mental health.

“We know that there’s value [in social media], but there’s also harm, and algorithms can be designed to not direct harm and that’s what we want social media to take responsibility for,” Skinner told Pluribus News.

Under Skinner’s proposal, social media sites would be prohibited from deploying an algorithm, design or feature that causes youth users to become addicted to the platform. The measure would also specifically bar algorithms that direct content or messages to youth on how to die by suicide, how to purchase fentanyl, or that encourage the use of diet pills or products.

Additionally, Skinner confirmed she plans to amend the bill to include a specific prohibition on features that prompt the purchase of illegal firearms, including untraceable “ghost guns.”

Each knowing and willful violation of the proposed law would carry a maximum fine of $250,000.

The bill specifically addresses the algorithmic targeting of youth and would not impose liability on a social media company for content that is uploaded or shared by users of the site.

The proposed law has the support of Common Sense Media and the Children’s Advocacy Institute at the University of San Diego School of Law. But it has drawn tech industry opposition.

In a statement, Carl Szabo, vice president and general counsel at NetChoice, a leading tech industry trade group, compared the effort to “the same failed approach California took twenty years ago when video games were the moral panic of the day.”

“It also injects government and technology in between parents and their teenagers,” Szabo added. “And, it does little to address the underlying issues raised by social media — responsible use of technology.”

The California proposal is just one high-profile example of a wave of state legislation under consideration this year aimed at regulating how internet companies engage with younger users. To date, lawmakers in at least a dozen states have introduced children’s privacy legislation, according to tracking by the law firm Husch Blackwell.

“It’s really grown momentum this year,” said David Stauss, head of Husch Blackwell’s privacy practice. “There were a few children’s privacy bills last year, but nothing like we’ve seen with the number of bills filed this year.”

The state-level momentum has been spurred by an absence of federal action, and it signals that state lawmakers in both parties are increasingly convinced that social media is doing serious harm to teens. That concern was heightened by Covid-19, which drove a 17% spike in the amount of time tweens and teens spend each day being entertained online, according to a 2021 Common Sense Media report.

Some of the legislative efforts this year are new; others are carryovers from 2022.

For instance, Minnesota Rep. Kristin Robbins (R) recently reintroduced bipartisan legislation to bar social media sites from using algorithms to target content at users under age 18. Robbins has said her bill was inspired by a Wall Street Journal investigation into TikTok’s use of algorithms.

A bill has also been reintroduced in Connecticut to require parental permission for teens 16 and younger to open a social media account.

In Utah, Republicans have fast-tracked a pair of social media crackdown measures. The House overwhelmingly passed a bill that would prohibit addictive design features and make it easier for parents to sue social media companies for alleged harms to their children.

“We have a public health crisis on our hands today,” Rep. Jordan Teuscher (R), the prime sponsor of the House bill, said in a floor speech. “It’s social media.”

A previous version of the measure would have gone much further, requiring parental permission before a child under 18 opens a social media account. That drew opposition from the tech industry, which said age verification presents a “complex challenge” for companies.

A separate bill in the Utah Senate aims to shield younger users from targeted advertising and restrict direct messaging.

Texas Rep. Jared Patterson (R) introduced a bill to ban anyone under 18 from having a social media account.

While these bills focus specifically on social media, broader children’s internet privacy bills have been introduced this year in Massachusetts, Maryland, New Jersey, New Mexico, Oregon and West Virginia.

This trend follows passage last year of a first-in-the-nation age-appropriate design law in California. Modeled after a similar law in the United Kingdom, it requires online products that are likely to be accessed by kids to be designed with their “best interests” in mind. The California law is now under legal challenge.

But the threat of litigation is not deterring other states. On Monday, Democratic lawmakers in Maryland plan to formally unveil two age-appropriate design bills at a virtual news conference that will feature Baroness Beeban Kidron, a member of the UK House of Lords and founder of the youth privacy advocacy group 5Rights Foundation.

In Virginia, the House of Delegates this month gave near-unanimous approval to a bill sponsored by Del. Emily Brewer (R) that would, among other things, prohibit advertising targeted at youth. That bill has since stalled in the Virginia Senate, but Brewer said she is not done working on the issue.

“The bullying that happens to teenagers alone on social media is enough, having their biggest fears marketed back to them in a malicious way is too far,” Brewer said, citing concerns about youth self-harm.

Last month, New York Sen. Andrew Gounardes (D) reintroduced his Child Data Privacy and Protection Act, which he said would “protect our children from digital dangers such as adults targeting illegal drug sales to minors, revenge porn attacks, and more.”

The sweeping bill would give the state attorney general’s office the power to ban features such as autoplay, push notifications and in-app purchases. The measure would also prohibit digital advertising aimed at children.

“Let’s face the facts: no matter what steps we take as parents, our children are increasingly living their lives online. We need to recognize that and protect them there,” Gounardes said in a statement.

The issue has also drawn the attention of President Biden, who in his State of the Union speech once again called for stricter data privacy protections for children and teens. Existing federal law addresses only the privacy of users under age 13.

“It’s time to pass bipartisan legislation to stop Big Tech from collecting personal data on our kids and teenagers online,” Biden said to bipartisan applause.