California Forges Ahead With Social Media Rules Despite Legal Barriers

California lawmakers are pursuing legislation aimed at protecting children from the dangers of social media, one of many efforts around the country to confront what U.S. Surgeon General Vivek Murthy and other public health experts say is a mental health emergency among young people.

But California’s efforts, like those in other states, will likely face the same legal challenges that have thwarted previous legislative attempts to regulate social media. The tech industry has successfully argued that rules governing how social media platforms operate, and how people use them, violate the free speech rights of the companies and their customers.

A previous effort at confronting the issue, the California Age-Appropriate Design Code Act in 2022, now rests with the U.S. Court of Appeals for the 9th Circuit. A tech trade association sued to block the law and won an injunction from a lower court, largely on First Amendment grounds. The appeals court heard oral arguments in the case on July 17.

“At the end of the day, unconstitutional law protects zero children,” said Carl Szabo, vice president and general counsel for NetChoice, which argued for the tech giants before the federal appellate court.

Like the design code act, the two proposals now working their way through the California Legislature would reshape the way social media users under 18 interact with the services.

The first bill, by state Sen. Nancy Skinner (D-Berkeley), prohibits sending push notifications to children at night and during school hours. Skinner’s measure also requires parental permission before platforms can serve minors algorithmically curated feeds, which surface content children didn’t ask for but that is designed to keep them on their phones longer, instead of the traditional chronological feeds of the accounts they follow.

The second measure, by Assemblymember Buffy Wicks (D-Oakland), would amend California’s privacy laws to prohibit businesses from collecting, using, selling, or sharing data on minors without their informed consent — or, for those under 13, without their parents’ approval.

Both bills have bipartisan support and are backed by state Attorney General Rob Bonta. “We need to act now to protect our children,” Bonta said earlier this year, by “strengthening data privacy protections for minors and safeguarding youth against social media addiction.”

California Gov. Gavin Newsom, a Democrat, has been vocal about youth and social media, too, and recently called for a statewide ban on cellphones in schools. His positions on the two social media proposals are not yet known. “But I think the governor, like most every other Californian, is concerned about the harms of social media on kids,” Skinner said.

California’s efforts are especially significant because its influence as the most populous state often results in its setting standards that are then adopted by other states. Also, some of the big tech companies that would be most affected by the laws, including Meta, Apple, Snap, and Alphabet, the parent company of Google, are headquartered in the state.

“Parents are demanding this. That’s why you see Democrats and Republicans working together,” said Wicks, who with a Republican colleague co-authored the design code act that is tied up in litigation. “Regulation is coming, and we won’t stop until we can keep our kids safe online.”

The fate of the design code act stands as a cautionary tale. Passed without a dissenting vote, the law would set strict limits on data collection from minors and order privacy settings for children to default to their highest levels.

NetChoice, which immediately sued to block the law, has prevailed in similar cases in Ohio, Arkansas, and Mississippi. It is challenging legislation in Utah that was rewritten after NetChoice sued over the original version. And NetChoice’s lawyers argued before the U.S. Supreme Court that efforts in Texas and Florida to regulate social media content were unconstitutional. Those cases were remanded to lower courts for further review.

Though the particulars differ in each state, the bottom line is the same: Each of the laws has been stifled by an injunction, and none has taken effect.

“When you look at these sweeping laws like the California laws, they’re ambitious and I applaud them,” said Nancy Costello, a clinical law professor at Michigan State University and the director of the school’s First Amendment Clinic. “But the bigger and broader the law is, the greater chance that there will be a First Amendment violation found by the courts.”

The harmful effects of social media on children are well established. An advisory from Surgeon General Murthy last year warned of a “profound risk of harm” to young people, noting that a study of adolescents ages 12 to 15 found that those who spent more than three hours a day on social media faced twice the risk of depression and anxiety compared with nonusers. A Gallup survey in 2023 found that U.S. teenagers spent nearly five hours a day on social media.

In June, Murthy called for warnings on social media platforms like those on tobacco products. Later that month came Newsom’s call to severely restrict the use of smartphones during the school day in California. Legislation to codify Newsom’s proposal is working its way through the state Assembly.

Federal legislation has been slow to materialize. A bipartisan bill to limit algorithm-derived feeds and keep children under 13 off social media was introduced in May, but Congress has done little to meaningfully rein in tech platforms — despite Meta’s chief executive, Mark Zuckerberg, apologizing in a U.S. Senate hearing for “the types of things that your families have had to suffer” because of social media harms.

It remains unclear what kinds of regulation the courts will permit. NetChoice has argued that many proposed social media regulations amount to the government dictating how privately owned firms set their editorial rules, in violation of the First Amendment. The industry also leans on Section 230 of the 1996 Communications Decency Act, which shields tech companies from liability for harmful content produced by a third party.

“We’re hoping lawmakers will realize that as much as you may want to, you can’t end-around the Constitution,” said Szabo, the NetChoice attorney. “The government is not a substitute for parents.”

Skinner tried and failed last year to pass legislation holding tech companies accountable for targeting children with harmful content. This year’s measure, which was overwhelmingly passed by the California Senate and is pending in the state Assembly, would bar tech companies from sending social media notifications to children between midnight and 6 a.m. every day, and between 8 a.m. and 3 p.m. on school days. The bill also calls for platforms to require minors to obtain parental consent to use their core offerings, and would limit minors’ use to between one hour and 90 minutes a day by default.

“If the private sector is not willing to modify their product in a way that makes it safe for Californians, then we have to require them to,” Skinner said, adding that parts of her proposal are standard practice in the European Union.

“Social media has already accommodated users in many parts of the world, but not the U.S.,” she said. “They can do it. They’ve chosen not to.”

Wicks, meanwhile, said she considers her data bill to be about consumer protection, not speech. The proposal would close a loophole in the California Electronic Communications Privacy Act to prevent social media platforms from collecting and sharing information on anyone under 18 unless they opt in. The Assembly approved Wicks’ measure without dissent, sending it to the state Senate for consideration.

Costello suggested that focusing the proposals more narrowly might give them a better chance of surviving court challenges. She is part of an effort coordinated by Harvard’s T.H. Chan School of Public Health to write model legislation that would require third-party assessments of the risks posed by the algorithms used by social media apps.

“It means that we’re not restricting content, we’re measuring harms,” Costello said. Once the harms are documented, the results would be publicly available and could lead state attorneys general to take legal action. Government agencies adopted a similar approach against tobacco companies in the 1990s, suing for deceptive advertising or business practices.

Szabo said NetChoice has worked with states to enact what he called “constitutional and commonsense laws,” citing measures in Virginia and Florida that would mandate digital education in school. “There is a role for government,” Szabo said. (The Florida measure failed.)

But with little momentum on actual regulation at the national level, state legislators continue to try to fill the vacuum. New York recently passed legislation similar to Skinner’s, which the state senator said was an encouraging sign.

Will NetChoice race for an injunction in New York? “We are having lots of conversations about it,” Szabo said.

This article was produced by KFF Health News, which publishes California Healthline, an editorially independent service of the California Health Care Foundation. 
