The authoritative news source for communications regulation
‘Troubled’

Calif. DOJ Argues Social Media Design Law Isn’t Speech-Related

California’s age-appropriate design law doesn’t violate the First Amendment because it regulates social media data practices, not content, the office of Attorney General Rob Bonta (D) argued Wednesday before the 9th U.S. Circuit Court of Appeals. The court’s three-judge panel suggested the First Amendment applies.

The panel heard oral argument in NetChoice v. Bonta. The Age-Appropriate Design Code Act (AB-2273) requires that social media companies with child users follow “age-appropriate” design principles to shield children from online harms (see 2209150070). The U.S. District Court for the Northern District of California in September granted NetChoice's request for a preliminary injunction against the law. The lower court ruled the state has “no right to enforce obligations that would essentially press private companies into service as government censors, thus violating the First Amendment by proxy.” Bonta appealed.

AB-2273's objective is to prevent companies from using data practices that result in exposing children to harmful content and activities, Deputy Attorney General Kristin Liska said during Wednesday's argument. For example, she said, companies use data to deliver to children targeted ads for illegal tobacco products, or they use data to deliver content promoting eating disorders.

Judge Milan Smith countered that if California forces a company to determine what’s harmful, it amounts to “compelled speech” and raises First Amendment issues. Moreover, content about climate change and war could potentially disturb children, but it has merit, he said. “You’re asking these companies to have an opinion about these things and express it,” the judge said. “You’re asking them to make a determination, and that’s what I’m troubled about.”

AB-2273 asks that companies make subjective determinations, not just conduct “high-level data analysis,” Judge Mark Bennett said. Judge Anthony Johnstone agreed, suggesting that while climate change- or war-related content could be harmful, it’s First Amendment-protected.

Liska argued that determining what’s harmful isn't the concern. The law directs enforcers to look at how data is used to expose children to this “universe” of harm, she said: “Everyone” would agree there’s harmful content and online contacts for children.

“Are you confessing that the state is trying to regulate content?” Smith asked.

“No, your honor,” she responded. “We’re concerned about the data practices.”

An attorney arguing for NetChoice described AB-2273 as a “speech regulation masquerading as a privacy law.” Robert Corn-Revere, chief counsel at the Foundation for Individual Rights and Expression, said the thrust of the measure is to regulate speech. He noted the district court found that more than a dozen of the law’s provisions violate the First Amendment.

The three-judge panel discussed the possibility of remanding the case to the district court, given complications with NetChoice’s facial challenge. A facial challenge alleges a statute is unconstitutional in all applications. An as-applied challenge limits a claim to specific online activities, allowing a court to weigh the constitutionality of specific online functions. This was key in the U.S. Supreme Court’s recent decision to remand the tech industry’s lawsuits against Florida and Texas social media laws to the lower courts (see 2407010053). Justice Elena Kagan’s majority opinion found the 11th U.S. Circuit Court of Appeals and the 5th Circuit failed to conduct a “proper analysis” of the tech industry’s facial First Amendment challenges. Smith, Bennett and Johnstone suggested they might need to reach a similar determination.

Corn-Revere said that if the case is remanded, the court should uphold the injunction due to the irreparable First Amendment harms associated with the law. Liska argued the district court failed to conduct a section-by-section facial analysis of AB-2273 in a decision that preceded the Supreme Court remand. The high court in its Moody decision made clear that in a facial challenge, the court must examine “what activities by what actors” are prohibited under the regulation. In a law this complex, it’s necessary to look at each separate provision, Liska said.