FCC and FTC Action on Section 230 Seems Inevitable, Despite Unclear Authority
The FCC and FTC are moving to rein in what they see as overly broad applications of Section 230 of the Communications Decency Act and to reverse what their agency leaders call censorship by social media platforms. Agency watchers said they expect the FCC to issue an advisory opinion soon, though some see such an opinion as more performative than practical. FCC Chairman Brendan Carr has repeatedly said that addressing "the censorship cartel" is one of the agency's priorities (see 2411210028). His office and the FTC didn't comment. FCC Commissioner Anna Gomez has been critical of the possibility of a Section 230 advisory opinion (see 2502240062).
The FCC's general counsel issued an opinion in 2020 that the agency had the authority under the Communications Act to issue a rulemaking clarifying the scope of the immunity shield (see 2010210062). Given legal developments since then, such as the U.S. Supreme Court's Loper Bright decision curbing the deference that courts give expert agencies, and President Donald Trump's executive order barring the federal government from pressing social media platforms to moderate content, "the law has changed dramatically," said Phoenix Center President Lawrence Spiwak. While Carr and FTC Chairman Andrew Ferguson seem determined to pursue Section 230-related issues, "there are not a lot of [legal] tools left in the toolbox," Spiwak said. A formal FCC rulemaking probably isn't legally supportable, but an advisory opinion could carry weight in future court decisions, he added: “The imprimatur of a government agency shouldn't be discounted.”
Spiwak said the FTC's tools, including the use of unfair practices and antitrust law, face their own challenges. There's no evidence that's been made public of collusion among social media companies, and claims of advertising boycotts of platforms have to overcome the fact that there's no duty to deal in antitrust law, he said. But the very threat of the FTC bringing a case is powerful, as companies would then have to spend resources defending themselves.
At the end of the first Trump administration, NTIA petitioned the FCC to start a rulemaking on Section 230. Daniel Cochrane, a senior research associate at the Heritage Foundation's Tech Policy Center, told us it's "a matter of when, not if" the agency takes up the issue again. Less clear is whether the agency will go the advisory opinion route or start a rulemaking proceeding, potentially picking up where it left off, he said.
Authority Questions
Carr and Ferguson insist the Communications Act and FTC Act, respectively, give their agencies the tools they need.
At an event convened earlier this month by DOJ's Antitrust Division, Carr said that in cases where platforms are "censor[ing] outside whatever the right you had to do in the First Amendment ... that's where the FCC can step in and clarify and say 230 does not grant you the sweep of immunity that courts have incorrectly read into 230." He added that he sees the agency using its Communications Act authority "to require much more transparency about how these decisions are being made."
At the DOJ event, Ferguson said SCOTUS' 2024 decision affirming social media platforms' ability to moderate content (see 2407010053) "makes the Communications Act and the Federal Trade Commission Act super important, because they aren't speech laws directly." The FTC Act, in particular, is more about regulating the power that might let platforms decide who gets to speak, he said. Ferguson said the agency's February request for information from people who have been banned, shadowbanned or demonetized by social media platforms is especially looking for instances that run contrary to the platforms' terms of service or community standards. "That's exactly what the FTC Act generally forbids -- a business from telling consumers one thing and doing something else."
Traditionally, the FCC's role has revolved around technical decisions, University of Massachusetts Amherst associate professor Ethan Zuckerman told us. But promises to interpret Section 230 very differently must be taken seriously, and it's hard to know how a court might look at that interpretation in a legal challenge, said Zuckerman, who teaches public policy, information and communication.
Eric Goldman, co-director of Santa Clara University's High Tech Law Institute, said the FCC would be "out of their wheelhouse" getting involved in Section 230 issues. But he said he expects the agency to make extensive performative efforts regardless, and any interpretation or policy statement would inevitably be cited by some judges as authority as they look to disregard Section 230 precedent. Goldman said the FTC is "identically situated" to the FCC in lacking Section 230 authority.
The Supreme Court's Loper Bright decision largely neuters whatever weight a new FCC interpretation of Section 230 might have carried, said University of Colorado associate law professor Blake Reid. Online platforms also enjoy First Amendment protections, which makes it hard to imagine Carr getting the result he's seeking, Reid added.
While agency action looms, social media platforms have acted.
Some are trying to get into the good graces of the White House, Zuckerman said, pointing to Meta's move away from professional fact-checking and toward more moderation via community notes (see 2501070055). A lot of the administration's Section 230 policy talk "has been a cudgel" to force changes in platforms' behavior. He said some platforms probably swung too far in aggressive moderation, citing the suppression of the Hunter Biden laptop story as an example.
Yet absent any moderation, platforms become rife with objectionable content such as pornography, which hurts their bottom lines as advertisers shy away to protect their brands, Zuckerman said. The current shift toward relying on user reports about objectionable content will probably swing the pendulum too far in the other direction, with platforms losing some users and advertisers, he added.
The Heritage Foundation's Cochrane said that while platforms are indicating they have changed the most egregious behavior, he's skeptical. "Making a public statement does not equate to real and actual change that protects Americans online."