Communications Daily is a Warren News publication.
NetChoice Preps Texas Lawsuit

Carr Maintains FCC Has Section 230 Role, Cites Transparency

The FCC “absolutely” still has a role in interpreting Communications Decency Act Section 230, Commissioner Brendan Carr told us last week on the sidelines at the Technology Policy Institute conference in Aspen. He believes ISP-like transparency rules can be used “as a foundation” for increasing social media content moderation transparency.


In May, President Joe Biden revoked Donald Trump’s executive order aimed at addressing what Trump saw as social media censorship. Trump had sought an FCC rulemaking to clarify the agency’s interpretation of Section 230 liability protections (see 2105140074).

Though Carr noted the FCC might not have three votes for his position, he said more content moderation transparency is an area of agreement between Republicans and Democrats. He said he agreed with specific Section 230-related comments last summer from acting Chairwoman Jessica Rosenworcel: Speaking before she was named acting chairwoman, she said social media can be “frustrating,” but turning the agency into the president’s “speech police is not the answer.” Her remarks, however, didn’t address the fundamental goal of restoring Section 230 to its foundational purpose, said Carr. Rosenworcel’s office didn’t comment.

A bill from Senate Commerce Committee ranking member Roger Wicker, R-Miss., includes many provisions that would impose affirmative transparency obligations for platforms (see 2106100070), said Carr. Asked if he had any input on Wicker’s Promoting Rights and Online Speech Protections to Ensure Every Consumer is Heard (Pro-Speech) Act, Carr said, “I think we’re always providing different feedback to members of Congress in various capacities for various bills.”

Platform users should be able to file claims against politically biased takedowns, said Carr. Platforms also need affirmative anti-discrimination obligations, similar to those imposed by public accommodation laws and common carrier regimes, he said. Policymakers don’t need to force platforms wholesale into the framework of common carrier law, he added.

Getting Republicans and Democrats to agree on a Section 230 update is going to be a “challenge,” Carr said, noting the two sides are “pulling opposite ends” of the same thread: “Republicans want more speech and less censorship, and Democrats, in my view, want less speech and more censorship.”

Republicans at the state level are targeting Big Tech’s content moderation practices due to what they perceive as politically biased censorship. Legislation in Texas, HB-20 and SB-5 (see 2105140069 and 2105270073), would prohibit larger platforms from blocking, deplatforming or otherwise discriminating against users based on viewpoint or location within the state. If the legislation ultimately passes, NetChoice is prepared to file a lawsuit like the one it filed against a similar measure in Florida, CEO Steve DelBianco told us in Aspen: The organization already has a complaint drafted. DelBianco defended Section 230, saying it protects platforms from frivolous lawsuits when a platform has nothing to do with illegal content for which a user is responsible. Section 230 also protects platforms when they remove illegal content in good faith, he said: “It quickly ends those lawsuits.”

There’s some common ground between industry and advocates over the removal of illegal content from platforms, DigitalFrontiers Advocacy Principal Neil Fried told us. The goal should be for platforms to take “reasonable” actions when moderating illegal content, he said. Fried and DelBianco agreed the First Amendment protects “lawful but awful” content. Pornography, hate speech, white supremacist content and child-grooming material short of outright solicitation all make that list, said DelBianco. He noted platforms have a financial incentive to remove such content. Fried said platforms can’t legally turn a blind eye to material involving sex trafficking, opioid sales, the sale of stolen goods or other illegal activity.

The Biden administration’s recent response to Cuban internet access concerns (see 2108110049) was “really disappointing,” said Carr: The administration’s letter was “simply a reiteration of the status quo” on long-standing exemptions and didn’t represent progress or action. The administration could make progress by providing new infrastructure or surge funding, said Carr. He suggested the government introduce internet connections via off-island technologies: high-altitude balloons, for example, which were used in Puerto Rico after hurricanes. A second, almost immediate option, he said, is additional funding for circumvention tools such as sophisticated VPNs. He said he has discussed that option with the State Department. The White House didn’t comment.