Tech ‘Weaponizing’ 230

House Commerce Members Promise Update for Section 230

House Commerce Committee members on Thursday vowed to find a bipartisan solution for updating Communications Decency Act Section 230.


Chair Cathy McMorris Rodgers, R-Wash.; ranking member Frank Pallone, D-N.J.; House Communications Subcommittee Chairman Bob Latta, R-Ohio; and House Communications Subcommittee ranking member Anna Eshoo, D-Calif., agreed Section 230, written in 1996, doesn’t account for the role algorithmic decision-making plays in social media harm. Speaking during a subcommittee hearing, they didn’t offer specific legislation but repeatedly cited social media's harm to children and tech’s ability to hide behind the statute's liability shield in court.

Big Tech is “weaponizing” Section 230 by moderating content the way a newspaper curates its stories, said Rodgers. As tech platforms act more like publishers, their responsibility to uphold American values like free speech increases, she said.

Pallone cited a circuit court judge's opinion in Gonzalez v. Google, which said there’s no question Section 230 shelters more activity than Congress intended when it passed the statute. Broad interpretations of the law have raised pressing questions Congress should address, Pallone said. He called Thursday’s hearing a “first step” toward finding a bipartisan solution. Section 230 has contributed to platforms’ “unchecked power” and allowed them to operate in a state of “lawlessness,” he said.

When algorithms select what content appears for each user, the platform is more than just a conduit transferring speech, said Eshoo. When social media harms children and families, those harmed should have the opportunity to confront platforms in court and prove the companies didn’t meet an established duty of care, she said.

Congress should retain the statute’s “Good Samaritan” provision because it encourages platforms to act against harmful content, but lawmakers should craft a knowledge standard so platforms face consequences when they know about harmful content and fail to act, testified Catholic University law professor Mary Graw Leary.

Companies should be immune from liability only in narrow circumstances, such as when they’re taking “active steps” to mitigate harm, George Washington University law professor Mary Anne Franks testified. Platforms shouldn’t be able to solicit, encourage, profit from or show deliberate indifference to harmful content, said Franks.

Platform executives have been less than honest when discussing the potential impact of a Section 230 repeal, said Middlebury College economics professor Allison Stanger. Tech’s public reaction to the prospect of a repeal is “outrage,” but company executives concede privately that their businesses would be “all right” without Section 230 protections, she said. She noted that former Google CEO Eric Schmidt last week called for an overhaul of Section 230.

It’s unacceptable that state attorneys general often can’t file charges against platforms because Section 230 preempts state laws, said House Innovation Subcommittee Chairman Gus Bilirakis, R-Fla. Congress should “close the loophole on that,” he said, citing social media harms to children.

It’s clear Congress didn’t envision today's social media landscape when it approved Section 230, said Latta: It’s time to review the current legal framework and strike a balance between protecting speech and holding online platforms accountable for amplifying content.

Incompas in a statement Thursday urged Congress to protect Section 230 and the “delicate balance” struck between speech rights and user safety. Section 230 is the “cornerstone of the internet's infrastructure, enabling a vibrant online economy and facilitating the free exchange of ideas,” said CEO Chip Pickering.