FCC Chairwoman Jessica Rosenworcel agrees content moderation and Section 230 of the Communications Decency Act could be improved, she said during a Q&A at the Knight Foundation Media Forum Thursday: “I think a lot of people would say there must be a way to do better. I'm among them.” Section 230 is important and helped the internet grow, but “we might over time want to condition its protections on more transparency, complaint processes, things that make you a good actor,” Rosenworcel said, conceding that creating an alternative to 230 would be difficult. Asked about FCC authority over 230, Rosenworcel condemned the previous administration’s efforts on that front as “not particularly well-developed” but also suggested the agency could be involved in future 230 revisions. After Gonzalez v. Google, “we’re going to have to have some discussions about what changes we might see in Congress or what changes we might see at the FCC, but I don’t think that earlier petition that was filed was it,” she said, referencing a case argued Tuesday at the Supreme Court (see 2302210062). Rosenworcel said the agency has done a lot of “incredible things” with four commissioners, but she hopes it gets a fifth soon. One policy she would tackle with a majority is the FCC’s definition of broadband speeds, she said. “If I have five people we’re gonna up that standard,” she said. “It’s really easy to decry polarization and politicization in any environment in Washington,” she added. “But I think the more interesting thing is to put your head down and see what you can do. History is not interested in your complaints.” Asked about FCC efforts to improve connections for the incarcerated, Rosenworcel touted her recent circulation of an item on prison phone rates. She’s “optimistic” about having unanimous support for the item at the agency, she said.
Conservative and liberal Supreme Court justices appeared skeptical Tuesday that a social media platform's inaction in removing terrorist content amounts to aiding and abetting terror plots. The court heard oral argument in Gonzalez v. Google (docket 21-1333) (see 2301130028).
Sens. Brian Schatz, D-Hawaii, and John Thune, R-S.D., announced reintroduction Thursday of a bill to hold tech companies liable for hosting content that violates their own policies or is illegal. The Internet Platform Accountability and Consumer Transparency (Internet PACT) Act would amend Communications Decency Act Section 230 and require “large online platforms” to “remove court-determined illegal content and activity within four days.” The bill would exempt “enforcement of federal civil laws from Section 230” so online platforms can’t “use it as a defense” when federal regulators like DOJ or the FTC “pursue civil actions online.” It would also require platforms to make their content moderation practices publicly available. The bill is co-sponsored by Sens. Tammy Baldwin, D-Wis.; John Barrasso, R-Wyo.; Ben Ray Lujan, D-N.M.; Bill Cassidy, R-La.; John Hickenlooper, D-Colo.; and Shelley Moore Capito, R-W.Va.
A New Hampshire House committee Wednesday soundly defeated a bill to regulate social media. But in Kansas, state senators at another hearing the same day appeared largely supportive of a bill that would bar online platforms from editing or removing political speech. Many state legislators have floated measures to regulate or investigate social media this session while the Supreme Court considers whether to hear industry challenges to Texas and Florida laws from 2021 (see 2301230051).
Sens. Joe Manchin, D-W.Va., and John Cornyn, R-Texas, on Monday reintroduced legislation that would amend Communications Decency Act Section 230 and require platforms to report illegal drug sales and other illicit activity (see 2009290065). Originally introduced in 2020, the See Something, Say Something Online Act would require platforms to “report suspicious activity to law enforcement, similar to the way that banks are required to report suspicious transactions over $10,000 or others that might signal criminal activity.” But the legislation “strong-arms online platforms into handing over the correspondence of civil rights and parents groups to law enforcement simply because speech and chat websites would need to take a better-safe-than-sorry approach to maintain protection from liability,” said NetChoice Vice President Carl Szabo.
The House Commerce Committee will explore ways to combat illegal online drug sales and the liability protections potentially facilitating the activity, said Chair Cathy McMorris Rodgers, R-Wash., Wednesday.
Congress needs to come together to establish national privacy standards, President Joe Biden and House Commerce Committee Chair Cathy McMorris Rodgers, R-Wash., said separately Wednesday, renewing attention to an issue that saw bipartisan progress in 2022.
The Supreme Court should “narrow the scope” of Communications Decency Act Section 230 and reverse the 9th Circuit’s decision shielding YouTube from liability in Gonzalez v. Google (docket 21-1333), Texas Attorney General Ken Paxton (R) wrote in a merits-stage amicus brief announced Thursday (see 2212070026). The 9th U.S. Circuit Court of Appeals in June 2021 dismissed a lawsuit against YouTube for hosting and recommending ISIS proselytizing and recruitment videos, affirming a decision from the U.S. District Court for the Northern District of California that shielded YouTube and its algorithms from liability. The plaintiff in the litigation and SCOTUS petitioner is the estate of Nohemi Gonzalez, an American student killed in Paris in 2015 during an ISIS attack. The petitioner asked SCOTUS to revisit the 9th Circuit's decision. Google didn’t comment. The case is scheduled for oral argument Feb. 21 (see 2212190042). Though Section 230 was designed in 1996 to give online publishers narrow protection from defamation liability, courts have “misinterpreted the law and allowed it to become a nearly all-encompassing blanket protection for certain companies, specifically internet and Big Tech companies,” Paxton said. These limitless legal protections prevent states from holding Big Tech accountable for law violations, even when infractions are unrelated to content publishing, Paxton said.
A bipartisan group of senators introduced legislation Wednesday to increase transparency into social media companies’ internal data. Introduced by Sens. Rob Portman, R-Ohio; Chris Coons, D-Del.; Bill Cassidy, R-La.; and Amy Klobuchar, D-Minn., the Platform Accountability and Transparency Act (PATA) would require social media companies to deliver internal data to independent researchers. The researchers’ proposals would be subject to review and approval from the National Science Foundation. Companies that fail to comply would face FTC enforcement and potential loss of liability protection under Communications Decency Act Section 230, sponsors said. Platforms would be required to maintain a comprehensive ad library, content moderation statistics, data about viral content, and information about their algorithms’ rankings and recommendations.
Senate Antitrust Subcommittee Chair Amy Klobuchar, D-Minn., struck back Tuesday against opponents of her Journalism Competition and Preservation Act (S-673) following a wave of outcry against a bid to attach the controversial bill to the FY 2023 National Defense Authorization Act (see 2212050067). Text of a pending compromise version of the annual measure, to be filed as an amendment to shell bill HR-7776, again failed to materialize by Tuesday afternoon, amid fractious negotiations.