A New Hampshire House committee Wednesday soundly defeated a bill to regulate social media. But in Kansas, state senators at another hearing the same day appeared largely supportive of a proposed bill that would restrict online platforms from editing or removing political speech. Many state legislators have floated measures to regulate or investigate social media this session while the Supreme Court considers whether to hear industry challenges to Texas and Florida laws from 2021 (see 2301230051).
Section 230
Sens. Joe Manchin, D-W.Va., and John Cornyn, R-Texas, on Monday reintroduced legislation that would amend Communications Decency Act Section 230 and require platforms to report illegal drug sales and other illicit activity (see 2009290065). Originally introduced in 2020, the See Something, Say Something Online Act would require platforms to “report suspicious activity to law enforcement, similar to the way that banks are required to report suspicious transactions over $10,000 or others that might signal criminal activity.” The legislation “strong-arms online platforms into handing over the correspondence of civil rights and parents groups to law enforcement simply because speech and chat websites would need to take a better-safe-than-sorry approach to maintain protection from liability,” said NetChoice Vice President Carl Szabo.
The House Commerce Committee will explore ways to combat illegal online drug sales and examine whether liability protections facilitate that activity, said Chair Cathy McMorris Rodgers, R-Wash., Wednesday.
Congress needs to come together to establish national privacy standards, President Joe Biden and House Commerce Committee Chair Cathy McMorris Rodgers, R-Wash., said separately Wednesday, renewing attention to an issue that saw bipartisan progress in 2022.
The Supreme Court should “narrow the scope” of Communications Decency Act Section 230 and reverse the 9th Circuit’s decision shielding YouTube from liability in Gonzalez v. Google (docket 21-1333), Texas Attorney General Ken Paxton (R) wrote in a merits-stage amicus brief announced Thursday (see 2212070026). The 9th U.S. Circuit Court of Appeals in June 2021 dismissed a lawsuit against YouTube for hosting and recommending ISIS proselytizing and recruitment videos. The 9th Circuit affirmed a decision from the U.S. District Court for the Northern District of California shielding YouTube and its algorithms from liability. Plaintiff in the litigation and SCOTUS petitioner is the estate of Nohemi Gonzalez, an American student who was killed in Paris in 2015 during an ISIS attack. The petitioner asked SCOTUS to revisit the 9th Circuit's decision. Google didn’t comment. The case is scheduled for oral argument on Feb. 21 (see 2212190042). Though Section 230 was designed in 1996 to allow online publishers narrow protections from defamation liability, courts have “misinterpreted the law and allowed it to become a nearly all-encompassing blanket protection for certain companies, specifically internet and Big Tech companies,” Paxton said. These limitless legal protections prevent states from holding Big Tech accountable for law violations, even when infractions are unrelated to content publishing, said Paxton.
A bipartisan group of senators introduced legislation Wednesday to increase transparency into social media companies’ internal data. Introduced by Sens. Rob Portman, R-Ohio; Chris Coons, D-Del.; Bill Cassidy, R-La.; and Amy Klobuchar, D-Minn., the Platform Accountability and Transparency Act (PATA) would require social media companies to deliver internal data to independent researchers. The researchers’ proposals would be subject to review and approval from the National Science Foundation. Companies that fail to comply would face FTC enforcement and potential loss of liability protection under Communications Decency Act Section 230, sponsors said. Platforms would be required to maintain a comprehensive ad library, content moderation statistics, data about viral content, and information about their algorithms’ rankings and recommendations.
Senate Antitrust Subcommittee Chair Amy Klobuchar, D-Minn., struck back Tuesday against opponents of her Journalism Competition and Preservation Act (S-673) following a wave of outcry against a bid to attach the controversial bill to the FY 2023 National Defense Authorization Act (see 2212050067). Text of a pending compromise version of the annual measure, to be filed as an amendment to shell bill HR-7776, again failed to materialize by Tuesday afternoon, amid fractious negotiations.
FCC Chairwoman Jessica Rosenworcel confirmed Thursday she has received a letter from acting FAA Administrator Billy Nolen asking that the agency mandate voluntary protections for radio altimeters agreed to by Verizon and AT&T in the C band (see 2206170070) for 19 other providers who bought spectrum in the record-setting auction. “I have seen the letter” and “we are in discussions with our colleagues at NTIA,” Rosenworcel told reporters after the FCC meeting. Commissioner Brendan Carr said he was happy to look at FAA concerns, but believes the time to raise new objections has passed.
Social media platforms lack accountability for hosting harmful content because of Communications Decency Act Section 230, New York Attorney General Letitia James (D) and Gov. Kathy Hochul (D) said in a report released Tuesday. The report showed Payton Gendron, the alleged mass shooter who killed 10 Black people in Buffalo in May, was radicalized on fringe platforms like 4chan. Platforms largely provided an uneven response to his livestreaming efforts, the report said. James’ office reviewed thousands of pages of documents and social media content to explore how the alleged shooter used platforms to “plan, prepare and publicize his attack,” James said. Gendron was radicalized through “virtually unmoderated websites and platforms that operate outside of the mainstream internet, most notably 4chan,” James said, and livestreaming platforms like Twitch were “weaponized to publicize and encourage copycat” attacks. Section 230 allows “too much legal immunity” for platforms, even “when a platform allows users to post and share unlawful content,” James said.
The Supreme Court will consider two appeals of appellate court decisions on social media companies' legal protections when their platforms are used in conjunction with terror attacks. On Monday, SCOTUS granted certiorari in docket 21-1333 in an appeal of a 9th U.S. Circuit Court of Appeals decision tossing out a suit against Google's YouTube for hosting and recommending ISIS proselytizing and recruitment videos. Plaintiff in the litigation and SCOTUS petitioner is the estate of Nohemi Gonzalez, a U.S. citizen who was killed in an ISIS attack in Paris in 2015. The petitioner asked SCOTUS to revisit the 9th Circuit's holding that the Communications Decency Act's Section 230 protects YouTube's algorithm for recommending videos. Google didn't comment. The court also granted cert Monday in docket 21-1496, in which Twitter is appealing another 9th Circuit decision. In that decision, the appellate court found Twitter and co-defendants Facebook and Google could be held liable for aiding and abetting an act of terrorism. Twitter and the others were sued by American relatives of Nawras Alassaf, a Jordanian killed in an ISIS attack in Istanbul in 2017. “These cases underscore how important it is that digital services have the resources and the legal certainty to deal with dangerous content online," Computer and Communications Industry Association President Matt Schruers said in a statement. “Section 230 is critical to enabling the digital sector’s efforts to respond to extremist and violent rhetoric online, and these cases illustrate why it is essential that those efforts continue.” SCOTUS "can really do something useful by constraining Section 230 protections to hosting content instead of targeting content," tweeted Matt Stoller, American Economic Liberties Project research director.