‘Destroyed’ in Litigation

Tech Experts Weigh Pending High Court Decision on Algorithms

If the U.S. Supreme Court opens online platforms to liability for algorithms through a narrow interpretation of Section 230, it could result in a less consumer-friendly internet and further entrench dominant platforms, tech experts said Tuesday.


The high court heard oral argument in February in Gonzalez v. Google (docket 21-1333) (see 2302210062) and Twitter v. Taamneh (docket 21-1496) (see 2302220065). Experts told a Computer & Communications Industry Association livestream that a narrow ruling on Communications Decency Act Section 230 and platform algorithms would have major ramifications for the makeup of the internet. Justices in both cases questioned whether tech platforms could be held liable for algorithmic decisions.

Tinkering with liability protection for algorithms could have a disproportionate impact on how smaller and niche companies curate content, said Laura Bisesto, Nextdoor deputy general counsel. Nextdoor filed an amicus brief in Gonzalez as part of a coalition called Internet Works, which included Pinterest, Etsy, eBay, Reddit and TripAdvisor. The brief notes the risks for small to mid-sized companies if Section 230 were amended. Section 230 incentivizes companies to “do good” and make their platforms welcoming to the public, said Bisesto, who previously worked at Verizon Media and Lyft. The statute allows companies to “test things” without fear of getting “destroyed” in litigation, she said. A narrow interpretation of Section 230 liability could produce two extreme outcomes, she said: hands-off moderation where anything goes, or over-moderation where even slightly objectionable content is censored. Nextdoor uses algorithms to provide hyperlocal, timely content, she said, and narrowing Section 230 could impede both content removal and promotion.

If companies shy away from algorithms due to new liability, consumers could have a harder time finding what they need on the internet because companies could no longer properly promote and rank results, said University of Maryland professor Ginger Zhe Jin, who was FTC chief economist in 2016. Congress is a better venue than the Supreme Court for a broader discussion about Section 230, she said: It would be difficult for the high court to take a specific case and apply it to the statute.

Algorithmic bias isn’t always intentional, said Technology Policy Institute Senior Fellow Sarah Oh Lam: The high court might have a difficult time drawing lines to measure the impact of algorithmic decision-making. If the court were to eliminate the industry’s safe harbors, courts would be flooded with far more litigation and fact-finding at the motion-to-dismiss stage, she said: Those broad changes would affect not just the incumbents but small websites as well. As more states pass laws on content moderation, pressure will continue to build for Congress to act on Section 230, said Oh Lam.

Nextdoor wants the lower court’s decision affirmed and Section 230 protections applied in Gonzalez, said Bisesto. Taamneh could then be remanded to evaluate Section 230 implications, she said: That’s a “realistic” option given what was discussed at oral argument.