In ‘Big Leagues’

Parler Executive Defends Section 230, Platform Moderation Practices

If Communications Decency Act Section 230 is revised to add regulatory burdens, the changes will entrench incumbents and result in more government involvement in communication channels, Parler Chief Policy Officer Amy Peikoff told C-SPAN's The Communicators, telecast over the weekend. She dismissed concerns that hate speech and hate groups are proliferating on Parler, saying the tech industry’s liability shield is working as intended. Her comments followed a Simon Wiesenthal Center report claiming the platform is attracting online extremists and harmful content.

Parler’s “voluntary jury system” for content moderation isn’t “up to doing the job,” Simon Wiesenthal Center’s Rabbi Abraham Cooper said in an interview. Parler, a social media network popular with conservatives, is “in the big leagues now,” he said, so it has to accept greater accountability.

Parler said the platform has about 10.5 million accounts, up about 6 million in recent weeks. Peikoff credited the company’s growth to users losing trust in platforms like Facebook and Twitter. Parler presents a chronological feed and makes no effort to boost engagement, which makes for a “good social media diet, not something you get addicted to,” she said.

The Simon Wiesenthal Center relayed concerns about extremism to Peikoff and Chief Operating Officer Jeffrey Wernick. What SWC has seen on Parler is no worse than what’s shared on Facebook and Twitter, Cooper noted. “We welcome another major player, and if it’s going to be free of the political machinations, fantastic,” he said: But the platform needs to figure out how to handle hate speech, because without action, hate groups will establish their own subculture. The company should be drawing its own lines on harmful content, not relying on a jury system, he said. Parler didn’t comment on his remarks.

Parler has a “community jury, and sometimes a jury is fallible,” but the jurors are trained to be viewpoint-neutral, said Peikoff. Parler is an alternative for conservatives who feel they have been mistreated on other platforms, but it’s not intended as a conservative platform, she said: “They trust us because they know we are allowing the widest amount of speech possible, consistent with law.” Users can report content violations, she said: The community jury has a portal, and voluntary jurors determine whether there’s a violation. It’s a quorum system: a majority of jurors must agree to find a violation.
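Parler hasn’t published how its jury quorum works in practice, so the following is a minimal illustrative sketch only, assuming the simple majority rule Peikoff describes; the quorum size, vote representation and function names are hypothetical.

```python
# Hypothetical sketch of majority-quorum moderation voting; Parler's
# actual juror assignment, quorum size and tie rules are not public.
from dataclasses import dataclass
from typing import Optional

@dataclass
class JuryDecision:
    violation: bool       # whether the jury found a violation
    votes_for: int
    votes_against: int

def decide(votes: list[bool], quorum: int = 5) -> Optional[JuryDecision]:
    """Return a decision once at least `quorum` jurors have voted."""
    if len(votes) < quorum:
        return None  # keep waiting for more volunteer jurors
    votes_for = sum(votes)          # True = juror found a violation
    votes_against = len(votes) - votes_for
    # Majority rule: more than half of the assembled jurors must agree.
    return JuryDecision(votes_for > votes_against, votes_for, votes_against)

# Example: 3 of 5 jurors report a violation, so the content is flagged.
print(decide([True, True, False, True, False]))
```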

Some algorithms used by Facebook and Twitter have the potential to amplify hate groups and hate speech, Peikoff said: They are designed to increase engagement, making users more likely to encounter ever more polarizing content. Facebook and Twitter didn’t comment. Peikoff raised concerns about recent remarks from Facebook CEO Mark Zuckerberg at a Senate Judiciary Committee hearing (see 2011180020) signaling an openness to regulations on transparency, requiring reporting and a minimum level of effectiveness in dealing with harmful speech. Those requirements could be cost-prohibitive for smaller platforms, she said. Putting them in place “would at the minimum create a barrier to entry making it impossible for smaller startups to compete because Facebook has this extensive infrastructure.”
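Neither Facebook nor Twitter publishes its ranking formulas, so the contrast Peikoff draws can only be illustrated loosely; in this sketch, the Post fields and the engagement score are invented stand-ins, not any platform’s actual signals.

```python
# Illustrative contrast between a chronological feed and an
# engagement-weighted one; the scoring inputs here are invented.
from typing import NamedTuple

class Post(NamedTuple):
    timestamp: float   # seconds since epoch
    engagement: float  # hypothetical predicted reactions/shares

def chronological_feed(posts: list[Post]) -> list[Post]:
    # What Peikoff says Parler does: newest first, nothing else.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_feed(posts: list[Post]) -> list[Post]:
    # Ranking that favors posts predicted to draw reactions; critics
    # argue this objective tends to surface polarizing content.
    return sorted(posts, key=lambda p: p.engagement, reverse=True)
```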

Cooper credited Parler for a willingness to communicate with the public: “The door is open for communication, and that’s something that we appreciate. That’s not an automatic. Dealing with many other companies is sort of like dealing with the Wizard of Oz: Send an email, and we’ll get back to you if we feel like it.”