Expect a U.S. Supreme Court majority to side with the tech industry in its content moderation fight against social media laws in Florida and Texas, experts told us in interviews last week.
Section 230
Sen. Lindsey Graham plans to introduce legislation with Senate Judiciary Committee Chairman Dick Durbin, D-Ill., that would repeal Section 230 when Congress returns from break Feb. 26, the South Carolina Republican told us before the start of recess.
The FTC’s proposed rules for moderating fake online reviews are overly broad and carry liability risks that will result in platforms censoring legitimate reviews on sites like Google, Facebook and Yelp, the Interactive Advertising Bureau said Tuesday.
Allowing the Affordable Connectivity Program to lapse would have a “significant downstream effect” on the economy, FCC Commissioner Anna Gomez said during a Q&A at ITI’s Intersect event Wednesday.
Existing law needs updating to protect artists and individuals from fake AI-generated content, House Intellectual Property Subcommittee Chairman Darrell Issa, R-Calif., said Friday during a hearing in Los Angeles.
The Senate Judiciary Committee will seek support from Meta, X, TikTok and Discord for kids’ privacy legislation during Wednesday's hearing when their CEOs are scheduled to appear, Sen. Richard Blumenthal, D-Conn., told reporters Tuesday.
Consumer and industry advocates sounded alarms late last week over a proposed California ballot initiative that would make social media companies liable for up to $1 million in damages for each child their platforms injure. Courts would likely find that Common Sense CEO James Steyer’s December proposal violates the First Amendment and Section 230 of the Communications Decency Act, according to comments the California DOJ forwarded to us Friday. The Electronic Frontier Foundation (EFF), for example, called Initiative 23-0035 “a misguided and unconstitutional proposal that will restrict all Californians’ access to online information.”
First Amendment questions linger when it comes to censoring online chatbots, even when they encourage users to kill themselves, a tech industry executive told the Senate Homeland Security & Governmental Affairs Committee Wednesday. Sen. Josh Hawley, R-Mo., pressed Information Technology Industry Council General Counsel John Miller about a user who took his life after interacting with a chatbot that encouraged him to do so. Hawley argued individuals and their families, including parents of young users, should be able to sue tech companies for such incidents. Miller said companies don’t want chatbots “doing those sorts of things,” but AI is responsible for a lot of good. The technology has been useful in cancer research, for example, he said. Asked whether companies should be subject to lawsuits over AI’s negative impacts, Miller said, “Under the current law, that’s probably not allowable,” alluding to Communications Decency Act Section 230, which he said has enabled technological innovation and which the U.S. Supreme Court has upheld. Hawley asked Miller if he would support legislation the senator sponsored with Sen. Richard Blumenthal, D-Conn., the No Section 230 Immunity for AI Act. The proposed law would clarify that Section 230 doesn’t apply to claims based on generative AI activity (see 2306150059). Miller said he hasn’t reviewed the bill but argued there are “other equities at play in this discussion,” including the First Amendment. Hawley asked Miller if a chatbot encouraging a teenager to kill himself is First Amendment-protected speech: “Is that your position?” Miller said it’s not his position, but “I don’t think the question’s been resolved.”
Generative AI is expanding Big Tech’s data monopoly and worsening news outlets' financial crisis, Sens. Richard Blumenthal, D-Conn., and Josh Hawley, R-Mo., agreed Wednesday while hearing testimony about The New York Times Co. (NYT) lawsuit against Microsoft and OpenAI.
Future Section 230 cases before the 5th U.S. Circuit Court of Appeals face "an extreme risk of judicial activism to overturn the existing 5th Circuit precedent and disrupt decades of Section 230 jurisprudence," Santa Clara University law professor Eric Goldman blogged Tuesday. Pointing to the dissent issued Monday in a 5th Circuit denial of an en banc rehearing motion involving a lawsuit against messaging app Snapchat (see 2312180055), Goldman said the seven dissenting judges' goal "seems to be to urge the Supreme Court to take this case." Written by Circuit Judge Jennifer Walker Elrod, the dissent criticized the 5th Circuit's previous "atextual interpretation" of Section 230, "leaving in place sweeping immunity for social media companies that the text cannot possibly bear." "Declining to reconsider this atextual immunity was a mistake," Elrod wrote.