Kids’ online safety bills at the federal and state levels are creating compliance concerns because their vague language potentially runs afoul of the First Amendment, a compliance attorney said Friday. Mark Brennan, a tech and telecom attorney with Hogan Lovells, told a webinar that bills like the Kids Online Safety Act (KOSA), which passed the Senate and the House Commerce Committee (see 2409230044), present a legal framework with a lot of compliance “mystery.” He noted federal courts have ruled similar state-level bills are unconstitutional. The Computer & Communications Industry Association and NetChoice are leading several tech industry challenges against state laws around the country, including measures in Texas, Florida, Mississippi and Georgia (see 2409260053, 2409260062 and 2407170046). The knowledge standard contemplated in KOSA effectively tells companies they don’t necessarily need to verify age, but they’re also subject to “significant penalties” for harms minors suffer when interacting on platforms, said Brennan. This creates an environment where companies feel like they “have no choice but to verify" the age of all users, not just minors. Tech associations have argued age-verification requirements are a First Amendment violation because of their impact on access to protected speech.
Mississippi’s social media age-verification law is unconstitutional because it places a “government-mandated” barrier blocking access to protected speech, NetChoice argued Thursday before the 5th U.S. Circuit Court of Appeals (see 2408010054). HB-1126 disfavors social speech in relation to professional speech and media-driven speech, the trade association said. NetChoice won a preliminary injunction against the law from the U.S. District Court for the Southern District of Mississippi on July 1 (see 2407160038), and Mississippi Attorney General Lynn Fitch (R) appealed.
The 11th U.S. Circuit Court of Appeals should reverse a district court injunction against a Georgia anti-retail-theft law because the tech industry failed to show federal law preempts the measure, Georgia Attorney General Christopher Carr (R) argued Wednesday (docket 24-12273). Gov. Brian Kemp (R) signed the Combating Organized Retail Crime Act (Act 564) May 6. It requires that e-commerce platforms like Facebook Marketplace and Craigslist verify information about high-volume sellers to prevent online sales of stolen goods, with a specific focus on under-the-radar cash transactions. U.S. District Judge Steven Grimberg in July granted NetChoice’s request for a preliminary injunction. Grimberg found the Inform Consumers Act, a 2023 federal law that imposes similar verification requirements on high-volume sellers, preempts Act 564. The state measure conflicts with the federal law’s language limiting its application to transactions that “only” take place through the online marketplace in question, the judge found. Carr argued NetChoice can’t show that it’s impossible for companies to comply with both the federal and state laws. The Georgia law “slightly differs” from the federal law when it broadens the category of “discrete sales.” Georgia’s “slightly broader monitoring obligations go beyond federal regulation, not against federal regulation,” he argued. “They are complementary, there is no conflict.” NetChoice said in July that Act 564 “violates federal law and the Supremacy Clause, smothering Georgia’s thriving businesses with red tape.”
The “current technological reality of implementing” a Texas bill requiring age-verification on porn websites “means that it will burden adults’ access to constitutionally protected speech,” said the Center for Democracy and Technology, other nonprofits and three privacy professors in an amicus brief Friday. The groups and academics supported the Free Speech Coalition’s challenge of a Texas law at the U.S. Supreme Court. FSC is a porn industry trade association represented by the American Civil Liberties Union (see 2409170012). “The limitations of current age verification technology -- and the difference between the internet’s inherent capability to transmit and make available uploaded identifying data and the ability of a stationery-store owner to recall such data from a quick flash of ID -- create a significantly higher burden on adult access to protected content,” the amici wrote in case 23-50627. Several other groups also supported the FSC in amicus briefs posted Monday. The Foundation for Individual Rights and Expression, Reason Foundation and the First Amendment Lawyers Association said jointly that the 5th U.S. Circuit Court of Appeals incorrectly granted the state “a free hand to force adult Texans to show their papers and surrender their privacy simply to access content protected by the First Amendment.” Another amicus filing, including TechFreedom and the Electronic Frontier Foundation, said the 5th Circuit erroneously applied rational-basis review rather than strict scrutiny. Along similar lines, the Cato Institute wrote, “It is the government’s burden to prove that the law serves a compelling government interest and uses the least restrictive means to achieve that interest. Texas did not clear this high bar.” Agreeing with others, the Electronic Privacy Information Center wrote that it’s “important for [SCOTUS] to take special care in this case to apply a constitutional framework capable of distinguishing unconstitutional censorship laws from constitutional kids’ privacy and safety laws.” Meanwhile, the Institute for Justice said the Supreme Court should use the case to stop a “growing problem” of courts “selecting the standard of review based on the government’s professed motive rather than by examining the actual conduct subject to regulation under the law.” A group of internet law professors, including Eric Goldman of Santa Clara University's High Tech Law Institute, said age-verification gates online are costly, raise privacy concerns when they collect sensitive data, and discourage “readers from accessing constitutionally protected material.”
Amazon, Meta, Google, TikTok and other companies should change their data “surveillance” practices to improve user privacy, the FTC said Thursday, concluding a probe the Trump administration started. The FTC issued a staff report with recommendations for nine companies that received orders in December 2020 (see 2012140054). Republican commissioners said in statements that some of the recommendations exceed the FTC’s authority. The report details how companies “harvest an enormous amount” of data and monetize it for billions of dollars annually, Chair Lina Khan said. “These surveillance practices can endanger people’s privacy, threaten their freedoms, and expose them to a host of harms.” The agency issued Section 6(b) orders to Amazon, Facebook, YouTube, X, Snap, ByteDance, Discord, Reddit and WhatsApp. Staff recommended data minimization practices, targeted advertising limits and more-stringent restrictions for children. The commission voted 5-0 to issue the report, though Commissioners Andrew Ferguson and Melissa Holyoak dissented in part in accompanying statements. Ferguson argued that some of the FTC’s recommended actions for companies exceed agency authority: “We are not moral philosophers, business ethicists, or social commentators. ... [A]s Beltway bureaucrats, our opinion on these matters is probably worth less than the average American’s.” Some of the recommendations are “thinly-veiled threats,” he said. Ferguson cited the recommendation that companies not willfully ignore user age, which warns that doing so won’t “help companies avoid liability under” the Children’s Online Privacy Protection Act. Holyoak said some of the agency’s recommendations could chill online speech. For example, should a company follow recommendations to redesign algorithms for classes the agency deems “protected,” it could undermine the speech rights of certain populations. The report “fails to robustly explore the full consequences of its conclusions and recommendations,” she added.
In her statement, Khan denied that the report “somehow endorses or encourages the platforms to disfavor certain viewpoints.” The report directly states that it doesn’t “address or endorse any attempt to censor or moderate content based on political views,” she said.
Instagram launched Teen Accounts, with protections limiting who can contact teen users and monitoring the content they see, it said Tuesday. People younger than 16 will need parental permission to change any of the built-in protections to make them less strict. "This new experience is designed to better support parents, and give them peace of mind that their teens are safe with the right protections in place," Instagram said. Teen Accounts are private by default: Instagram users who don't follow a teen can't see their content or interact with them, the company said. As such, only people a teen follows or is already connected to can message that teen, it added. Instagram will automatically place teens in the most-restrictive setting of its sensitive-content control. After using Instagram for 60 minutes daily, teens will receive a notice from Teen Accounts telling them to leave the app.
A Texas anti-porn law violates the First Amendment by requiring websites to verify users’ ages, the Free Speech Coalition said at the U.S. Supreme Court. The American Civil Liberties Union filed a brief Monday on behalf of the FSC, a pornography industry trade association. The U.S. District Court in Austin agreed in August 2023 to block the law (HB-1181), one day before its Sept. 1 effective date. U.S. District Court Judge David Ezra found the law likely violates the First Amendment rights of adults trying to access constitutionally protected speech. But the 5th U.S. Circuit Court of Appeals partially vacated the injunction, finding the age-verification requirements constitutional. SCOTUS in July agreed to hear the case, Free Speech Coalition v. Paxton (docket 23-1122) (see 2407020033). “Under strict scrutiny, this is a straightforward case,” said the coalition’s brief: The Texas law “is both overinclusive and underinclusive, and it fails to pursue its objective with the means least restrictive of adults’ protected speech.” The coalition added, “Restoring the preliminary injunction … would not undermine genuine efforts to limit minors’ access to sexually inappropriate material.” Adults “have a First Amendment right to read about sexual health, see R-rated movies, watch porn, and otherwise access information about sex if they want to," said Vera Eidelman, ACLU Speech, Privacy and Technology Project staff attorney. “They should be allowed to exercise that right as they see fit, without having to worry about exposing their personal identifying information in the process.”
Despite numerous social media platform competitors, X has maintained its user base due to its personalized algorithms and people engaging with one another on the platform, Midia analyst Hanna Kahlert blogged Tuesday. However, that changed with X's recent ban in Brazil, as Bluesky has seen a surge in user downloads and usage, she said. It's unknown whether that translates into consistent, long-term usage. But with Bluesky also seeing growth in Portugal and other South American nations such as Chile and Argentina, it appears a competitor can finally draw users from X, she said.
Forty-two attorneys general supported U.S. Surgeon General Vivek Murthy’s recommendation that social media carry warnings like the labels on cigarette packages. Murthy suggested last June that social media companies display warnings about mental health risks associated with their platforms (see 2406170059). The 42 bipartisan AGs, writing Monday under National Association of Attorneys General letterhead and representing states including California, New York and Indiana, supported the idea in a letter to House Speaker Mike Johnson, R-La., and Senate leaders Chuck Schumer, D-N.Y., and Mitch McConnell, R-Ky. “Young people are facing a mental health crisis, which is fueled in large part by social media,” wrote the AGs from 39 states, the District of Columbia, American Samoa and the U.S. Virgin Islands. “This generational harm demands immediate action. By mandating a surgeon general’s warning on algorithm-driven social media platforms, Congress can help abate this growing crisis and protect future generations of Americans.” New York AG Letitia James hopes “warning labels will be implemented swiftly to raise more awareness about this issue," the Democrat said in a news release Tuesday. Arkansas AG Tim Griffin (R) said, "A Surgeon General’s warning on social media platforms isn’t a cure-all, but it’s a step in the right direction toward keeping our kids safe in digital spaces.”
Snapchat’s design and algorithms help sexual predators target children, a lawsuit that New Mexico Attorney General Raul Torrez (D) filed last week argues. The complaint was filed at the New Mexico First Judicial District Court in Santa Fe County (case D-101-CV-202402131). “Our undercover investigation revealed that Snapchat’s harmful design features create an environment where predators can easily target children through sextortion schemes and other forms of sexual abuse,” said Torrez. “Snap has misled users into believing that photos and videos sent on their platform will disappear, but predators can permanently capture this content and they have created a virtual yearbook of child sexual images that are traded, sold, and stored indefinitely.” Snap is reviewing the complaint “carefully, and will respond to these claims in court,” the company said in a statement. “We have been working diligently to find, remove and report bad actors, educate our community, and give teens, as well as parents and guardians, tools to help them be safe online. We understand that online threats continue to evolve and we will continue to work diligently to address these critical issues.”