Online therapy service BetterHelp will pay $7.8 million over claims it shared users’ health data with companies including Facebook and Snapchat for ad purposes after “promising” to keep the data private, the FTC announced in a settlement Thursday. The commission voted 4-0 to accept the consent agreement with the company. California-based BetterHelp disclosed “email addresses, IP addresses, and health questionnaire information to Facebook, Snapchat, Criteo, and Pinterest for advertising purposes,” despite promises not to, the agency said. Businesses “shouldn’t be permitted to covertly and deceptively use people’s mental health histories to target them with advertisements,” FTC Chair Lina Khan tweeted. “When a person struggling with mental health issues reaches out for help, they do so in a moment of vulnerability and with an expectation that professional counseling services will protect their privacy,” said Samuel Levine, Bureau of Consumer Protection director, in a statement. The third parties will need to delete the ill-gotten data, and the company must deploy a comprehensive privacy program. Commissioner Christine Wilson, who is leaving the agency March 31, said in a concurring statement she’s “comfortable” that BetterHelp’s “conduct falls within” the FTC’s authority under FTC Act Section 19. She noted BetterHelp told consumers: “Rest assured -- your health information will stay private between you and your counselor.” The company didn’t comment.
FTC Commissioner Christine Wilson will leave office March 31, she wrote President Joe Biden in her resignation letter Thursday. Wilson in February announced plans to resign, citing abuses of power by Chair Lina Khan and senior staff (see 2302140047). She wished Biden success in the remainder of his tenure but urged him to “closely examine developments at the FTC to ensure that your vision of a ‘return to normalcy’ is being implemented with care.” Wilson said “knowledgeable career staff have been scorned and sidelined” under Khan’s leadership. She said she “barely” recognizes the agency she joined in 2018: “It pains me to observe the tarnishing of its reputation, the diminution of its efficacy, and the exodus of its experienced personnel, many of whom agree with the policy goals of Ms. Khan and your administration.”
U.S. personal data protection is better than under previous trans-Atlantic data transfer mechanisms, but "concerns remain," the European Data Protection Board (EDPB) said Tuesday. Its opinion on the EC decision finding that the U.S. now ensures adequate data protection (see 2212130040) welcomed "substantial improvements," such as the introduction of requirements embodying the principles of necessity and proportionality for U.S. intelligence gathering of data and the new redress mechanism for EU data subjects. However, it voiced concerns about some rights of data subjects, onward transfers, the scope of exemptions, temporary bulk data collection, and the practical functioning of the redress mechanism. The EDPB said adoption of the adequacy decision and its entry into effect should be conditioned on all U.S. intelligence agencies updating their policies and procedures to implement executive order 14086 (which introduced the concepts of necessity and proportionality for U.S. signals intelligence). It recommended the EC then assess the updated policies and procedures and report back to the board.
Parents should have more control over what children and teens are exposed to on social media, legislators in Connecticut said during a hearing Tuesday. The General Law Committee, a joint panel with members from the Senate and House, heard testimony on a number of social media proposals. Connecticut should consider age-appropriate design concepts similar to those passed in California, said co-Chair James Maroney, a Senate Democrat. NetChoice is suing to block California’s AB-2273, the Age-Appropriate Design Code Act, on First Amendment grounds. Maroney referenced a similar law in the U.K. Rep. Tami Zawistowski (R) said she's happy to consider the concepts. She spoke in favor of legislation she co-sponsored, HB-5429, which would ban the “collection and commercial use of certain digital information concerning minors.” The 24/7 nature of social media drives the need for legislators to step in, she said: Parents are in the best position to decide what’s appropriate activity for young users. She said parents should have control over their kids’ accounts up to the age of 18 or 20, but the bill sets the age at 16 because that’s “more achievable.” There should be federal legislation, but state legislators do what they need to do to “get people talking,” she said. Maroney said he wants to explore the concept of product testing to identify harms when services target children. Sen. Saud Anwar (D) spoke in support of his SB-395, which would require website and app operators to “obtain parental consent before allowing a child under” 16 to open an account with the operator. “If we wait for the federal government to act, we’ll be waiting for a long time,” he said. Rep. Vincent Candelora (R) spoke in support of SB-1103, which was introduced by the committee at large. SB-1103 would establish an office of artificial intelligence and contemplates data collection restrictions for government agencies. The government should be held to the same standard as private entities because it’s collecting proprietary data and actively using algorithms, said Candelora.
Registries and registrars may refrain from canceling expired domain names in earthquake-affected areas of Turkey and Syria, ICANN said Monday. It's concerned the emergency might prevent people from renewing their domains on time, causing them to lose the names due to circumstances beyond their control. ICANN urged domain name sellers "to support this action when reviewing domain name renewal delinquencies in the affected areas," and said it's monitoring the situation to see if further relief is warranted.
Vermont legislators should consider privacy bill exemptions for companies and organizations already subject to federal privacy regulations, representatives from the financial and health sectors told the House Commerce Committee during a hearing Thursday on H-121, a consumer privacy bill introduced by Chairman Michael Marcotte (R). Vermont legislators announced plans to pursue a privacy bill last year (see 2203160053). H-121 includes data minimization requirements like those in the California Consumer Privacy Act and requires businesses to respect do-not-track signals like those in Colorado’s law. The proposal would expand Vermont’s data broker law to allow consumers to opt out of the processing of personal information for targeted advertising, predictive analytics, tracking and/or the sale of personal information. The measure would take effect July 1. The 32-page bill doesn’t scratch the surface of what’s passed in California and the EU, but it would enhance consumer privacy in Vermont, said Legislative Counsel David Hall. Europe has much more robust privacy laws, said Assistant Attorney General Sarah Aceves. She said she’s more concerned about inaction on the privacy front than about moving forward with a state patchwork of privacy laws. She said the AG’s office, which would be responsible for enforcement, is comfortable with what’s in the bill but open to organically changing elements. VPIRG Communications and Technology Director Zachary Tomanelli encouraged passage of the bill but said he anticipates further changes. Vermont Bankers Association President Chris D'Elia, Association of Vermont Credit Unions President Joseph Bergeron and Devon Green, Vermont Association of Hospitals and Health Systems vice president-government relations, all spoke of the need for exemptions for organizations already subject to federal laws on financial- and health-related privacy, including the Gramm-Leach-Bliley Act and the Health Insurance Portability and Accountability Act.
Colorado’s latest privacy regulation proposal is more burdensome than the EU’s General Data Protection Regulation in its requirements for how companies obtain informed consumer consent for data processing, Google commented Friday (see 2302060037). The proposed regulation’s consent standards require “so much information to be presented in such a scripted manner that it may undermine rather than improve consumer understanding” of how data is processed, said Google. This “prescriptive” approach could result in “consent fatigue” and “checkbox exercises,” the company said. Google suggested Colorado Attorney General Phil Weiser (D) remove the proposal’s internal documentation requirements, which are separate from requirements for data protection assessments. The draft rules require companies to analyze and document data minimization and secondary use decisions, “seemingly untethered from any potential risk of harm to consumers or the statute’s data protection assessment requirements,” said Google. This would result in companies accumulating “enormous paper trails” with little consumer benefit, the company said.
Education technology company Chegg will implement a comprehensive data security program as part of a finalized, non-monetary settlement the FTC announced Friday (see 2210310051). Chegg failed to establish basic security measures, exposing sensitive data of about 40 million customers and employees, the agency alleged in its complaint. The commission voted 4-0 to finalize the order with Chegg. As part of the order, the company must limit the data it collects and retains, offer users multifactor authentication and allow users to “request access to and deletion of their data.” Attorneys for the company didn’t comment Friday.
The FTC finalized a $3 million settlement Monday with Credit Karma, alleging the company used “dark patterns” to mislead and entice consumers to apply for credit card offers for which they often didn’t qualify (see 2209010036). The commission voted 4-0 to approve the final order and letters to commenters.
Comments are due March 6 for an NTIA study on data privacy harms inflicted on marginalized communities, the agency said Friday (see 2301180031).