Rosenworcel Proposes Ruling Prohibiting Calls Mimicking Human Voices
The FCC is quickly following up on a November AI notice of inquiry (see 2311160028), with Chairwoman Jessica Rosenworcel proposing a ruling Wednesday that would make voice-cloning technology in robocall scams illegal. The draft proposes a declaratory ruling that voice-cloned calls violate the Telephone Consumer Protection Act (TCPA). The FCC recently finished a comment cycle on the NOI. Among the comments, attorneys general from 25 states and the District of Columbia asked the agency to use the proceeding to clarify that calls mimicking human voices are considered “an artificial voice” under the TCPA (see 2401170023). An FCC news release cites that filing.

“AI-generated voice cloning and images are already sowing confusion by tricking consumers into thinking scams and frauds are legitimate,” Rosenworcel said: “No matter what celebrity or politician you favor, or what your relationship is with your kin when they call for help, it is possible we could all be a target of these faked calls.” If approved, the rules would give the AGs “new tools” to battle the “bad actors behind these nefarious robocalls and hold them accountable under the law,” the FCC said.

Pennsylvania AG Michelle Henry said her office supports the ruling “to protect consumers from intentionally deceptive and manipulative marketing tactics.” The proposed ruling would “put the calling industry and provider community on notice that they need consent to make calls with AI,” a USTelecom spokesperson said in an email: “This important action will thwart prolific robocallers that want to use AI to deliver to consumers calls they never asked for and do not want. We encourage the Commission to quickly adopt the Chair’s proposal.”