FTC Announces Children’s Privacy Enforcements and Launches AI Chatbot Inquiry

Right on time for the new school year, the FTC rolled out multiple enforcement actions related to children’s privacy, highlighting a renewed focus on federal regulation of minors’ personal data and the evolving challenges of establishing effective, privacy-protective age assurance solutions. The FTC also announced a Section 6(b) inquiry into the impacts of AI chatbot companions on children and teens.

On September 2, 2025, the FTC announced a $10 million settlement with Disney over the unlawful collection of children’s data in violation of the Children's Online Privacy Protection Act (COPPA). The complaint alleges that Disney posted kid-directed YouTube videos without labeling them as “Made for Kids,” allowing the company to collect personal data from children and use it for targeted advertising without providing notice to parents or obtaining their consent.

The same day, the DOJ, on behalf of the FTC, filed a settlement in California federal court with China-based “app-enabled” robotic toy maker Apitor over claims that Apitor allowed a Chinese analytics provider to gather geolocation data from children under 13 without parental consent in violation of COPPA. The settlement includes a $500,000 penalty and compliance requirements, including vendor oversight for COPPA compliance and disgorgement of unlawfully collected data. In conjunction with the settlement, the FTC reiterated the importance of obtaining verifiable parental consent even when a third party is collecting data on behalf of the company.

On September 3, 2025, the FTC announced a joint order with Utah against the operators of Pornhub and other pornography-streaming sites, over charges that the operators did not block content involving child sexual abuse material (CSAM) and nonconsensual material (NCM) despite claiming that this content was “strictly prohibited” under a zero-tolerance policy. The order imposes a $15 million fine and numerous compliance requirements related to preventing NCM and CSAM content and establishing age and consent verification requirements.

On September 11, 2025, the FTC formally announced the launch of a Section 6(b) inquiry into the impacts of AI-powered chatbots on children and teens. The FTC reported that it has issued orders to seven companies that provide “consumer-facing AI-powered chatbots,” citing the power of generative AI to mimic human communications and interpersonal relationships. In particular, the goal is to determine what steps companies have taken “to evaluate the safety of their chatbots when acting as companions, to limit the products’ use by and potential negative effects on children and teens, and to apprise users and parents of the risks associated with the products.” The FTC is seeking a range of information from the companies, including how they monetize user engagement. The FTC’s inquiry follows the passage of a novel New York law that places guardrails on providers of AI companion models.