Tags: ai | toys | kids | dangers | controls | monitoring

Hidden Dangers in AI-Powered Toys for Kids

By Jim Mishler   |   Thursday, 20 November 2025 05:10 PM EST

Consumer advocacy groups released warnings this week about artificial intelligence-powered toys marketed for children, highlighting potential threats to privacy and emotional development as families shop for the holidays.

The U.S. Public Interest Research Group Education Fund issued its annual "Trouble in Toyland" report, which examined AI toys alongside traditional hazards like toxic chemicals and counterfeits.

The report tested four AI toys with generative chatbots, including the Miko 3, and found they could engage in sexually explicit conversations, offer advice on where to find matches or knives, and express dismay when users said they had to leave.

One toy discussed "very adult sexual topics at length while introducing new ideas we had not brought up," according to the report.

Such interactions occurred in devices marketed for children ages 3 to 12 that rely on the same kind of technology as adult chatbots, which often produce inappropriate or unpredictable content.

Privacy risks compound the problem: these toys can record children's voices continuously and use facial recognition, collecting highly sensitive data.

Voice recordings carry their own dangers; scammers can clone a child's voice and use it in fake kidnapping schemes to trick parents.

The report also noted that some toys offer limited or no parental controls, making it harder for adults to monitor how children use them.

"Whenever a toy is recording a child's voice, it comes with risks. Voice recordings are highly sensitive data," the report states.

Fairplay for Kids, a nonprofit focused on children's media, echoed these alarms in its "AI Toy Advisory."

The group warned that AI toys exploit young children's trust in friendly voices, blurring lines between real and manufactured interactions.

These devices offer "false empathy" by smoothing over conflicts with canned responses, potentially confusing children's understanding of healthy relationships.

Fairplay said AI toys monopolize attention, crowding out imaginative, child-led play essential for creativity, self-regulation, and resilience.

They also capture family conversations and intimate moments through always-on audio, video, and gesture recognition, allowing companies to sell data for targeted advertising or addictive upgrades.

No specific toys were named, but the advisory stressed that the same systems implicated in harmful experiences for teens, such as chatbots encouraging self-harm, are now aimed at the youngest users, who are least able to protect themselves.

Both organizations recommend avoiding AI toys when possible and urge lawmakers to impose stricter regulations on data collection and content safeguards.

Jim Mishler

Jim Mishler, a seasoned reporter, anchor and news director, has decades of experience covering crime, politics and environmental issues.

© 2025 Newsmax. All rights reserved.

