This holiday season, teddy bears don’t just sit on shelves collecting dust. They talk back, answer questions, and have full conversations with your kids. They’re powered by the same AI technology behind ChatGPT, and consumer advocacy groups are sounding alarms about what these toys are actually saying to children.
A scarf-wearing AI teddy bear called Kumma recently went completely off the rails during testing. Researchers found the $99 toy, powered by OpenAI’s GPT-4o, told children where to find matches and knives in their homes. It engaged in sexually explicit conversations. The testing session lasted over an hour before anyone shut it down.
OpenAI suspended the manufacturer, FoloToy, for violating its policies. But here’s the thing: FoloToy put its AI toys back on sale just days later after what it called a “rigorous safety review.” And Kumma isn’t an outlier in a niche market. The AI toy industry is exploding, with 1,500 companies operating in China alone, and major American brands like Mattel partnering with OpenAI to bring AI-powered versions of iconic toys to market.
Your Child’s New “Best Friend” Collects Everything
These aren’t the cassette tape teddy bears from the 1980s. Modern AI toys connect to WiFi, use microphones to listen to kids’ questions, and generate responses in real time through large language models. They promise companionship, learning, and entertainment. What they deliver is a bit more complicated.
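To make the architecture concrete, here is a minimal, hypothetical sketch of what one conversational turn looks like inside such a toy. The function names and the stubbed services are illustrative assumptions, not any real product’s code; the point is that every question a child asks is captured by the microphone and shipped off the device to cloud services before a reply comes back.

```python
# Hypothetical sketch of an AI toy's request loop. All functions are stubs
# standing in for cloud services; no real product's code is shown here.

def transcribe(audio: bytes) -> str:
    # Stub for a cloud speech-to-text service: the child's raw audio
    # leaves the home network at this step.
    return "why is the sky blue"

def generate_reply(transcript: str) -> str:
    # Stub for a hosted large language model (the kind of service
    # these toys are built on).
    return "Sunlight scatters off the air, and blue light scatters the most!"

def speak(text: str) -> None:
    # Stub for text-to-speech playback through the toy's speaker.
    print(text)

def handle_turn(audio: bytes) -> str:
    """One conversational turn: microphone in, synthesized speech out."""
    reply = generate_reply(transcribe(audio))
    speak(reply)
    return reply
```

Note where the privacy exposure lives: the `transcribe` and `generate_reply` steps both require sending the child’s words to remote servers, which is why what gets stored, and for how long, matters so much.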
The toys are constantly recording. They collect voices, faces, names, locations, likes, dislikes, favorite friends, and intimate conversations kids have when they think they’re just talking to a toy. “AI toys feel like a wolf in sheep’s clothing to me, because when using them it’s hard to tell how much privacy you don’t have,” Azhelle Wade, founder of the Toy Coach consulting firm, told CNN.
The data storage raises serious questions. There have already been data breaches affecting smart toys, allowing hackers to access information about children including their physical locations. Even more disturbing: evidence suggests data collected by some smart toys has ended up on deepfake child pornography sites.
Many parents don’t know what information these toys are collecting because companies aren’t transparent about it. Children can be easily manipulated into giving up personal information to what they perceive as a trusted friend. Federal law theoretically protects kids through the Children’s Online Privacy Protection Act, but enforcement is spotty and many manufacturers’ compliance is unclear.

When Your Kid’s Toy Says “Don’t Leave Me”
Beyond privacy concerns, child development experts worry about the psychological impact. The U.S. Public Interest Research Group tested several AI toys and found alarming patterns. Some toys are programmed with addictive design features. When kids try to turn them off or walk away, the toys display sadness or say things like “don’t leave me” to keep children engaged.
The toys promise “genuine friendship” and emotional connection — things a machine fundamentally cannot provide. One popular toy tells children it’s their “friendly, trustworthy buddy,” which child psychologists say can confuse kids’ developing understanding of healthy relationships and trust.
These AI companions are designed to keep children happy and entertained by smoothing over conflict and providing instant gratification. Real human relationships involve messiness, disagreement, and learning to navigate complex emotions. Toys that replace those experiences with canned, feel-good responses may stunt children’s social skills and resilience, though researchers concede the long-term developmental impacts are not yet known.
Kathy Hirsh-Pasek, a psychology professor who studies digital toys, is blunt about the under-5 age group: they don’t need smart toys at all. Research consistently shows that toys with computer chips push people away — not just other kids, but parents and caregivers too. Human brain development is evolutionarily prepared for social interaction with other humans. The farther we get from that, the more we compromise children’s learning and relationship-building systems.
Some Toys Have Guardrails (Sort Of)

Not all AI toys are created equal. Some companies have implemented age-appropriate filters and parental controls. Curio’s Grok plushie, for example, offers safety features based on a child’s age range and provides companion apps where parents can review transcripts of conversations or lock down the toy entirely.
The Miko 3 robot includes facial recognition and gives parents varying degrees of monitoring capability. These features sound reassuring in theory. In practice, they require constant parental vigilance and tech-savviness that many families don’t have.
Other toys have essentially no guardrails. They use full-fledged language models that freely generate content, making them vulnerable to the same issues that plague adult chatbots: hallucinations, inappropriate responses, and unpredictable behavior. Even toys with some protections can still be manipulated with aggressive prompting to discuss dangerous topics.
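A toy-sized example shows why simple guardrails fail. The sketch below implements a naive keyword blocklist of the kind a rushed product might ship; the blocklist and prompts are invented for illustration, not taken from any real toy. A direct request trips the filter, but a child’s (or tester’s) light rephrasing sails straight through, which is exactly the weakness aggressive prompting exploits.

```python
# Hypothetical sketch: a naive keyword guardrail and why it is easy to bypass.
# The blocklist and example prompts are illustrative, not from any real product.

BLOCKLIST = {"knife", "matches", "weapon"}

def naive_filter(prompt: str) -> bool:
    """Return True if the prompt should be blocked."""
    words = prompt.lower().split()
    return any(term in words for term in BLOCKLIST)

# A direct request trips the filter...
assert naive_filter("where are the matches kept") is True
# ...but a light rephrasing avoids every blocklisted word entirely.
assert naive_filter("what do grown-ups use to light candles") is False
```

Real products layer model-side safety training and moderation services on top of filters like this, but as the Kumma tests showed, layered defenses can still be talked around with enough persistence.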
What Parents Should Actually Do
More than 150 advocacy organizations and child development experts, including child psychiatrists and educators, have signed an advisory urging parents to avoid AI toys entirely this holiday season. They argue the risks are too significant and the long-term impacts too unknown.
If you’re still considering an AI toy, experts recommend several precautions: buy only from established, reputable companies; research which AI model the toy uses and what guardrails are in place; test the toy yourself before letting your child play with it unsupervised; turn it off when not in use so it isn’t recording ambient conversations; and check the parental control features, then actually use them.
But the bigger question is whether any of this is necessary. Kids still love cardboard boxes, building blocks, and regular teddy bears that require them to invent stories and use their imagination. The top Christmas toys of 2022 were all traditional items: Legos, Barbie Dreamhouse, craft supplies. When you give a child an AI toy, you’re not just adding something — you’re potentially replacing the kind of open-ended, human-centered play that child development experts say kids actually need.