As parents scurry to complete Christmas shopping this year, children’s and consumer advocacy groups are warning they should avoid artificial intelligence toys, citing safety concerns.
Fairplay, a nonprofit children’s safety organization, issued an advisory last month urging gift givers to avoid buying AI toys for children this holiday season. The group says that the toys, which can be found in a range of plushies, dolls, action figures, and kids’ robots, are generally powered by an AI model and have been shown to harm children and teenagers.
“The serious harms that AI chatbots have inflicted on children are well-documented, including fostering obsessive use, having explicit sexual conversations, and encouraging unsafe behaviors, violence against others, and self-harm,” Fairplay said.
According to a consumer alert report by the Public Interest Research Group (PIRG), a test on four AI toys found that “some of these toys will talk in-depth about sexually explicit topics, will offer advice on where a child can find matches or knives, act dismayed when you say you have to leave, and have limited or no parental controls.”
The “Trouble in Toyland” report also found privacy concerns with owning these types of toys, as they can collect data through methods such as facial recognition scans and recordings of a child’s voice.
“They’re collecting their names, their dates of birth. All kinds of information — the kid’s likes, dislikes, favorite toys, favorite friends,” Teresa Murray, co-author of the PIRG report and director of its consumer watchdog program, told NPR.
“They’re connected to the internet, so anything is available. Who knows what those toys might start talking to your children about with their friends or their friends’ parents or your neighborhood? I mean, it’s terrifying,” she said.
More than 150 organizations and individual experts signed the advisory issued by Fairplay, voicing concerns that the toys prey on children’s trust, disrupt children’s relationships, and hinder kids’ creative and learning activities.
“What’s different about young children is that their brains are being wired for the first time, and developmentally it is natural for them to be trustful, for them to seek relationships with kind and friendly characters,” said Rachel Franz, director of Fairplay’s Young Children Thrive Offline Program.
According to Common Sense Media, 72 percent of America’s teenagers say they have used chatbots as companions. And nearly one in eight have sought emotional or mental health support from them.
Dr. Anna Ord, dean of Regent University’s School of Psychology, told CBN News recently that children and teens can easily become victims of technology.
“If a child asks a question about self-harm or something from an adult, adults can discern and not go that route,” Ord explained. “But the chatbots are built to please, they’re built to be user-friendly. So they will produce content that the person asks for without a filter or thinking about this: ‘Is this the right thing to do?'”
Fairplay, a 25-year-old organization formerly known as the Campaign for a Commercial-Free Childhood, also says many of these AI-powered toys are being released with “no regulation and no research,” which is another reason parents should pause.
OpenAI, the maker of ChatGPT, suspended the maker of the AI-powered teddy bear Kumma last month after a PIRG report found that the toy was providing details about how to find and ignite matches and talked in-depth about sexual matters, NPR reports.
“We suspended this developer [the Singapore-based toymaker FoloToy] for violating our policies,” OpenAI spokesperson Gaby Raila told the outlet. “Our usage policies prohibit any use of our services to exploit, endanger, or sexualize anyone under 18 years old. These rules apply to every developer using our API, and we monitor and enforce them to ensure our services are not used to harm minors.”
Toy makers and AI companies are responding to concerns, contending that they are focusing on safety and privacy.
Curio Interactive, which makes stuffed toys such as the rocket-shaped Gabbo, says it has “meticulously designed” guardrails to protect children, and the company encourages parents to “monitor conversations, track insights, and choose the controls that work best for their family.”
Another company, Miko, has announced it is moving away from ChatGPT as its language model and is instead using its own conversational AI model, which it considers safer for children.
“We are always expanding our internal testing, strengthening our filters, and introducing new capabilities that detect and block sensitive or unexpected topics,” said CEO Sneh Vaswani.
Meanwhile, Dr. Dana Suskind, a pediatric surgeon and social scientist who studies early brain development, told AP that the best toys for children are the ones that allow them to do 90 percent of the work, allowing them to lead in imaginative play.
“Kids need lots of real human interaction. Play should support that, not take its place. The biggest thing to consider isn’t only what the toy does; it’s what it replaces,” she explained.
“A simple block set or a teddy bear that doesn’t talk back forces a child to invent stories, experiment, and work through problems. AI toys often do that thinking for them,” Suskind added. “Here’s the brutal irony: when parents ask me how to prepare their child for an AI world, unlimited AI access is actually the worst preparation possible.”