Teens Are Flocking to AI Chatbots. Is This Healthy?
Relationships are messy, whether you are an adult with lots of experience or a kid navigating tough times with a best friend, boyfriend or girlfriend. You can’t predict moods, interests or desires. For teens learning the ins and outs of relationships for the first time, disagreements, fights and breakups can be crushing.
But what if your teen’s best friend wasn’t actually human? It may seem far-fetched, but it’s not. A new report from Common Sense Media says that 72 percent of teens surveyed have used AI companions, and 33 percent have relationships or friendships with these chatbots.
The language that AI companions use, the responses they give, and the empathy they exude can make users feel truly understood and sympathized with. These chatbots can make someone feel liked or even loved; they are programmed to help users feel they’ve made a real connection. And adolescents have a naturally developing fascination with romance and sexuality. If you feel ignored by the girls at your high school, the nearest screen now offers a hot girlfriend who is endlessly fascinated by you and your video games, or a super cute boyfriend you never had to make small talk with to form a bond.
This may be perplexing to some parents, but if your child is navigating the complex worlds of technology, social media and artificial intelligence, the likelihood they will be curious about an AI companion is pretty high. Here’s what you need to know to help them.
Chatbots have been around for a long time. In 1966 an MIT professor named Joseph Weizenbaum created the first chatbot, named ELIZA. Today AI and natural language processing have sprinted far past ELIZA. You probably have heard of ChatGPT. But some of the common companion AI platforms are ones you might not be familiar with: Replika, Character.AI and My AI are just a few. In 2024 Mozilla counted more than 100 million downloads of a group of chatbot apps. Some apps set 18 as a minimum age requirement, but it’s easy for a younger teen to get around that.
You might think your kid won’t get attached, that they will know this chatbot is just an algorithm designed to generate responses from the text it receives; that it’s not “real.” But a fascinating Stanford University study of students who use the app Replika found that 81 percent considered their AI companion to have “intelligence,” and 90 percent thought it “human-like.”
On the plus side, these companions are sometimes touted for their supportiveness and promotion of mental health; the Stanford study even found that 3 percent of users felt their Replika had directly helped them avoid suicide. If you’re a teenager who is marginalized, isolated or struggling to make friends, an AI companion can provide much-needed companionship. They may offer practice when it comes to building conversational and social skills. Chatbots can offer helpful information and tips.
But the risks are serious. A Florida mother has sued the company behind Character.AI, alleging that the chatbot formed an obsessive relationship with her 14-year-old son, Sewell Setzer III, and ultimately encouraged the suicide that took his life. Another suit, filed in 2024, alleges that the same chatbot encouraged self-harm in teens and violence toward parents who tried to limit how often their kids used the app.
Then there’s privacy: Wired, drawing on Mozilla’s research, labeled AI companions a “privacy nightmare.” Many are loaded with data trackers, and some are designed to manipulate users into believing a chatbot is their soulmate, which can encourage negative or harmful behaviors.
Given what we know about teens, screens and mental health, these online influences can be powerful, largely unavoidable, and potentially life-changing for children and families.