Teens regularly chat with AI companions, survey finds

Jul 16, 2025 - 16:01
Teen girl looks at laptop computer while lying on her bed.

Artificial intelligence companions have gone mainstream amongst teens, according to a new report.

The findings may surprise parents who are familiar with AI chatbot products like OpenAI's ChatGPT and Google's Gemini but haven't heard of platforms built specifically for users to form friendships and romantic relationships with so-called AI companions.

The latter category includes products like Replika, Nomi, Talkie, and Character.AI. Some of these platforms are intended for users 18 and older, though teens may lie about their age to gain access.

A nationally representative survey of 1,060 teens ages 13 to 17 conducted this spring by Common Sense Media, an advocacy and research nonprofit in the U.S., found that 52 percent of respondents regularly use AI companions. Only 28 percent of the teens surveyed had never used one.

Teens don't yet appear to be replacing human relationships "wholesale" with AI companions, said Michael Robb, head of research at Common Sense Media. The majority are still spending more time with human friends and still find person-to-person conversations more satisfying.

But Robb added that there's reason for caution: "If you look, there are some concerning patterns beneath the surface."

How teens use AI companions

A third of teens said they engaged with AI companions for social interactions and relationships, doing things like role-playing and practicing conversations. They also sought emotional support, friendship, and romantic interactions.

In the survey, teens ranked entertainment and curiosity as their top reasons for using an AI companion. Yet a third of those who use AI companions have chosen to discuss important or serious matters with them instead of with a real person, a tendency Robb said points to the potential downsides of AI companion use.

Though some AI companion platforms market their products as an antidote to loneliness or isolation, Robb said the technology should not replace human interaction for teens. Yet without conclusive evidence of what happens to teens (and adults) who come to rely on AI companions for vital connection, technology companies may keep leaning into the idea that using their product is better than feeling alone.

"They're happy to fill that gap of knowledge with a hope and a prayer," Robb said.

He also suspects that, as with social media, some youth may benefit from practicing certain social skills with an AI companion, while other young users may be more susceptible to a negative feedback loop that leaves them lonelier, more anxious, and less likely to build offline relationships.

A new report from Internet Matters, a London-based online youth safety nonprofit, suggests that's already happening amongst children in the United Kingdom who use AI companions.

Children defined as vulnerable because they have special educational needs or disabilities, or a physical or mental health condition, are especially likely to use AI companions for connection and comfort, according to survey data collected by Internet Matters.

Nearly a quarter of vulnerable children in the survey reported using general AI chatbots because they had no one else to talk to. These children were not only more likely to use chatbots; they were also nearly three times as likely to engage with companion-style AI chatbots.

The report warned that as children begin to use AI chatbots as companions, "the line between real and simulated connection can blur." That may lead to more time spent online.

Earlier this year, Common Sense Media described AI companions as unsafe for anyone under 18. Robb said tech companies should put robust age assurance measures in place to prevent underage users from accessing AI companion platforms.

Red flags for parents

Parents concerned about their teen's AI companion use should look for the following red flags, Robb said:

  • Behavior indicating that the teen is replacing human relationships with AI relationships.

  • Excessive time spent on AI companion platforms, especially when it displaces sleep, exercise, or in-person socializing.

  • Emotional outbursts when denied access to an AI companion.

Robb also suggested that parents discuss AI companion use with their teens, along with any concerns either party may have. Those concerns could include disturbing statements or responses an AI companion might make, as well as a teen sharing personal information such as their real name, location, or personal secrets.

A quarter of AI companion users surveyed by Common Sense Media said they'd shared sensitive information with their companion. Robb said it's important for teens to understand that once shared, those personal details are often treated as proprietary data owned by the companion platform.

Even when it's anonymized, that information may be used to train the company's large language model. It could potentially surface in marketing copy or conversation scenarios. In the worst case, personal data could be hacked or leaked.

For example, as Mashable's Anna Iovine reported, 160,000 screenshots of direct messages between an AI "wingman" app and its users were recently leaked through an unprotected Google Cloud Storage bucket owned by the app's developer.

Robb encourages parents to set boundaries around AI use for their children, such as prohibiting specific platforms or the sharing of certain personal details.

"It's totally fine for a parent to have rules about AI, like the way they do with other types of screen uses," Robb said. "What are your own red lines as a parent?"