How AI chatbots are changing the way young people seek mental health support

Young people are turning to AI for mental health support. Here’s what they have to say.

The use of artificial intelligence chatbots has surged since ChatGPT’s launch in November 2022. Today, over a billion people use AI tools each month. Young people in particular are some of AI’s most frequent users: Roughly one in four adults under 30 now turn to AI chatbots for health information at least once a month. 

To better understand this trend, Infodemiology.com partnered with the New Jersey-based nonprofit organization Partners in Prevention to examine how young people use AI chatbots to learn about and discuss mental health. Eleven interviews with youth aged 15 to 22 reveal how AI chatbots are shaping young people’s perceptions, decisions, and approaches to mental well-being.

AI chatbots are emerging tools for mental health support.

AI chatbots use natural language processing and large language models to engage with users in human-like conversation. Designed to simulate natural dialogue and adapt to user input, these tools generate personalized responses that can feel authentic and emotionally attuned. As a result, some users have begun turning to chatbots not only for information, but also for social interaction, advice, and emotional support.

A recent report found that nearly three-quarters of teens aged 13 to 17 reported using AI “companions,” chatbot characters designed to mimic personal relationships. Among them, one-third use these tools for social interaction or conversation, and more than one in ten use them specifically for mental health support. 

Mental health professionals, however, have raised concerns about this growing reliance on AI for unvetted emotional or therapeutic advice. 

“People are starting to blur the lines a bit between social interaction and computer programs, and I do worry about that,” said Jennifer Katzenstein, a child psychologist at Johns Hopkins, in a recent interview.

Young people discuss the use of AI chatbots for mental health information and support.

Takeaway #1: Youth feel mental health is more normalized, but still challenging to navigate.

In interviews, young people described their generation as more open and self-aware about mental health than previous generations. Feelings of anxiety, depression, and stress are common topics among peers, often shared through jokes or memes that make difficult emotions like burnout or isolation easier to discuss. 

Yet openness doesn’t always translate into seeking help. Some interviewees said they hesitate to ask for support, worrying that their feelings are not serious enough or might be dismissed as “just part of being a teen.” 

Others feared being seen as dramatic, weak, or burdensome. These conflicting states of normalization and vulnerability mean that many young people continue to manage their struggles privately even as attitudes about mental health conversations progress.

“I feel like we still struggle with [mental health], because at least a lot of the people I know like to shoulder their mental health themselves and have this ‘I can handle it on my own’ mentality. Because of that, they shy away from discussing it with friends and family.”

Takeaway #2: AI chatbots serve as first-line emotional support for some young people.

Many young people reported turning to AI chatbots as a first step to explore their feelings and find ways to cope with them. They cited privacy, accessibility, and a lack of judgment as key reasons these tools felt safe to use, especially compared with opening up to peers and adults. Several interviewees said the experience resembled speaking to a counselor or trusted loved one. 

Chatbots were also seen as less overwhelming than traditional search engines, which can return hundreds of results of mixed quality. By contrast, chatbots provide a single, tailored response, which interviewees found more personal and easier to process, especially in moments of emotional distress.

“I like asking ChatGPT because it almost feels like a person to talk to… I think it’s better than just Googling something and getting thousands of results. With ChatGPT, you get one entry that’s tailored to your exact question.”

Takeaway #3: Young people are using AI to fill gaps in mental health information.

Beyond emotional support, young people said they use chatbots as substitutes for other mental health resources. Several interviewees noted that chatbots were very effective at breaking down complex mental health topics into clear, actionable information—simplifying explanations, reframing concepts, and generating step-by-step guidance.

Some also relied on AI to translate or adapt information to their cultural or linguistic needs. One interviewee explained that a chatbot helped fill the gap when they struggled to find reliable mental health materials in Spanish.

“I can manipulate it however I want: expand on it, translate it into Spanish, explain it differently, or give me three actionable steps. You can break it apart in any way you want, and that’s why we like it. It’s like Google, but quicker and more flexible.”

For many, this flexibility and responsiveness made chatbots feel more relevant than static websites or long-form articles, which rarely reflect their lived experiences.

Takeaway #4: AI is useful. People still matter more.

Young people recognized AI’s usefulness, but they also agreed that these tools can’t replace human connection. Most said they would rather speak with a trusted adult or mental health professional about serious concerns. They emphasized the unique value of empathy, lived experience, and genuine understanding: qualities that no algorithm or chatbot can replicate.

Some interviewees also raised concerns about chatbot accuracy, highlighting instances of the tools giving advice that was too broad or even misleading. Others voiced concern about privacy and how AI platforms store or use their personal information.

“The more and more you engage with it, the more data it collects about you and the information you give it. It’s interesting to think about, but I personally would prefer just having a person to speak to.”

Overall, young people viewed AI chatbots as helpful complements—but not replacements—for professional care and human support.

AI chatbots are reshaping the mental health landscape—for better or worse.

As AI tools evolve, they are beginning to alter how people approach mental health care. Young people, in particular, are increasingly turning to AI tools as a discreet and accessible first line of support, especially when traditional resources feel out of reach. 

Yet the growing reliance on AI also brings new risks. While most youth recognize that AI cannot replace real human support, AI tools are a frequent source of health misinformation and may be actively harmful to people in the midst of a mental health crisis. According to one report, over one million users share suicidal thoughts with ChatGPT each week.

In June, the American Psychological Association issued a health advisory warning that “AI has made the process of discerning truth even more difficult.” The statement cautioned that misinformation or incomplete guidance “can lead to harmful behaviors, misdiagnoses, and delayed or incorrect treatment…which can have serious impacts on [adolescent] well-being.”