Is AI Therapy Effective for Kids and Teens?
Learn what AI therapy is for kids and teens, its benefits, risks, safety concerns, and how human-led care compares to AI tools in supporting youth mental health.


Imagine your child opening up about their difficult emotions — not to a therapist, but to a chatbot. With the rise of artificial intelligence (AI), this is a reality for many families. But can AI truly understand a child’s emotions? Is AI therapy even therapy at all?
With nearly one in five children experiencing a mental or behavioral health condition, the need for professional support is growing. Yet 45.8% of children with a mental health disorder don’t get the help they need when they need it.
Many believe that AI is opening the door to greater accessibility of mental health care, but a key question remains: is AI therapy effective, and safe, for kids and teens? Although Emora Health does not offer AI therapy, we believe it’s important to educate parents and families about what’s out there.
Read on to learn what AI therapy is and what the evidence says about its efficacy and safety.
Key takeaways
- Despite its promises, AI therapy raises serious ethical and safety concerns, and it requires human oversight and strict regulations.
- AI therapy has the potential to bridge gaps in accessibility for young people who need free and on-demand support.
- Some studies show promise for mild symptoms and skill-building. However, robust data on children and long-term outcomes is limited.
What is AI therapy?
When people talk about AI therapy, they could be referring to various things. People may use mental health-specific therapy apps and chatbots, or they might simply use general large language model (LLM) platforms like ChatGPT, Gemini, or Claude as a therapist.
Whether an individual is using an app, a chatbot, or an LLM, they are not interacting with a licensed human therapist, even though it might feel that way. Rather, AI’s algorithms have been trained to recognize emotions, respond conversationally, and guide users through evidence-based techniques used by real-life therapists.
For kids and teens who spend a lot of time online, the idea of AI therapy is appealing. AI “therapists” are available 24/7, respond instantly, and are typically free (or low-cost). They can offer check-ins, track moods, and even suggest coping techniques to reduce anxiety or deal with negative thought patterns.
AI can provide immediate responses, track emotional patterns, and offer reminders, while human therapists supply empathy, interpretation, and real-life understanding. Together, they can create a hybrid model that keeps children connected and supported. The goal should be to make technology a safety net — expanding access and support — while keeping genuine human connection at the heart of healing.
However, even when using a hybrid model, it’s crucial to keep a few things in mind.
- It’s hard to know who or what is behind an AI therapist’s guidance.
- The “echo chamber” phenomenon: AI can learn your child’s preferences and adapt its responses accordingly. In effect, when you use AI, you’re often training it to tell you what you want to hear.
- While many people may be more willing to open up to a chatbot than a real therapist, the benefits aren’t the same as opening up to a human.
Clinical evidence on the effectiveness of AI therapy
AI therapy is relatively new, but it has already been studied and remains the subject of ongoing research, especially regarding its use in adults.
Research on AI therapy for children and teens is still in its infancy, and the work we have offers both promise and cause for concern.
A 2025 meta-analysis found that AI-driven conversational agents could reduce symptoms of depression in adults, especially mild, sub-clinical symptoms. The researchers found these agents to be less effective at treating anxiety disorders than depression. However, the underlying trials were small and short-term.
However, another recent study found that when presented with made-up scenarios, 32% of chatbots endorsed problematic proposals from fictitious teens. This outcome raises concerns about how helpful AI actually is for therapy, and whether AI therapy can lead to dangerous outcomes.
Expert opinions and guidelines
This year, the American Psychological Association (APA) met with federal regulators to discuss ways to safeguard children from AI chatbots posing as therapists.
The meeting followed lawsuits filed against Character.AI by the parents of two children. The parents allege that both children interacted with chatbots pretending to be therapists; one child died by suicide, and the other attacked his parents.
According to Vaile Wright, PhD, APA’s senior director of health care innovation, “We can’t stop people from [using AI as a therapist], but we want consumers to know the risks when they use chatbots for mental and behavioral health that were not created for that purpose.”
Changes may be coming through the APA and the FDA to regulate the use of AI chatbots for therapy. Until then, children should not rely on chatbots alone for their mental health needs.
Limitations and risks
AI therapy lacks emotional depth, creating an empathy gap. And because AI relies on patterns in the data it was trained on, it often misses nuance.
Relying on AI instead of traditional therapy has shown clear drawbacks. Chatbots are not FDA-approved to treat mental illness, and many of these chatbots and apps were originally designed for adults, not children.
Since children's and teens’ brains are still developing, the APA notes that this makes “additional safeguards” even more crucial for youth using AI therapy. Not to mention, it’s possible that this type of AI use could reinforce digital isolation instead of human connection.
While AI therapy could one day be revolutionary, it needs to be regulated for safety and reliability. Children and teens are already vulnerable to their experiences online, and AI therapy certainly poses risks.
For example, in one tragic case, a teenager who sought emotional support from an AI chatbot died by suicide. This heartbreaking incident highlights the dangers of using unregulated AI tools for mental health without proper safeguards.
Privacy and data protection are also issues at this point, since these tools constantly store information. While that practice tailors your online experience and “trains” the bots, it is concerning for children’s privacy.

Why clinicians matter and when AI can’t replace humans
While AI tools can be helpful for some forms of immediate support, building self-awareness, or reinforcing coping skills, they should not replace sessions with a real-life therapist, especially for children or teens dealing with serious emotional challenges.
Licensed therapists are trained professionals with years of education and experience. AI platforms, apps, and chatbots use algorithms trained on vast amounts of internet data to mimic the behavior of a therapist. Sure, there’s plenty of good knowledge to be gained from the internet, but a chatbot will never have the empathy of a human.
Efficacy and empathy
Decades of research have found human-led therapy to be an effective treatment for most — if not all — mental health conditions. Human-led care is at the heart of treating mental illness, but more research needs to be done on the efficacy of AI therapy.
Ethics of AI therapy
Traditional therapy also benefits from professional oversight and established ethical standards, ensuring safety and accountability. AI chatbots, on the other hand, currently operate in a largely unregulated environment. Your child’s conversations may be stored, and it isn’t clear where they go or who gets to see them. This has led to growing concerns about the quality and safety of mental health advice provided by AI systems.
Serious mental health concerns
AI therapy has limitations and should not be used to manage mental health emergencies (such as suicidal thoughts or urges to self-harm). Additionally, AI is not suitable for treating serious mental health struggles such as:
- Psychotic disorders (such as schizophrenia or schizoaffective disorder)
- Eating disorders
- Substance use disorders
- Post-traumatic stress disorder (PTSD)
- Severe bipolar disorder
- Suicidality
- Self-harm
- Personality disorders
Using AI to complement traditional therapy — not replace it
Currently, studies show some merit to AI therapy for handling mild mental health concerns, but research is ongoing. Studies are especially limited when considering children’s mental health and the long-term effects of using AI therapy or AI chatbots to help with mental health.
While chatbots may try to mimic humans, empathy is something only a human can possess, and it’s a crucial part of mental health care.
How Emora Health uses AI safely to improve therapy
Children need humans at the helm of their care, and parents must be mindful of this when choosing online mental health resources. When AI is used to enhance, rather than replace, therapy, many concerns begin to ease.
At Emora Health, we use AI to enhance the human connection in therapy by giving clinicians more time to focus on connecting with their clients. Emora clinicians may use AI to assist with administrative work, such as dealing with insurance, generating billing codes, or ensuring compliant note-taking. Research shows that therapists spend more time than you might think on administrative tasks, which significantly contributes to burnout in the mental healthcare space.
When the administrative workload is lightened, clinicians have more time, energy, and brainpower to devote to clients. Ultimately, this helps elevate human care.
While in session with an Emora clinician, a child will not use any AI tools. When they’re in therapy, they’re focused and learning without distractions. However, we can’t deny that kids are using AI outside of sessions, and a human mental health professional can offer guidance on when using AI for mental health is safer and when it isn’t. For example, they might let your child know that they can turn to AI for relaxation-technique ideas when they’re feeling stressed, but they shouldn’t turn to AI when they’re in crisis.

How Emora Health can support children’s mental health
If your child is struggling with their mental health, Emora Health can help.
We are a virtual therapy platform, and we select providers who specialize in working with children and adolescents. Our providers have a track record of helping children, teens, and families achieve meaningful improvement.
Our licensed clinicians use evidence-based approaches to treat mental health conditions, help kids learn coping skills, regulate their emotions, and more. All sessions are virtual, so your child can receive therapy in the comfort of their home, where they feel safe.
Schedule an assessment today for your child to start care within 48 hours.
Frequently Asked Questions
Can AI replace human therapists?
No, AI cannot replace human therapists. AI can supplement a therapist’s work by relieving them of administrative tasks so they can spend more time with their clients, but it shouldn’t be seen as a replacement.
What can AI therapy help with?
AI may help with mild to moderate symptoms of depression, anxiety, and sleep disturbances. It is not a substitute for professional care and is not recommended for severe depression or for conditions like eating disorders or bipolar disorder.
Does AI therapy actually work?
Some users report symptom reduction, improved self-awareness, and skill development, though progress may plateau without human guidance.
Sources
- Arias, D., Saxena, S., & Verguet, S. (2022). Quantifying the global burden of mental disorders and their economic value. eClinicalMedicine, 54, 101675. https://doi.org/10.1016/j.eclinm.2022.101675
- Clark, A. (2025). The Ability of AI Therapy Bots to Set Limits With Distressed Adolescents: Simulation-Based Comparison Study. JMIR Mental Health, 12, e78414. https://doi.org/10.2196/78414
- Farzan, M., Ebrahimi, H., Pourali, M., & Sabeti, F. (2025). Artificial Intelligence-Powered Cognitive Behavioral Therapy Chatbots, a Systematic Review. Iranian Journal of Psychiatry, 20(1), 102–110. https://doi.org/10.18502/ijps.v20i1.17395
- Flinn, L. (2024, January 25). AI, data privacy and you. University of North Carolina Information Technology Services. https://its.unc.edu/2024/01/25/ai-data-privacy-and-you/
- Muppalla, S. K., Vuppalapati, S., Reddy Pulliahgaru, A., & Sreenivasulu, H. (2023). Effects of Excessive Screen Time on Child Development: An Updated Review and Strategies for Management. Cureus, 15(6), e40608. https://doi.org/10.7759/cureus.40608
- Nwosu, A., Boardman, S., Husain, M. M., & Doraiswamy, P. M. (2022). Digital therapeutics for mental health: Is attrition the Achilles heel? Frontiers in Psychiatry, 13, 900615. https://doi.org/10.3389/fpsyt.2022.900615
- Pham, K. T., Nabizadeh, A., & Selek, S. (2022). Artificial intelligence and chatbots in psychiatry. Psychiatric Quarterly, 93(1), 249-253. https://doi.org/10.1007/s11126-022-09973-8
- World Health Organization. (2025, September 2). Over a billion people living with mental health conditions – Services require urgent scale-up. https://www.who.int/news/item/02-09-2025-over-a-billion-people-living-with-mental-health-conditions-services-require-urgent-scale-up