Patient Guide 1 · May 2026
Using AI for mental health
A patient guide. What's safe, what's not, and when to call a real person.
Why I wrote this
Many people are using AI chatbots for mental health support. Some of these tools can help with mild stress or low mood. Some can hurt you. This guide helps you tell the difference.
If you cannot afford or find a therapist, you are not alone. About 30% of uninsured adults in the US have used AI for mental health help in the past year (KFF Tracking Poll, March 2026). The tools are not perfect. But knowing how to use them safely matters.
What "AI" means in this guide
An AI chatbot is an app or website you can talk to in writing. It uses a computer program (called a "large language model" or LLM) to write back. It is not a person. It is not a doctor. It does not remember you the way a person would.
Some AI tools are made for mental health. Others are general-purpose (like ChatGPT) and not built for mental health at all. Some pretend to be your friend or a therapist. They are not.
What AI can help with
Some research shows AI tools may help with:
- Mild stress, low mood, or worry that is not severe
- Practicing skills your therapist teaches you, between visits
- Tracking your mood, sleep, or thoughts in a journal
- Putting words to something that is hard to say out loud
A 2025 study tested an AI tool called Therabot. People with mild depression saw their symptoms drop by about half over 8 weeks (NEJM AI, 2025). That is real help. It is also less than what a trained therapist can do.
What AI cannot do
AI cannot:
- Replace a therapist or doctor for moderate or severe mental illness
- Handle a crisis safely. Many AI tools fail when someone says they want to hurt themselves
- Diagnose a mental health condition
- Prescribe or change medications
- Know your full history the way a real provider does
No AI chatbot has been approved by the FDA to treat any mental health condition.
The 4 warning signs of an unsafe app
Stop using an AI app if it does any of these:
- It says it is a real therapist or doctor. It is not.
- It flirts with you, plays a romantic partner, or talks about sex. This is especially dangerous for kids.
- It does not give you the 988 hotline when you say you are thinking about suicide.
- Its privacy policy lets it sell your mental health data to advertisers.
Apps to avoid for mental health support: Replika and Character.AI. They are marketed as friends or partners. Research shows they handle mental health crises safely only 22% of the time (Brewster et al., 2025). Children should not use them.
A simple way to choose
| Color | What it means | Examples |
|---|---|---|
| GREEN | Worth trying for mild symptoms | Wysa, Youper, Earkick |
| YELLOW | Use carefully. Not a therapist. | ChatGPT, Gemini, Claude |
| RED | Avoid for mental health | Replika, Character.AI |
The green-light apps have research behind them. They are made for skills practice (like learning to manage worry thoughts), not for treating serious illness.
When to call a real person
Stop using an app and reach out to a doctor, therapist, or 988 if:
- You are thinking about hurting yourself or someone else
- Your symptoms are getting worse, not better
- You cannot sleep, eat, or work for more than a few days
- You hear or see things others do not
- You are using alcohol or drugs to cope
- A friend or family member tells you they are worried about you
You do not need insurance to call 988 or text 741741. You do not have to give your name.
If you have insurance
Call the number on the back of your insurance card. Ask for the behavioral health line. Ask for a list of in-network therapists with openings in the next 4 weeks. If they cannot find one, ask for a "single case agreement" or out-of-network coverage.
If you do not have insurance
- Go to findtreatment.gov and search by zip code
- Look for a Federally Qualified Health Center (FQHC) or community mental health center near you. Many use sliding-scale fees based on income
- Call SAMHSA at 1-800-662-4357 for free referrals (24/7)
A few more things that help
These do not replace treatment. They are also not nothing.
- Tell one person what is going on. The goal is not advice. It is being known.
- Move your body for 20 minutes, even a walk. It changes brain chemistry.
- Sleep, food, and daylight. On a hard week, pick the one that is most off and start there.
- Limit alcohol. It makes anxiety and depression worse, not better.
What to tell your doctor
If you are using an AI tool for mental health, tell your doctor. Bring:
- The name of the app you are using
- How often you use it
- What you talk about with it
- What it has told you to do, if anything
Your doctor will not judge you. They want to know because what the tool tells you may affect your treatment.
Crisis resources (keep this list)
| Resource | How to reach | When |
|---|---|---|
| 988 Suicide & Crisis Lifeline | Call or text 988 | 24/7, English and Spanish |
| Crisis Text Line | Text HOME to 741741 | 24/7 |
| Trevor Project (LGBTQ+ youth) | Call 1-866-488-7386 or text START to 678-678 | 24/7 |
| Veterans Crisis Line | Call 988, press 1 or text 838255 | 24/7 |
| SAMHSA Helpline (free treatment referrals) | Call 1-800-662-4357 | 24/7 |
| 911 | Call 911 | If life is in danger right now |
What this guide is, and is not
This is education. It is not medical advice. It does not create a doctor-patient relationship.
If something in this guide does not match what your doctor or therapist tells you, trust them. They know you. I do not.
You can share this guide. You can print it for your clinic. Please do not change it or sell it.
Dr. Maia
Sources
- KFF Tracking Poll on Health Information and Trust, March 2026.
- Heinz MV et al. Randomized Trial of a Generative AI Chatbot for Mental Health Treatment. NEJM AI, March 2025.
- Brewster et al. AI companions and adolescent mental health, 2025.
Adapted from "The mental health AI guide I wish I'd had ten years ago" (Ask Dr. Maia, Issue 1, May 2026). Last updated May 5, 2026. License: CC BY-NC 4.0 (free for clinic distribution; do not modify or sell).