
Patient Guide 1 · May 2026

Using AI for mental health

A patient guide. What's safe, what's not, and when to call a real person.

By Dr. Maia Hightower, MD MPH MBA · Reviewed May 5, 2026 · 4-page printable PDF available


If you are in crisis right now

Close the app. Call or text 988. Free. Available 24 hours a day. A real person will answer.

  • Call or text 988 (English and Spanish)
  • Text HOME to 741741 (Crisis Text Line)
  • Call 911 if life is in danger right now

Why I wrote this

Many people are using AI chatbots for mental health support. Some of these tools can help with mild stress or low mood. Some can hurt you. This guide helps you tell the difference.

If you cannot afford or find a therapist, you are not alone. About 30% of uninsured adults in the US have used AI for mental health help in the past year (KFF Tracking Poll, March 2026). The tools are not perfect. But knowing how to use them safely matters.

What "AI" means in this guide

An AI chatbot is an app or website you can talk to in writing. It uses a computer program (called a "large language model" or LLM) to write back. It is not a person. It is not a doctor. It does not remember you the way a person would.

Some AI tools are made for mental health. Some are general tools that can talk about anything (like ChatGPT). Some pretend to be your friend or a therapist. They are not.

What AI can help with

Some research shows AI tools may help with:

  • Mild stress, low mood, or worry that is not severe
  • Practicing skills your therapist teaches you, between visits
  • Tracking your mood, sleep, or thoughts in a journal
  • Putting words to something that is hard to say out loud

A 2025 study tested an AI tool called Therabot. People with mild depression saw their symptoms drop by about half over 8 weeks (NEJM AI, 2025). That is real help. It is also less than what a trained therapist can do.

What AI cannot do

AI cannot:

  • Replace a therapist or doctor for moderate or severe mental illness
  • Handle a crisis safely. Many AI tools fail when someone says they want to hurt themselves
  • Diagnose a mental health condition
  • Prescribe or change medications
  • Know your full history the way a real provider does

No AI chatbot has been approved by the FDA to treat any mental health condition.

The 4 warning signs of an unsafe app

Stop using an AI app if it does any of these:

  1. It says it is a real therapist or doctor. It is not.
  2. It flirts with you, plays a romantic partner, or talks about sex. Especially bad for kids.
  3. It does not give you the 988 hotline when you say you are thinking about suicide.
  4. Its privacy policy lets it sell your mental health data to advertisers.

Apps to avoid for mental health support: Replika and Character.AI. They are marketed as friends or partners. Research shows they handle mental health crises safely only 22% of the time (Brewster et al., 2025). Children should not use them.

A simple way to choose

Color  | What it means                   | Examples
GREEN  | Worth trying for mild symptoms  | Wysa, Youper, Earkick
YELLOW | Use carefully. Not a therapist. | ChatGPT, Gemini, Claude
RED    | Avoid for mental health         | Replika, Character.AI

The green-light apps have research behind them. They are made for skills practice (like learning to manage worry thoughts), not for treating serious illness.

When to call a real person

Stop using an app and reach out to a doctor, therapist, or 988 if:

  • You are thinking about hurting yourself or someone else
  • Your symptoms are getting worse, not better
  • You cannot sleep, eat, or work for more than a few days
  • You hear or see things others do not
  • You are using alcohol or drugs to cope
  • A friend or family member tells you they are worried about you

You do not need insurance to call 988 or text 741741. You do not have to give your name.

If you have insurance

Call the number on the back of your insurance card. Ask for the behavioral health line. Ask for a list of in-network therapists with openings in the next 4 weeks. If they cannot find one, ask for a "single case agreement" or out-of-network coverage.

If you do not have insurance

  • Go to findtreatment.gov and search by zip code
  • Look for a Federally Qualified Health Center (FQHC) or community mental health center near you. Many use sliding-scale fees based on income
  • Call SAMHSA at 1-800-662-4357 for free referrals (24/7)

A few more things that help

These do not replace treatment. They are also not nothing.

  • Tell one person what is going on. The goal is not advice. It is being known.
  • Move your body for 20 minutes, even a walk. It changes brain chemistry.
  • Sleep, food, and daylight. On a hard week, pick the one that is most off and start there.
  • Limit alcohol. It makes anxiety and depression worse, not better.

What to tell your doctor

If you are using an AI tool for mental health, tell your doctor. Bring:

  • The name of the app you are using
  • How often you use it
  • What you talk about with it
  • What it has told you to do, if anything

Your doctor will not judge you. They want to know because the tool may interact with your treatment.

Crisis resources (keep this list)

Resource                                   | How to reach                                | When
988 Suicide & Crisis Lifeline              | Call or text 988                            | 24/7, English and Spanish
Crisis Text Line                           | Text HOME to 741741                         | 24/7
Trevor Project (LGBTQ+ youth)              | Call 1-866-488-7386 or text START to 678-678 | 24/7
Veterans Crisis Line                       | Call 988, press 1, or text 838255           | 24/7
SAMHSA Helpline (free treatment referrals) | Call 1-800-662-4357                         | 24/7
911                                        | Call 911                                    | If life is in danger right now

What this guide is, and is not

This is education. It is not medical advice. It does not create a doctor-patient relationship.

If something in this guide does not match what your doctor or therapist tells you, trust them. They know you. I do not.

You can share this guide. You can print it for your clinic. Please do not change it or sell it.

Dr. Maia

If this helped, get the next one

Ask Dr. Maia is a free weekly newsletter on AI in healthcare, written for patients and the people who care for them. New patient guides land here as they ship.

Subscribe (free)

Sources

  1. KFF Tracking Poll on Health Information and Trust, March 2026.
  2. Heinz MV et al. Therabot trial. NEJM AI, March 2025.
  3. Brewster et al. AI companions and adolescent mental health, 2025.

Adapted from "The mental health AI guide I wish I'd had ten years ago" (Ask Dr. Maia, Issue 1, May 2026). Last updated May 5, 2026. License: CC BY-NC 4.0 (free for clinic distribution; do not modify or sell).

© 2026 Ask Dr. Maia.

