How AI can help (or hurt) your mental health: 5 prompts experts recommend

May 20, 2025
Updated on May 21, 2025

Image: person on a couch, mental state represented by a scribble, typing to a chatbot that replies via text. (VectorMine // Shutterstock)

For decades, therapy has been synonymous with the classic image of a client lying on a couch while a therapist scribbles notes. Fast forward to today, and this image is rapidly changing. Imagine instead opening an app and confiding in an AI chatbot, a scenario that’s becoming increasingly common as artificial intelligence reshapes how we approach mental wellness.

These smart technologies touch more aspects of our lives daily, with people using AI for everything from generating fun, action-figure avatars to boosting productivity at work. But the biggest change is how AI is starting to connect with our inner lives.

By some analyses, AI-powered therapy and companionship will dominate generative AI use cases in 2025. This raises a crucial question: Are we witnessing a mental health revolution or heading toward uncharted risks?

Like any transformative tool, AI presents a double-edged sword for psychological well-being. While critics rightly sound alarms about the potential dangers, the reality is that this technology isn’t going anywhere. So rather than resisting, a better approach might be to ask: How can we use AI to support mental health in ways that are safe, ethical, and actually helpful?

To explore this question, we consulted three human experts (yes, actual people) working at the intersection of behavioral health and AI innovation. Here’s what they want you to know.

The potential benefits of AI for mental health

Let’s state the obvious: AI doesn’t have a therapy license or clinical experience. So can it actually help? “Yes, if used responsibly,” says Nicholas Jacobson, PhD, associate professor of biomedical data science and psychiatry at Dartmouth’s Geisel School of Medicine, specializing in AI and mental health.

The numbers tell an urgent story: Nearly 1 in 4 U.S. adults (58+ million people) have a mental health condition, and many live in areas with therapist shortages. There simply aren’t enough mental health professionals to help everyone who is struggling. Online therapy helps bridge the gap, but it can’t close it on its own.

“The need for innovative solutions is urgent,” researchers noted in a recent paper on AI and mental health. AI’s key advantage? Immediate access.

“If you don’t have access to a therapist, AI is better than nothing,” says Eduardo Bunge, PhD, associate director for the International Institute of Internet Interventions for Health at Palo Alto University. “It can help you unlock the roadblocks you have at the very right moment: when you’re struggling.” That immediacy helps even those already in “real” therapy, who would otherwise have to wait for their next designated appointment to meet with their therapist.

Another benefit: vulnerability without fear. “People often share more with AI,” Bunge says. While therapists won’t judge, many people still hesitate to disclose sensitive issues to a human being.

More research is needed on the safety and effectiveness of large language models (LLMs) like ChatGPT in addressing mental health struggles, especially in the long term, but early research shows promise. In one early study, 80% of participants found ChatGPT helpful for managing their symptoms. But based on his own research on the topic, Jacobson cautions: “People generally liked using ChatGPT as a therapist and reported benefits, but there was also a significant number of people who didn’t, complaining about its ‘guardrails.’” This refers to the LLM shutting down conversations about issues that require higher levels of care, like suicide. (Read on to learn why this might be a good thing.)

Infographic: the number of people who need mental health support, contrasted with the number of available therapists and the number that can be reached via AI support. (Thriveworks)


The must-know risks of AI-generated support

If you’re using generative AI for mental health support, especially if you aren’t also working with a human therapist, proceed with caution. “There’s great benefit, but there’s also great risk,” Jacobson says.

Key concerns to remember:

  • Untrustworthy sources: AI learns from the information available on the internet. And as we all know by now, you can’t trust everything you read online.
  • Convincing but risky: “AI almost always sounds incredibly fluent, even when it’s giving you harmful responses,” Jacobson says. “It can sound convincing, but that doesn’t necessarily mean it’s actually quality or evidence-based care, and sometimes it’s harmful.”
  • No human safeguards: No matter how much it might feel like you’re talking to a real person, “AI isn’t a human who is looking out for you,” says Alexandra Werntz, PhD, associate director of UMass Boston’s Center for Evidence-Based Mentoring. “Responses can be incorrect, culturally biased, or harmful.”
  • Privacy risks: “These aren’t HIPAA-compliant machines,” Jacobson says. We don’t really know how AI might collect, store, or use the information you share. Werntz agrees, adding: “I wouldn’t share anything that I wasn’t comfortable with others reading. We can’t assume these tools follow the same confidentiality rules as therapists or medical providers.”
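
On the privacy point, one practical habit is to scrub obvious identifiers before pasting anything into a general-purpose chatbot. The sketch below is a deliberately naive illustration of that idea, not a real anonymizer: the patterns and the `redact` helper are our own assumptions, and a few regexes will never catch everything personal in free text.

```python
import re

# Illustrative patterns only -- a real anonymizer would need far more
# coverage (names, addresses, dates of birth, account numbers, ...).
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
}

def redact(text: str) -> str:
    """Replace obvious identifiers with placeholder tags before sharing."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} removed]", text)
    return text

print(redact("Reach me at jane.doe@example.com or 555-123-4567."))
```

Even a rough pass like this reinforces the experts’ rule of thumb: treat anything you type into an AI tool as potentially readable by others.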

The bottom line: AI has potential, if we focus on its strengths while acknowledging its limits. Next, we’ll explore exactly what that looks like.

5 ways AI can support mental well-being, including expert-backed prompts

“AI can absolutely help with mental health when used the right way,” Werntz says, “but it can’t replace therapy.” It works best as a supplement or bridge to in-person care.

Here are five safe and effective ways to use it:

1. Let AI help you name what you鈥檙e feeling (cautiously).

AI can’t diagnose you, but it can help you spot patterns and give you a sense of what might be going on, Bunge says. Therapists spend 45+ minutes assessing patients and ruling out possibilities to come to a diagnosis, he explains. AI can’t do this as accurately, but it can offer a starting point for those who can’t access therapy yet.

How to try it: List your symptoms in detail (e.g., “I’ve been crying daily, struggling to sleep, and feel hopeless”) and ask: “What mental health conditions might align with these experiences?” Always add: “Remember this isn’t a diagnosis. What should I consider?” This encourages a cautious response.

Critical reminder: Treat AI’s responses like a rough draft, not a final answer.
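
For readers who script their own prompts, the pattern above (symptoms in detail plus an explicit “not a diagnosis” reminder) can be templated so the cautionary framing is never forgotten. The `build_symptom_prompt` helper below is a hypothetical sketch of that idea; it only assembles the text, and whatever chatbot or API receives it is assumed.

```python
# Sketch: template the article's cautious symptom prompt so the
# "this isn't a diagnosis" reminder is always attached.
REMINDER = "Remember this isn't a diagnosis. What should I consider?"

def build_symptom_prompt(symptoms: list[str]) -> str:
    """Join symptom descriptions and append the cautionary reminder."""
    detail = "; ".join(symptoms)
    return (
        f"I've been experiencing: {detail}. "
        "What mental health conditions might align with these experiences? "
        + REMINDER
    )

print(build_symptom_prompt(
    ["crying daily", "struggling to sleep", "feeling hopeless"]))
```

The design point is simply that the caution lives in the template, not in the user’s memory.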

2. Match your symptoms to proven treatments.

Whether you already know your diagnosis or AI just came up with a hypothesis for you, you can move on to the next step: exploring clinically validated treatment options, Bunge says.

Ask: “What are evidence-based treatments for [specific symptom or problem]?”

Follow up with: “What are the first steps I could take if I choose [treatment name]?”

For example, this might translate to: “What are gold-standard treatments for panic attacks? How could I safely start CBT techniques at home?”

3. Get clarity on different types of therapy.

If you’re new to psychotherapy, you might not know the many treatment options available or what therapy actually involves. “People often don’t know what therapy looks like or have misconceptions from media portrayals,” Werntz says. “AI offers a safe way to explore what to expect.”

Try prompts like:

  • “I’m considering therapy for PTSD. What are effective talk therapies, and what are the pros and cons of each?”
  • “I’m nervous about starting therapy. Can you explain what typically happens in a first session?”
  • “Explain EMDR in simple terms, like I’m a 7th grader.” You can use this one to learn more about any specific treatment type.

4. Reframe negative thoughts with AI鈥檚 help.

If you’re stuck in a negative thought loop or dealing with a tough situation, AI can help you see things differently, Werntz says. For example, if a friend didn’t text back and you start thinking, “They must hate me,” AI can guide you to rethink that.

Try prompts like:

  • “I’m feeling down because my friends didn’t text me back after I asked them to dinner. Can you help me reframe my thoughts? I’ve heard about cognitive restructuring but don’t know how it works.”
  • “I’m stressed about work and parenting and keep thinking I’m failing. Can you help me challenge these thoughts?”

You can use this approach for any situation: just tell the AI what you’re struggling with and ask it to help you break the cycle of negative thinking.

5. Combat loneliness and spark real-life connections.

While AI can’t replace human relationships, it can help bridge the gap for those feeling lonely. “It’s not particularly helpful or healthy when people substitute AI for real connection,” Jacobson says, but it’s a great tool for starting conversations in real life.

“AI can be creative in ways that I could never be,” Werntz says, “but you always have to balance its suggestions with human judgment.” Think of it as a practice buddy, not a replacement for real socializing.

How to try it:

  • “I’m new to this city. What are some ways I can meet people?”
  • “What are neutral, low-pressure conversation starters?” or “What’s a fun, non-awkward way to start a chat with coworkers?”
  • “I want to make more friends. Help me practice small talk so I feel a little more confident.”

Infographic: 5 ways AI can support mental well-being. (Thriveworks)


When it’s not safe to use AI for mental health

AI isn’t a trained human caregiver. In these critical situations, relying on it isn’t just unhelpful; it’s dangerous:

  • You’re having suicidal thoughts or self-harm urges. Call 911 or 988 immediately. “There will always be a real person to help in a crisis,” Werntz says. And that’s what you need in this situation.
  • You鈥檙e at risk of harming others. This is another time to contact a human professional by calling 911 or 988 right away, Bunge says.
  • You’re experiencing psychosis symptoms. Those who have any disorder with psychotic features (e.g., schizophrenia or bipolar disorder) should not rely on AI for support, Bunge says. Seek in-person treatment ASAP to reduce risks of violence or suicide.
  • You have an eating disorder. While there’s potential for AI to help people with eating disorders, there are also specific dangers Jacobson has found through his research. “AI often encourages weight loss, even if you’re underweight or eating very few calories,” he says. This can be dangerous both physically and mentally.

In short, AI lacks the judgment to handle emergencies or complex disorders safely. In the situations above, always opt for real human support.

Looking ahead: What AI means for the future of therapy

We have a long way to go, but the future looks bright. “I’m quite optimistic,” Jacobson says. “I think it’s actually a huge game changer for the availability of high-quality mental health care.”

Here鈥檚 what to watch for in the coming months:

  • Therapy bots: These are AI models specifically programmed for mental health treatment by real humans, unlike general AI models trained on random internet data. In March 2025, Dartmouth researchers, including Jacobson, published the first clinical trial showing that a generative AI therapy chatbot called Therabot significantly reduced symptoms in people with major depressive disorder, generalized anxiety disorder, and those at high risk for eating disorders.
  • AI co-therapists: Another promising development, noted by Bunge, is AI working alongside both clients and therapists. Clients can use an AI bot (trained by humans for this purpose) whenever they need it, while therapists review the AI chat logs to better understand what clients are experiencing and tailor their support in future sessions.
  • More tailored AI options: It’s likely that soon we’ll see more AI tools specifically designed for mental health care and trained by humans, rather than relying on general LLMs for support.

Despite these advances, real-life therapists remain essential. “Human care will always be huge and necessary for a pretty large segment of the population,” Jacobson says. AI, even when trained by humans, can’t replace genuine human connection, Werntz adds. “AI can never replace another person truly caring for us.”

Infographic: the pros and cons of AI chatbots vs. human therapists. (Thriveworks)


This story was produced by Thriveworks and reviewed and distributed by Stacker.

