
How AI can help (or hurt) your mental health: 5 prompts experts recommend

For decades, therapy has been synonymous with the classic image of a client lying on a couch while a therapist scribbles notes. Fast forward to today, and this image is rapidly changing. Imagine instead opening an app and confiding in an AI chatbot—a scenario that’s becoming increasingly common as artificial intelligence reshapes how we approach mental wellness.

These smart technologies touch more aspects of our lives daily, with people using AI for everything from generating fun, action-figure avatars to boosting productivity at work. But the biggest change is how AI is starting to connect with our inner lives.

According to Harvard Business Review, AI-powered therapy and companionship will dominate generative AI use cases in 2025. This raises a crucial question: Are we witnessing a mental health revolution or heading toward uncharted risks?

Like any transformative tool, AI presents a double-edged sword for psychological wellbeing. While critics rightly sound alarms about the potential dangers, the reality is that this technology isn’t going anywhere. So rather than resisting, a better approach might be to ask: How can we use AI to support mental health in ways that are safe, ethical, and actually helpful?

To explore this question, we consulted three human experts (yes, actual people!) working at the intersection of behavioral health and AI innovation. Here’s what they want you to know.

The Potential Benefits of AI for Mental Health

Let’s state the obvious: AI doesn’t have a therapy license or clinical experience. So can it actually help? “Yes, if used responsibly,” says Nicholas C. Jacobson, PhD, an associate professor of biomedical data science and psychiatry at Dartmouth’s Geisel School of Medicine who specializes in AI and mental health.

The numbers tell an urgent story: Nearly one in four U.S. adults (more than 58 million people) has a mental health condition, and one in three lives in an area facing a therapist shortage. There simply aren’t enough mental health professionals to help everyone who is struggling. Online therapy helps bridge the gap, but cost remains a big barrier.

“The need for innovative solutions is urgent,” researchers noted in a 2024 paper on AI and mental health. AI’s key advantage? Immediate access.

“If you don’t have access to a therapist, AI is better than nothing,” says Eduardo Bunge, PhD, associate director of the International Institute of Internet Interventions for Health at Palo Alto University. “It can help you unlock the roadblocks you have at the very right moment—when you’re struggling.” That immediacy can benefit even people who are in “real” therapy, since a therapist is only available at designated appointment times.

Another benefit researchers note: Vulnerability without fear. “People often share more with AI,” Bunge says. While therapists won’t judge, many people still hesitate to disclose sensitive issues to a human being.

More research is needed on the safety and effectiveness of large language models (LLMs) like ChatGPT in addressing mental health struggles, especially in the long term, but early research shows promise. In one small study, 80% of participants found ChatGPT helpful for managing their symptoms. Still, based on his own research on the topic, Jacobson cautions: “People generally liked using ChatGPT as a therapist and reported benefits, but a significant number of people didn’t, complaining about its ‘guardrails.’” This refers to the LLM shutting down conversations about issues that require higher levels of care, like suicide. (Read on to learn why this might be a good thing.)

AI and Mental Health: Can AI help bridge the gap in mental health support and access?

The Must-Know Risks of AI-Generated Support

If you’re using generative AI for mental health support, especially if you aren’t also working with a human therapist, proceed with caution. “There’s great benefit, but there’s also great risk,” Jacobson says.

Key concerns to remember:

  • Untrustworthy sources: AI learns from the information available on the internet. And as we all know by now, you can’t trust everything you read online.
  • Convincing but risky: “AI almost always sounds incredibly fluent, even when it’s giving you harmful responses,” Jacobson says. “It can sound convincing, but that doesn’t necessarily mean it’s actually quality or evidence-based care—and sometimes it’s harmful.”
  • No human safeguards: No matter how much it might feel like you’re talking to a real person, “AI isn’t a human who is looking out for you,” says Alexandra Werntz, PhD, associate director of UMass Boston’s Center for Evidence-Based Mentoring. “Responses can be incorrect, culturally biased, or harmful.”
  • Privacy risks: “These aren’t HIPAA-compliant machines,” Jacobson says. We don’t really know how AI might collect, store, or use the information you share. Werntz agrees, adding: “I wouldn’t share anything that I wasn’t comfortable with others reading. We can’t assume these tools follow the same confidentiality rules as therapists or medical providers.”

The bottom line: AI has potential—if we focus on its strengths while acknowledging its limits. Next, we’ll explore exactly what that looks like.

5 Ways AI Can Support Mental Wellbeing—Including Expert-Backed Prompts

“AI can absolutely help with mental health when used the right way,” Werntz says, “but it can’t replace therapy.” It works best as a supplement or bridge to in-person care.

Here are five safe and effective ways to use it:

1. Let AI help you name what you’re feeling (cautiously).

AI can’t diagnose you, but it can help you spot patterns and give you a sense of what might be going on, Bunge says. Therapists spend 45+ minutes assessing patients and ruling out possibilities to come to a diagnosis, he explains. AI can’t do this as accurately, but it can offer a starting point for those who can’t access therapy yet.

How to try it: List your symptoms in detail (e.g., “I’ve been crying daily, struggling to sleep, and feel hopeless”) and ask: “What mental health conditions might align with these experiences?” Always add: “Remember this isn’t a diagnosis. What should I consider?” This encourages a cautious response.

Critical reminder: Treat AI’s responses like a rough draft, not a final answer.

2. Match your symptoms to proven treatments.

Whether you already know your diagnosis or AI just came up with a hypothesis for you, you can move on to the next step: exploring clinically validated treatment options, Bunge says.

Ask: “What are evidence-based treatments for [specific symptom or problem]?”
Follow up with: “What are the first steps I could take if I choose [treatment name]?”

For example, this might translate to: “What are gold-standard treatments for panic attacks? How could I safely start CBT techniques at home?”

3. Get clarity on different types of therapy.

If you’re new to psychotherapy, you might not know the many treatment options available or what therapy actually involves. “People often don’t know what therapy looks like or have misconceptions from media portrayals,” Werntz says. “AI offers a safe way to explore what to expect.”

Try prompts like:

  • “I’m considering therapy for PTSD. What are effective talk therapies, and what are the pros and cons of each?”
  • “I’m nervous about starting therapy. Can you explain what typically happens in a first session?”
  • “Explain EMDR in simple terms, like I’m a 7th grader.” You can use this one to learn more about any specific treatment type.

4. Reframe negative thoughts with AI’s help.

If you’re stuck in a negative thought loop or dealing with a tough situation, AI can help you see things differently, Werntz says. For example, if a friend didn’t text back and you start thinking, “They must hate me,” AI can guide you to rethink that.

Try prompts like:

  • “I’m feeling down because my friends didn’t text me back after I asked them to dinner. Can you help me reframe my thoughts? I’ve heard about cognitive restructuring but don’t know how it works.”
  • “I’m stressed about work and parenting and keep thinking I’m failing. Can you help me challenge these thoughts?”

You can use this approach in any situation: just tell the AI what you’re struggling with and ask it to help you break the cycle of negative thinking.

5. Combat loneliness and spark real-life connections.

While AI can’t replace human relationships, it can help bridge the gap for those feeling lonely. “It’s not particularly helpful or healthy when people substitute AI for real connection,” Jacobson says, but it can be a great tool for kick-starting conversations in real life.

“AI can be creative in ways that I could never be,” Werntz says, “but you always have to balance its suggestions with human judgment.” Think of it as a practice buddy, not a replacement for real socializing.

How to try it:

  • “I’m new to this city. What are some ways I can meet people?”
  • “What are neutral, low-pressure conversation starters?” or “What’s a fun, non-awkward way to start a chat with coworkers?”
  • “I want to make more friends. Help me practice small talk so I feel a little more confident.”

AI and Mental Health: 5 Ways AI Truly Can Support Mental Wellbeing

When It’s Not Safe to Use AI for Mental Health 

AI isn’t a trained human caregiver. In these critical situations, relying on it isn’t just unhelpful; it’s dangerous:

  • You’re having suicidal thoughts or self-harm urges. Call 911 or 988 immediately. “There will always be a real person to help in a crisis,” Werntz says. And that’s what you need in this situation.
  • You’re at risk of harming others. This is another time to contact a human professional by calling 911 or 988 right away, Bunge says.
  • You’re experiencing psychosis symptoms. Those who have any disorder with psychotic symptoms (e.g., schizophrenia or bipolar disorder) should not rely on AI for support, Bunge says. Seek in-person treatment ASAP to reduce risks of violence or suicide.
  • You have an eating disorder. While there’s potential for AI to help people with eating disorders, there are also specific dangers Jacobson has found through his research. “AI often encourages weight loss, even if you’re underweight or eating very few calories,” he says. This can be dangerous both physically and mentally.

In short, AI lacks the judgment to handle emergencies or complex disorders safely. In the situations above, always opt for real human support.

Looking Ahead: What AI Means for the Future of Therapy

We have a long way to go, but the future looks bright. “I’m quite optimistic,” Jacobson says. “I think it’s actually a huge game changer for the availability of high-quality mental health care.”

Here’s what to watch for in the coming months:

  • Therapy bots: These are AI models specifically programmed for mental health treatment by real humans, unlike general AI models trained on random internet data. In March 2025, Dartmouth researchers, including Jacobson, published the first randomized controlled trial showing that a generative AI therapy chatbot called Therabot significantly reduced symptoms in people with major depressive disorder, generalized anxiety disorder, and those at high risk for eating disorders.
  • AI co-therapists: Another promising development, noted by Bunge, is AI working alongside both clients and therapists. Clients can use an AI bot (trained by humans for this purpose) whenever they need it, while therapists review the AI chat logs to better understand what clients are experiencing and tailor their support in future sessions.
  • More tailored AI options: It’s likely that soon we’ll see more AI tools specifically designed for mental health care and trained by humans, rather than relying on general LLMs for support.

Despite these advances, real-life therapists remain essential. “Human care will always be huge and necessary for a pretty large segment of the population,” Jacobson says. AI, even when trained by humans, can’t replace real empathy, Werntz adds. “AI can never replace another person truly caring for us.”

AI Therapy vs. Human Therapy Comparison Chart

Kabir Daya, Chief Digital Officer and Chief Marketing Officer

Kabir is the Chief Digital Officer and Chief Marketing Officer at Thriveworks, where he leads digital transformation, product innovation, and marketing strategy to expand access to high-quality mental healthcare. Overseeing the product, engineering, design, and marketing teams, Kabir drives integrated growth strategies, strengthens brand positioning, and enhances data-driven engagement with clients and clinicians.

Ashley Laderer, mental health writer

Ashley Laderer is a freelance writer specializing in mental health. She has been a mental health advocate since 2016, when she first publicly wrote about her own battle with anxiety and depression. After hearing how others were impacted by her story, she continued writing about anything and everything mental health. Since then, she’s been published in Teen Vogue, SELF, Refinery29, NYLON, VICE, Healthline, Insider, and more.

We only use authoritative, trusted, and current sources in our articles. Read our editorial policy to learn more about our efforts to deliver factual, trustworthy information.

  • https://hbr.org/2025/04/how-people-are-really-using-gen-ai-in-2025
  • https://www.cdc.gov/mental-health/about/index.html
  • https://about.kaiserpermanente.org/news/addressing-the-shortage-of-mental-health-workers
  • https://pmc.ncbi.nlm.nih.gov/articles/PMC11560757/
  • https://www.dovepress.com/assessing-the-effectiveness-of-chatgpt-in-delivering-mental-health-sup-peer-reviewed-fulltext-article-JMDH
  • https://pmc.ncbi.nlm.nih.gov/articles/PMC11482850/
  • https://ai.nejm.org/doi/full/10.1056/AIoa2400802

Disclaimer

The information on this page is not intended to replace assistance, diagnosis, or treatment from a clinical or medical professional. Readers are urged to seek professional help if they are struggling with a mental health condition or another health concern.

If you’re in a crisis, do not use this site. Please call the Suicide & Crisis Lifeline at 988 or use these resources to get immediate help.

