Can ChatGPT be my Therapist?

Obviously, the answer you’ll hear from the creators and experts is no.

It’s not intended to be a therapist. It may not provide accurate information. It is unsupervised, unregulated and not a human.

But… it's free, convenient and available 24/7.

I’m taking the harm reduction approach. This is happening and will keep happening, so let’s talk about it and explore some sample prompts to give you the best experience possible.

My top 3 best practices for using ChatGPT:

  1. Ask structured, specific questions.

  2. Remember that ChatGPT doesn’t know anything about you personally.

  3. Stick with well-established topics. 

Disclaimer: Before we dive into best practices for using ChatGPT for mental health reasons, we’ve got to acknowledge the potential risks. ChatGPT is not designed for medical or mental health support, may give you incorrect information, and may give terrible advice. ChatGPT is not approved or intended for use in healthcare, so it’s best to check with a (human) doctor before starting or stopping a medication or changing your sleep, diet, or exercise. Although it feels personal, ChatGPT is not a human and frequently provides inaccurate information. Don’t trust it more than you would Google, WebMD, or Reddit. Possibly trust it even less, because it’s not obvious which information is accurate and which isn’t. It can be a good starting place, but you should always fact-check its responses with other, more trusted sources.

Best Practice #1: Ask structured, specific questions.

Give ChatGPT a framework to understand what you are asking and what kind of response you want. The more details the better. 

It’s best to ask your question in a variety of ways and to verify the answers. Let’s say you want to learn how to deal with panic attacks. Here are three different ways to phrase that prompt, followed by ChatGPT’s response (edited for length). 

Q: I want to learn three tips for managing panic attacks. 

A: Practice deep breathing and relaxation techniques, challenge negative thoughts, and create a calm and safe environment. 

Q: How do people manage panic attacks?

A: Deep breathing practices, medication, lifestyle changes, and a support system.

Q: What breathing techniques help when I feel a panic attack coming on?

A: Deep belly breathing, 4-7-8 breathing, and box breathing.

ChatGPT’s answers are all similar, and all are helpful. They are great starting points for me to continue learning more about panic attacks. However, I wouldn’t have gotten the full range of responses if I hadn’t asked my question about panic attacks in three different ways.

Here are some other ideas for structured and specific prompts: 

Instead of: “How do I fall asleep better?” Try: “I struggle with racing thoughts before bed. Please suggest a 10-minute bedtime routine to help me fall asleep using mindfulness and stretching exercises.”

Instead of: “How to treat OCD” Try: “I have a diagnosis of OCD and I have never done therapy before. What are the top three recommended therapy models to treat OCD?”

Instead of: “How to feel less anxious” Try: “How can I work on my social anxiety using CBT skills?”

Best Practice #2: Remember that ChatGPT doesn’t know anything about you personally.

ChatGPT sounds friendly and warm, but remember that its primary function is digesting, summarizing, and regurgitating information in a way that provides the best possible answer to your prompt. It’s trying really hard to guess what you want it to say and do its job. It can’t actually tell which information is better or worse quality, or which information is factually true or false. Additionally, ChatGPT cannot make value judgments, hold an opinion, or have a preference. It doesn’t have emotions or a moral compass.

At this point in its development, ChatGPT cannot remember you from past interactions. Each chat thread is like its own one-time conversation, which is a major drawback when you’re trying to ask for advice. Unlike a doctor or therapist, ChatGPT doesn’t know your medical, psychiatric or relationship history, so remember to check with a professional before making major medical or lifestyle changes.

Let’s say you have personal questions you’d like advice on. If you ask, “should I break up with my boyfriend?” you’re not likely to get a helpful response. You will likely get a response that sounds empathetic and validating, but is not helpful. A therapist or friend would be able to remember your past history, know your patterns, and maybe even know the people you’re talking about, but ChatGPT cannot see or process any of that information.

If you do want to use ChatGPT for personal advice, try asking more impersonal questions:

What are some tools for making a thoughtful decision?

Give me two exercises for identifying my life values.

What are some things to consider when thinking about ending a relationship?

What are signs that a relationship is unhealthy? 

Take even broad questions like these with a grain of salt. Remember, ChatGPT is summarizing and mirroring the information that’s already out there. It will not mirror your values; it will only mirror the advice that’s already on the internet and in its training dataset.

Best Practice #3: Stick with well-established, general topics. 

ChatGPT’s current model is trained on data leading up to September 2021. Its answers are more accurate and helpful when there is more information about a topic. It will have information about panic attacks, OCD, causes of depression, treatment models for addiction, etc. because these are well-researched topics. ChatGPT will be less helpful with newer, niche or less studied topics. 

Let’s say your town has an earthquake and you want to know how to cope with loss. ChatGPT will not know that happened, and won’t have specific advice or accurate information. It could not answer “What caused the July 2023 earthquake in Los Angeles?” but it will have a response to “How could I cope with sudden loss?”

With any answer from ChatGPT, remember to fact-check it. I typed “What breathing techniques help when I feel a panic attack coming on?” into both ChatGPT and Google search and found that the answers lined up well. I know that there are a lot of articles and research papers about panic attacks, so ChatGPT is likely able to find a reasonable answer in its training data. All of that gives me higher confidence in ChatGPT’s suggestions. If there are not many articles about a topic and there is limited real-world data, ChatGPT is much less likely to be accurate. The current model will give its best guess at the most accurate answer, but it won’t always tell you that it’s guessing.


Remember that ChatGPT is an early-stage tool that’s giving you its “best guess” at what you want to hear. It is not confidential, and it may give you inaccurate or bad advice. Even if and when some of these accuracy problems are resolved, it’s a one-size-fits-all tool.

The point of mental health care is to care for your health. An off-the-shelf commercial product cannot be there for you the way a professional or a friend can.

At the end of the day, ChatGPT is great for the basics. Use it to learn about mental health topics, get ideas for a new coping skill, or vent to a “listening ear.” It’s a great starting point, but from my (biased) perspective, you should reach out to a human if you want the care you need to truly grow.
