People are using ChatGPT for therapy. Therapists say it's not a bad idea — if you do it right.
Therapists say more of their clients are using ChatGPT for advice. It works best for journal prompts and concrete questions, less for psychoanalysis.
- More people are starting to use ChatGPT for free therapy.
- Therapists say it can make people lonelier and dependent on seeking reassurance.
- They recommend using it for specific tasks, like journal prompts.
I'm not shy about admitting I've confided in a robot.
Just last week, I spent a solid hour in an animated back-and-forth with ChatGPT, seeking advice for a personal problem that I've been agonizing over for weeks. I didn't want to bore my husband and friends with it again.
I'm far from the only one who does this. Reddit users say they love using the AI platform for life coaching and for unpacking other people's narcissistic behaviors; TikTokers share tips like "voice journaling" into ChatGPT or changing the settings so ChatGPT responds in audio, mimicking a therapist.
Rachel Goldberg, a licensed clinical social worker in Los Angeles, said she wasn't surprised to learn that some people use ChatGPT for therapeutic purposes. She even put me in touch with one of her clients, who recently disclosed that she uses the platform daily in addition to their in-person sessions.
The client, whom we'll call Emily (she asked to remain anonymous because she is Goldberg's client), said she uses ChatGPT to "brain dump" her thoughts when she feels stressed. Sometimes, she doesn't want to text Goldberg ahead of a session or "burden" her friends; she just needs to process something quickly. Within seconds, she gets "the most amazing life advice," she said.
"It's kind of a little compass helping me go throughout my day," Emily, 28, told BI.
ChatGPT is a tempting alternative (or accessory) to therapy. It's free, available anytime, and offers customized, in-depth advice. The downsides — like OpenAI training on personal data or the environmental damage caused by frequent AI use — are easy to ignore in the heat of a crisis when ChatGPT responds with tranquil eloquence.
While some therapists say it's OK, they encourage hard boundaries around using ChatGPT: its boundless accessibility can reinforce reassurance-seeking behaviors and exacerbate loneliness. It can make a person more anxious and atomized — the opposite of what good therapy aims to do.
It felt nice to read ChatGPT's polite affirmations that my issue "sounds really tough." However, it just didn't feel the same as chatting with a friend who might crack a joke or share how they got through a similar experience.
Free 'therapy in your pocket'
Emily, who has been going to therapy for about eight or nine years, doesn't believe ChatGPT is better than traditional therapy or a substitute for her friends.
It is, however, convenient. When her car was stolen twice, or she felt overwhelmed by big life transitions, she'd write into ChatGPT. She found it most helpful when it named feelings she was too caught up in to recognize herself.
"If I tell it things where I don't sound very confident or I'm doing something for someone else, it catches that, and it'll be like, 'this person is responsible for their own emotions,'" Emily said. "It's literally therapy in your pocket at any time."
While Goldberg said it's good that ChatGPT "does a pretty good job of validating" someone's feelings and can help them become more self-reflective, the danger lies in being overly dependent on a second opinion. There are still moments when one has to make a split-second decision on their own, like speaking up in a meeting or leaving a rude date.
Ciara Bogdanovic, a therapist specializing in dialectical behavioral therapy, told BI her worry is that ChatGPT can't spot broader patterns in a client, like the need for frequent validation. For people with OCD, ChatGPT can heighten reassurance-seeking behaviors, which a competent therapist would work on, encouraging the client to expose themselves to some discomfort.
AI is "just going to reassure, which is reinforcing the behavior which might be damaging," Bogdanovic said.
When customizing goes too far
The biggest problem I found with ChatGPT therapy is that it can be customized to answer in ways you find preferable. If I asked ChatGPT to analyze a conversation at face value, it told me I was coming off a touch harsh. Then I followed Reddit's advice of asking ChatGPT to spot signs of manipulation in the other person's texts. It broke down all the ways they were defensive and emotionally immature.
Without that instruction, its reading might have looked quite different, perhaps even sympathetic to the love-bomber or gaslighter.
Therapists and friends aren't perfectly objective, either. It's just that ChatGPT's answers can be continually tweaked to your liking, down to the tone of voice.
Unlike a therapist, Goldberg said that ChatGPT won't pick up on a user labeling everyone in their life as toxic and gently start to push back. It will simply tell them what they want to hear — ultimately, to their detriment.
"As a therapist, I'm constantly assessing: what's the client's diagnosis, what is their history, what is their family context?" Bogdanovic said. How she responds to one client may be completely different from how she responds to another, even if they raise what seems like exactly the same concern. ChatGPT, meanwhile, is "just going to spit out an answer," she said.
Losing the human touch
As empathetic as ChatGPT sounds, it has limitations for the kinds of problems it can solve.
Angela Betancourt, a 42-year-old business owner, no longer goes to therapy. While she said ChatGPT can't compete with it, she uses it for quick pep talks or offering a different perspective on approaching a problem.
Betancourt also loves using ChatGPT for gratitude journal prompts. Recently, she used it to reflect on the joyful moments of a trip she took with her family. However, she said she wouldn't use ChatGPT to deal with heavier emotions like grief. She lost her father-in-law last year and her father a few years before that. She believes only real people can provide adequate comfort and advice in a crisis.
Unlike a psychologist or best friend, ChatGPT is also not bound by confidentiality — data entered into it could conceivably be used to identify users in the future, though that hasn't happened so far.
Emily tries not to include too many deeply personal details. Sometimes, though, she'll describe a "really intense situation" involving other people. "If that ever got out, it could ruin relationships," she said. "I'm just hoping it's completely private."
If you're going to spill to ChatGPT, create boundaries
Both Bogdanovic and Goldberg predict that more people will start using ChatGPT for therapy, at least in the near future. They also hope they'll exercise some boundaries around it.
Bogdanovic recommends using it only for "pointed questions," like how to respond to your boss's message or breathing techniques to calm down.
In my experience, even the best AI-powered advice can fall short. By the time my husband came home that day, ChatGPT still hadn't given me an answer that felt right; it offered too many potential courses of action. I spiraled, wondering whether I'd given ChatGPT enough context without completely compromising my privacy.
So I asked him instead: knowing all about this recurring issue, what would he do if he were me right now? He paused and closed his eyes, pensive. Then he gave me his advice.
I texted the person back in the way he suggested. I have no idea if it was the best answer or the rightest of resolutions. All I know is that I felt better when I put my phone down to hug him.