AI Could Be Your Next Therapist If You’re Feeling Lonely, Depressed or Anxious

Artificial intelligence can play a supporting role in solving the nation’s mental health woes, experts say


Image: A man lies on a couch speaking to a robot therapist. (Getty Images)

When you wake up in the middle of the night feeling overwhelmed by the recent death of a loved one, your new shrink could be an artificial intelligence (AI) chatbot.

It offers advice and suggests techniques to improve your mood. And it’s almost as if you're texting with a human therapist.

Pouring your heart out to an empathetic AI-enabled app or textbot doesn’t replicate lying on a couch to confide in a human psychotherapist or even calling a crisis line. But with proper guardrails, some mental health professionals see a supporting role for AI, at least for certain people in certain situations.

“I don’t think [AIs] are ever going to replace therapy or the role that a therapist can play,” says C. Vaile Wright, senior director of the office of health care innovation at the American Psychological Association in Washington, D.C. “But I think they can fill a need … reminding somebody to engage in coping skills if they’re feeling lonely or sad or stressed.”

AI could help with therapist shortage

Image: An AI robot therapist. (Getty Images)

In 2021, nearly 1 in 5 U.S. adults sought help for a mental health problem, but more than a quarter of them felt they didn’t get the help they needed, according to an annual survey from the federal Substance Abuse and Mental Health Services Administration. About 15 percent of adults 50 and older sought those services, and 1 in 7 of them felt they didn’t receive what they needed.

Globally, the World Health Organization (WHO) estimated in 2019 that about 14.6 percent of all adults 20 and older were living with mental disorders, roughly the same share seen among adults 50 to 69. About 13 percent of adults 70 and older had mental health problems.

The pandemic caused anxiety and depressive disorders to increase by more than a quarter worldwide, according to WHO. Even before the 2020 surge in need, most people with diagnosed mental health conditions were never treated. Others choose not to seek treatment because few services are offered in their countries or because of the stigma attached to seeking help.

A March survey of 3,400 U.S. adults from CVS Health and The Harris Poll in Chicago showed that 95 percent of people age 65 and older believe that society should take mental health and illness more seriously, says Taft Parsons III, chief psychiatric officer at Woonsocket, Rhode Island-based CVS Health. But erasing the stigma that older adults in this country often feel about getting treatment will only increase the need for therapists.

“Artificial intelligence may be leveraged to help patients manage stressful situations or support primary care providers treating mild illnesses, which may take some pressure off the health care system,” he says.

ChatGPT won’t replace Freud — yet

Relying on a digital therapist may not mean spilling your guts to the groundbreaking AIs dominating headlines. Google’s Bard, the new Microsoft Bing and most notably OpenAI’s ChatGPT, whose launch last fall prompted a tsunami of interest in all things artificial intelligence, are called “generative AIs.”

They’ve been trained on vast amounts of human-generated data and can create new material from what’s been input, but they’re not necessarily grounded in clinical support. While ChatGPT and its ilk can help plan your kid’s wedding, write complaint letters or generate computer code, crafting their data into your personal psychiatrist or psychotherapist may not be one of their strengths, at least at the moment.

Nicholas C. Jacobson, a Dartmouth assistant professor of biomedical data science and psychiatry in Lebanon, New Hampshire, has been exploring both generative and rules-based scripted AI bots for about 3½ years. He sees potential benefits and dangers to each.

“The gist of what we’ve found with pre-scripted works is they’re kind of clunky,” he says. On the other hand, “there’s a lot less that can go off the rails with them. … And they can provide access to interventions that a lot of people wouldn’t otherwise be able to access, particularly in a short amount of time.”

Jacobson says he’s “very excited and very scared” about what people want to do with generative AI, especially bots likely to be introduced soon — a feeling echoed Tuesday when more than 350 executives, researchers and engineers working in AI released a statement from the nonprofit Center for AI Safety raising red flags about the technology.

Generative AI bots can supply useful advice to someone crying out for help. But because these bots can be easily confused, they also can spit out misleading, biased and dangerous information or spout guidance that sounds plausible but is wrong. That could make mental health problems worse in vulnerable people.

‘Rigorous oversight’ is needed

How you can reach out for human help

If you or a loved one is considering self-harm, go to your nearest crisis center or hospital or call 911. 

The 988 Suicide & Crisis Lifeline, formerly known as the National Suicide Prevention Lifeline, is the federal government’s free 24-hour hotline. The nonprofit Crisis Text Line also has 24/7 counselors. Both use trained volunteers nationwide, are confidential and can be reached in ways most convenient to you: 

  • Dial or text 988. A phone call to 988 offers interpreters in more than 240 languages.
  • Call 800-273-TALK (8255), a toll-free phone number, to reach the same services as 988. 
  • For hearing-impaired users with a TTY phone, call 711 and then 988.
  • Text HOME to 741741, the Crisis Text Line.  
  • In WhatsApp, message 443-SUP-PORT.
  • Go to crisistextline.org on your laptop, choose the Chat With Us button and stay on the website. 

“The only way that this field should move forward — and I’m afraid will likely not move forward in this way — is carefully,” Jacobson says.

The World Health Organization’s prescription is “rigorous oversight” of AI health technologies, according to a statement the United Nations agency issued in mid-May.

“While WHO is enthusiastic about the appropriate use of technologies, including [large language model (LLM) tools] to support health care professionals, patients, researchers and scientists, there is concern that caution, which would normally be exercised for any new technology, is not being exercised consistently with LLMs.”

The hundreds working with AI technology who signed this week’s statement say vigilance should be a global priority. Wright at the American Psychological Association raises similar concerns:

“The problem with this space is it’s completely unregulated,” she says. “There’s no oversight body ensuring that these AI products are safe or effective.”

Check on the source of the advice

Before turning to an AI bot for therapy, investigate the credentials and qualifications of the humans who developed it.

  • Are the therapists licensed?
  • Are they therapists at all?
  • What is their clinical approach, and is it evidence based?
  • And is everything kept confidential?

Heed disclaimers. When a user recently informed ChatGPT that he was feeling down, it responded, “I’m sorry to hear that you’re feeling sad. I’ll do my best to help you. Remember that I’m an AI language model, so I can provide some suggestions and support, but it’s important to reach out to friends, family or professionals for assistance when needed.”

Recognize, too, that not all AIs are the same. Generative AI and scripted approaches that mimic human exchanges are different.

“Amidst some extreme examples of badly behaving generative AIs, it’s easy to dismiss the utility of all similar tech used in mental health care,” wrote Alison Darcy, a Stanford-trained clinical psychologist who is also president and founder of San Francisco-based Woebot Health, in a blog post. “More nuanced understanding is needed to prevent the erosion of public confidence in a key public health opportunity.” Her company’s scripted AI “conversational agent,” called Woebot, uses cognitive behavioral therapy techniques to interact with clients.

One useful consumer resource to consult before checking out an AI chatbot is the nonprofit One Mind PsyberGuide, which publishes online reviews of mental health apps and digital resources based on their credibility, user experience and data privacy. The organization’s executive director is Stephen Schueller, a psychology professor at the University of California, Irvine.

Can AI therapy work?

Whether AI therapy will be effective could boil down to a patient’s willingness to seek its help. Some people need a human connection and have an emotional objection to an AI shrink, says Jeff Hancock, founding director of the Stanford Social Media Lab and a professor of communication at Stanford University.

“That would be like trying to go to a [human] therapist that you don’t believe in,” he says. “Guess what? It’s not going to work.” But Hancock also is aware of people who are not only fine with AI analysis but say they wouldn’t open up to a human.

“There’s potential here to really help some folks who just don’t want to meet with a person and talk about [problems],” Hancock says. They may feel embarrassed or find human therapists intimidating.

“There’s no guarantee that you won’t get bad advice from a human therapist,” he says. But good ones are typically supervised, mentored, licensed, take careful notes and “are responsible for the things they do and say.”

Artificial intelligence bots might lower the cost of care for people in a financial bind, Dartmouth’s Jacobson says. “The standard that I think about … is this: Is it generally going to be better than no care at all?”

Your comfort level is important

Image: A man lying on a couch for therapy. (Getty Images)

Even people willing to embrace an AI therapist are likely to feel strange about it at first, says Jo Aggarwal, the Boston-based founder and CEO of Wysa. The company produces what it calls an AI coach.

“It’s super weird for people to be talking to an AI initially,” she says. “It feels like, ‘Oh, don’t I have family? Don’t I have friends? Why am I talking to AI?’ After one session, it helps.”

More than 6 million people in 95 countries use Wysa, which has both free and premium versions. For a fee, the app will connect users with a human coach.

On an intellectual level, most people realize they’re not talking to a person. But they still may form a bond.

“The fact that I can interact with it anytime I want … I think is important [and an] absolute positive,” Hancock says. An AI may not judge you or grow impatient like a person might.

Bots can build a bridge to humans

Still, Hancock points out risks. After a chatbot conversation ends, a person may still feel alone because no human was involved.

“What really matters is how these systems can help us connect with the people who matter in our lives,” he says.

A lot of the people who reminisce with the Pi chatbot, released in early May, want a good listener, says CEO Mustafa Suleyman of Inflection AI, the Palo Alto, California, start-up behind Pi. Pi users so far tend to skew older. He sees his bot as a kind and supportive companion, not a professional therapist, though it could one day grow into that.

“Some people have referred to it as a therapy experience but only in as much as good therapy is about asking good questions, listening and reflecting back well,” he says. Suleyman is also a cofounder of DeepMind, an AI company that Google bought in 2014 and that helped develop Google’s Bard generative AI bot.

Pi can detect your mood, tone and energy, Suleyman says. And if Pi and some other chatbots haven’t heard from you for a few days, they may proactively reach out by text, just to check in.

If you bring up something sad and sensitive, such as your dog dying or a friend who is very sick, Pi is designed to slow down and express sympathy. If Pi detects something truly dark, such as potential harm to you or others, it will direct you to professional support, Suleyman says. Because of privacy measures, Pi doesn’t know who you are and cannot dispatch help directly.

The Pi app, free for now, is available on iOS with Android in the works. People also can talk to it through a desktop computer, WhatsApp and perhaps eventually a smart speaker.

At what point might a human intervene when an AI bot determines that someone is in a crisis?

“It’s up to the consent of the person using it,” says Michiel Rauws, founder and CEO of San Francisco-based Cass. His rules-based self-help mental health chatbot is offered as a free benefit to employees or members of organizations that contract with his company. “We tell them a crisis counselor is available right now. ‘Do you want to speak to this person?’ ” If the OK is given, Rauws says the average response time is about 10 seconds.

Human therapists can also benefit from artificial intelligence, Stanford’s Hancock says. With a patient’s permission, an AI bot can gather and synthesize information between sessions to help a therapist create a course of treatment and test out ideas.

“Let’s not say, ‘I’m going to force AI on people and that’s going to solve this mental health crisis that we see in older adults who are struggling with loneliness, and younger adults struggling with pretty severe anxiety and depression,’ ” Hancock says. But “I don’t want to foreclose this as a way of saying, ‘No, AI is never going to work here.’ ”

This story, originally published June 1, 2023, was updated with a graphic on the results of a recent AARP survey.
