
Can mental health apps and AI chatbots really help you overcome your issues?

  • Mental health apps and AI chatbots offer to ‘disrupt’ traditional therapy, but their effectiveness is still under study – and there’s evidence some can do harm
  • One expert says he would like to see AI used instead to reduce practitioners’ tasks such as record-keeping to ‘free up more time for humans to connect’

Mental health apps offer convenient help through a smartphone. But can they replace a human therapist? Photo: Shutterstock
Tribune News Service

Thousands of apps have stampeded into the mental health space in the past few years, offering to “disrupt” traditional therapy. Now, amid the frenzy around AI innovations such as ChatGPT, claims that chatbots can provide mental health care are gaining ground.

The numbers explain why: pandemic stresses led to millions more Americans seeking treatment. At the same time, there has long been a shortage of mental health professionals in the United States. This holds true in many places around the world, including in Hong Kong.

Even with the US Affordable Care Act’s mandate that insurers offer parity between mental and physical health coverage, there is a gaping chasm between demand and supply.


For entrepreneurs, that presents a market bonanza. At the South by Southwest conference in the US state of Texas in March, where health start-ups displayed their products, there was a near-religious conviction that AI could rebuild healthcare. Apps and machines were offered that could diagnose and treat all kinds of illnesses, with claims they could replace doctors and nurses.

Unfortunately, in the mental health space, evidence of effectiveness is lacking. Few of the many apps on the market have independent outcomes research showing that they help. Most haven’t been scrutinised at all by the US Food and Drug Administration (FDA).


Though marketed to treat conditions such as anxiety, attention deficit hyperactivity disorder and depression, or to predict suicidal tendencies, many apps warn users (in small print) that they are “not intended to be medical, behavioural health or other healthcare services”.
