
Future of Mental Health: Are We Ready for AI Therapists in India?

In India, AI can potentially fill gaps in mental health services, but how practical is it?


(This is the first article of 'AI Told You So' by The Quint, a special series that takes a closer look at the possibilities unlocked by Artificial Intelligence in various sectors and walks of life, where the technology stands today, and the challenges ahead.)

Let's rewind to a few months ago. It's November, and the freshly launched unassuming AI chatbot called ChatGPT is slowly taking over conversations online. It can, it seems, fetch information for you, plan your day, write essays, and even do your job for you.

This is when a platform called Koko decided to put it to the test in the realm of mental health. However, their users were not told about this switch. At the time, anyway.

In January, Koko's Co-founder, Rob Morris, took to Twitter to reveal that they were running an 'experiment', and that the mental health support they were providing was guided by a chatbot supervised by humans.

"We provided mental health support to about 4,000 people—using GPT-3," he said in a Twitter thread that explained the process.


According to Morris, messages composed by AI – and allegedly supervised by humans – were rated significantly higher by users than those written by humans on their own.

Whatever its drawbacks, the ChatGPT boom has made AI more accessible to the everyday person. Even for those of us who were far removed from technological innovation, it has become possible to envision using AI in everyday life.

So is the idea of AI-run mental health services and chatbot therapists really so far-fetched? Can AI provide safe and effective mental health support?

AI in Mental Health: How Does It Work?

Beyond this 'experiment', many apps have already been using AI to provide mental health support.

Wysa is one such international platform that provides mental health support with the help of AI bots. The AI essentially 'listens' to you, is capable of gauging your tone, mood, and intensity of emotion, and gives a suitable response – something like a counsellor, or perhaps a friend, would do.
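
(Wysa has not published the details of its models, but as a rough illustration of what 'gauging tone, mood, and intensity' can mean in code, here is a minimal toy sketch. The word lists and cues below are invented for this example; they are not Wysa's.)

```python
# A toy mood-and-intensity gauge: lexicon lookups plus simple surface cues.
# The lexicons and rules here are illustrative assumptions, not Wysa's model.

NEGATIVE = {"sad", "hopeless", "anxious", "lonely", "tired", "worthless"}
POSITIVE = {"okay", "better", "calm", "hopeful", "grateful"}

def gauge_message(text: str) -> dict:
    words = text.lower().split()
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    mood = "low" if neg > pos else "steady"
    # Crude intensity cues: repeated exclamation marks, or shouting in caps.
    intensity = "high" if text.count("!") >= 2 or text.isupper() else "normal"
    return {"mood": mood, "intensity": intensity}

print(gauge_message("I feel so anxious and tired all the time"))
# {'mood': 'low', 'intensity': 'normal'}
```

A production system would use trained language models rather than word lists, but the shape of the problem – free text in, a mood signal out – is the same.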

"AI is currently mainly being used for automating some of their routine tasks like patient assessments, tracking patients' symptoms and flagging any indicators of mental health disorders found through parsing of textual, verbal or behavioural data," explains, Dr Megha Gupta, head of Artificial Intelligence at Wysa.

Apps like Wysa take it a step further.

Speaking to FIT, Smriti Joshi, Chief Psychologist at Wysa, says that the AI's responses won't be very tailored to you initially, but as you keep talking to it and feeding it information about yourself, it gets better at gauging how to respond.

"It is able to remember things from past conversations and bring it up. For instance, it might learn over time that you are the type of person for whom meditation may not be the best because you are someone who has used it in the past, and it's not worked for you. It also depends on how overwhelmed the user is at that time."
Smriti Joshi, Chief Psychologist at Wysa
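
(Wysa hasn't detailed how this memory works internally. One way to picture the idea is a per-user record of what hasn't helped, consulted before the bot suggests an exercise – a toy sketch, with all names and techniques hypothetical:)

```python
# Toy per-user memory: skip techniques the user has said didn't help.
# Structure, names, and technique list are hypothetical, for illustration.

user_memory: dict[str, set[str]] = {}   # user_id -> techniques that failed

def record_feedback(user_id: str, technique: str, helped: bool) -> None:
    if not helped:
        user_memory.setdefault(user_id, set()).add(technique)

def suggest(user_id: str, overwhelmed: bool) -> str:
    candidates = ["meditation", "breathing exercise", "journaling"]
    # When the user is overwhelmed, prefer the simplest option first.
    if overwhelmed:
        candidates = ["breathing exercise", "journaling", "meditation"]
    failed = user_memory.get(user_id, set())
    for technique in candidates:
        if technique not in failed:
            return technique
    return "talk to a human counsellor"

record_feedback("user42", "meditation", helped=False)
print(suggest("user42", overwhelmed=False))   # "breathing exercise"
```
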
One of the most attractive aspects of AI chatbots, perhaps, is their accessibility: they are available whenever you need them, for however long.

What Is the Future of AI in Healthcare in India?

In a country like India, AI could mean huge relief for an overstretched mental health sector.

According to Section 18 of the Mental Healthcare Act, 2017, every citizen shall have a right to access mental healthcare and treatment from government healthcare facilities. But in reality, there aren't enough mental health practitioners, and those who are available are not accessible to everyone.

According to data collected in 2014 by the World Mental Health Atlas, India has an average of 0.2 psychiatrists per 100,000 people. Compare this to the global median of 3 per 100,000.

The numbers haven't gone up since.

  • According to a study published in 2019, it will take 42 years to meet the requirement for psychiatrists in the country.

• Therapy can be expensive, with bills running into thousands of rupees per session – an unaffordable expense for most Indians.

• On top of all this, conversations around mental health are still marred by taboo, with many who need help feeling hesitant to seek it.
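
To put the density figures above in perspective, a back-of-envelope calculation – assuming a population of roughly 1.4 billion, which is this writer's illustrative figure – shows the scale of the shortfall:

```python
# Back-of-envelope: psychiatrist shortfall implied by the densities above.
# The population figure is an assumption for illustration (~1.4 billion).

population = 1_400_000_000
current_density = 0.2 / 100_000   # India: 0.2 psychiatrists per 100,000
target_density = 3 / 100_000      # global median cited above: 3 per 100,000

current = population * current_density    # ~2,800 psychiatrists
needed = population * target_density      # ~42,000 psychiatrists

print(f"Current: ~{current:,.0f}, needed at global median: ~{needed:,.0f}")
# Current: ~2,800, needed at global median: ~42,000
```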

In a situation like this, AI could be the bridge that closes the gap.

People are able to talk more freely to the chatbot, in the comfort and privacy of their own space, says Smriti Joshi.


What Are the General Concerns?

The use of AI in mental health services is undoubtedly brimming with potential, but a future with fully functional and safe AI therapists may still be far off.

Earlier this month, the World Health Organization (WHO) released a study on the applications and challenges of using AI in mental health research.

“We found that AI application use in mental health research is unbalanced and is mostly used to study depressive disorders, schizophrenia and other psychotic disorders. This indicates a significant gap in our understanding of how they can be used to study other mental health conditions,” said Dr Ledia Lazeri, Regional Advisor for Mental Health at WHO/Europe, in the statement.

There are other real-world concerns as well when it comes to implementing AI in mental health support.

Remember Rob Morris from Koko? In the Twitter thread, he also talks about how they had to pull the plug on the 'experiment': although it was wildly successful at first, once people found out it was an AI on the other end, they were less receptive to it.

"We find that people prefer talking to actual humans, and also in a physical setting," says Shruthi S, from YourDOST, another online counselling platform.


Can AI Replace Humans in Mental Health Support?

One of the major concerns that sceptics have brought up is that of safety. Is AI smart enough to navigate high-risk, sensitive situations? Will chatbots be able to mitigate risks of self-harm, or harm to others?

According to Shruthi S, this is a risk her organisation is not willing to take just yet.

"We have an AI bot on the platform that asks initial questions to find the right expert (from their database) for them. Our use of AI at the moment stops there," she says.

Smriti Joshi also underscores that the chatbot is not a replacement for an actual therapist. "We are not a treatment bot, rather a support bot."

The chatbot can't be left to its own devices just yet, she says, adding that many of the responses fed into it are clinically vetted by qualified professionals.

"Like when people say words like trauma, suicidal thoughts, self harm etc - we have created conversations around this because of the risks involved, we can't let the AI self respond from the training it has received."
Smriti Joshi, Cheif Psychologist, Wysa

"In case it comes across these keywords, then the bot gets into a loop telling them that it is not equipped to handle the situation, and that they should consult a professional," she adds.

Joshi explains that they also have qualified therapists on board for those who need them.


Privacy & Legal Security Issues Cannot Be Ignored

There's also the question of privacy and legal security. What if there's a leak of private data? What if the AI is not able to handle a sensitive situation, and someone ends up hurting themselves?

Who would be held accountable if the technology goes wrong? The law is vague on this.

India still doesn't have strong data protection laws, especially ones that are tailored to AI. There are also no clear regulatory approval processes in place for such services before they are launched into the market.

As the future fast approaches, these concerns will need to be addressed, along with questions of infrastructure and implementation, before AI therapists become a reality.

(At The Quint, we are answerable only to our audience. Play an active role in shaping our journalism by becoming a member. Because the truth is worth it.)
