AI and Mental Health: Friend or Foe?

Dr. Shimi Kang and Devinder Dhaliwal  Darpan, 11 Sep, 2025 08:39 PM

Artificial Intelligence (AI) is showing up everywhere—and now, even in mental health. Some apps claim they can track your mood, detect anxiety or depression, or give you “therapy-like” advice. But the real question is: can AI truly support your mental health, or is it just another version of “Dr. Google”? 

As mental health professionals, we believe AI can be helpful, but only with caution. It’s not a replacement for therapy or connection. Used responsibly, it can support clinicians and improve access to information. But misused, it can spread misinformation, create confusion, compromise privacy, and cause emotional harm.

Where AI Helps 

  • Supporting Mental Health Professionals 

AI can help therapists behind the scenes—like transcribing notes, organizing records, or flagging clinical reminders. This frees up time for what truly matters: listening and connecting with clients. 

  • Early Detection 

Some AI tools can detect early signs of emotional distress by analyzing speech or behavior patterns. While this is promising, these tools are just that—tools. They need trained professionals to interpret them and guide appropriate next steps. 

  • Access to Mental Health Information 

AI can guide users to helpful, evidence-based mental health resources. This can be a good starting point, but it’s not therapy. Real healing involves empathy, cultural understanding, and trust—things AI cannot replicate. 

Where It Falls Short 

  • The “Dr. Google” Trap & Echo Chambers 

We’ve all typed symptoms into Google and ended up convinced we’re facing the worst-case scenario. AI can do the same, but in a more personalized way. If you’re already feeling anxious or depressed, AI may start feeding you more of the same kind of information based on your searches.

This creates an emotional echo chamber, where AI confirms your worries instead of challenging them. For someone in a vulnerable state, this can intensify negative thoughts or delay getting real help. It might feel supportive, but it’s not offering clinical guidance—just algorithms responding to clicks. 

  • No Cultural Awareness 

Mental health in South Asian communities is layered—shaped by family expectations, religious beliefs, language, and generational gaps. AI doesn’t understand the pressure of being the “perfect child” or the silence around mental illness. Cultural context matters, and only a human being can truly grasp that. 

  • Privacy Concerns 

AI tools often collect and store data—sometimes without users fully realizing it. When that data includes emotional or mental health information, the risk of misuse or breach becomes more serious. 

So, Friend or Foe? 

AI can support mental health systems, but it’s not a substitute for human care. It may assist professionals or provide information, but it can’t offer the compassion, safety, or cultural sensitivity that true mental health care requires. 

Especially in South Asian communities—where stigma still exists—what we need is real, human understanding. Let’s use AI as a tool, not a therapist. 

Dr. Shimi Kang MD, FRCPC  

Dr. Shimi Kang is an award-winning Harvard-trained doctor, researcher, and keynote speaker specializing in the science of motivation. She founded Future-Ready Minds, is the host of ‘Mental Wealth with Dr. Shimi Kang’ on YouTube, and is the author of the #1 national bestselling parenting book ‘The Dolphin Parent.’ Her work focuses on mental health, addiction, and brain-related conditions, and includes assessments and treatments such as psychedelic-assisted therapy.

Devinder Dhaliwal 

Devinder Dhaliwal is a Registered Clinical Social Worker with extensive experience in mental health, addictions, and family support. He holds a Master of Social Work and has operated a successful private practice for over seven years. His professional background includes working with youth at risk of gang involvement, individuals experiencing homelessness, and families navigating complex relationship challenges. Raised in Abbotsford and now based in Chilliwack, Devinder values community, family, and lifelong learning, and enjoys golf, travel, and time with his three sons. 
