
AI and Mental Health: Friend or Foe?

Dr. Shimi Kang and Devinder Dhaliwal  Darpan, 11 Sep, 2025 08:39 PM

Artificial Intelligence (AI) is showing up everywhere—and now, even in mental health. Some apps claim they can track your mood, detect anxiety or depression, or give you “therapy-like” advice. But the real question is: can AI truly support your mental health, or is it just another version of “Dr. Google”? 

As mental health professionals, we believe AI can be helpful, but only with caution. It’s not a replacement for therapy or connection. Used responsibly, it can support clinicians and improve access to information. But misused, it risks misinformation, confusion, privacy concerns, and emotional harm. 

Where AI Helps 

  • Supporting Mental Health Professionals 

AI can help therapists behind the scenes—like transcribing notes, organizing records, or flagging clinical reminders. This frees up time for what truly matters: listening and connecting with clients. 

  • Early Detection 

Some AI tools can detect early signs of emotional distress by analyzing speech or behavior patterns. While this is promising, these tools are just that—tools. They need trained professionals to interpret them and guide appropriate next steps. 

  • Access to Mental Health Information 

AI can guide users to helpful, evidence-based mental health resources. This can be a good starting point, but it’s not therapy. Real healing involves empathy, cultural understanding, and trust—things AI cannot replicate. 

Where It Falls Short 

  • The “Dr. Google” Trap & Echo Chambers 

We’ve all typed symptoms into Google and come away convinced we’re facing the worst-case scenario. AI can do the same thing, but in a more personalized way. If you’re already feeling anxious or depressed, AI may keep feeding you similar content based on your searches.

This creates an emotional echo chamber, where AI confirms your worries instead of challenging them. For someone in a vulnerable state, this can intensify negative thoughts or delay getting real help. It might feel supportive, but it’s not offering clinical guidance—just algorithms responding to clicks. 

  • No Cultural Awareness 

Mental health in South Asian communities is layered—shaped by family expectations, religious beliefs, language, and generational gaps. AI doesn’t understand the pressure of being the “perfect child” or the silence around mental illness. Cultural context matters, and only a human being can truly grasp that. 

  • Privacy Concerns 

AI tools often collect and store data—sometimes without users fully realizing it. When that data includes emotional or mental health information, the risk of misuse or breach becomes more serious. 

So, Friend or Foe? 

AI can support mental health systems, but it’s not a substitute for human care. It may assist professionals or provide information, but it can’t offer the compassion, safety, or cultural sensitivity that true mental health care requires. 

Especially in South Asian communities—where stigma still exists—what we need is real, human understanding. Let’s use AI as a tool, not a therapist. 

Dr. Shimi Kang MD, FRCPC  

Dr. Shimi Kang is an award-winning Harvard-trained doctor, researcher, and keynote speaker specializing in the science of motivation. She founded Future-Ready Minds, hosts ‘Mental Wealth with Dr. Shimi Kang’ on YouTube, and is the author of the #1 national bestselling parenting book ‘The Dolphin Parent.’ Her work focuses on mental health, addiction, and brain-related conditions, offering assessments and treatments including psychedelic-assisted therapy.

Devinder Dhaliwal 

Devinder Dhaliwal is a Registered Clinical Social Worker with extensive experience in mental health, addictions, and family support. He holds a Master of Social Work and has operated a successful private practice for over seven years. His professional background includes working with youth at risk of gang involvement, individuals experiencing homelessness, and families navigating complex relationship challenges. Raised in Abbotsford and now based in Chilliwack, Devinder values community, family, and lifelong learning, and enjoys golf, travel, and time with his three sons. 
