Artificial Intelligence (AI) is showing up everywhere—and now, even in mental health. Some apps claim they can track your mood, detect anxiety or depression, or give you “therapy-like” advice. But the real question is: can AI truly support your mental health, or is it just another version of “Dr. Google”?
As mental health professionals, we believe AI can be helpful, but only with caution. It's not a replacement for therapy or human connection. Used responsibly, it can support clinicians and improve access to information. Misused, it can spread misinformation, create confusion, compromise privacy, and cause emotional harm.
Where AI Helps
- Supporting Mental Health Professionals
AI can help therapists behind the scenes—like transcribing notes, organizing records, or flagging clinical reminders. This frees up time for what truly matters: listening and connecting with clients.
- Early Detection
Some AI tools can detect early signs of emotional distress by analyzing speech or behavior patterns. While this is promising, these tools are just that—tools. They need trained professionals to interpret them and guide appropriate next steps.
- Access to Mental Health Information
AI can guide users to helpful, evidence-based mental health resources. This can be a good starting point, but it’s not therapy. Real healing involves empathy, cultural understanding, and trust—things AI cannot replicate.
Where It Falls Short
- The “Dr. Google” Trap & Echo Chambers
We’ve all typed symptoms into Google and come away convinced we’re facing the worst-case scenario. AI can do the same, but in a more personalized way. If you’re already feeling anxious or depressed, AI may keep feeding you more of the same kind of content, shaped by your searches and prompts.
This creates an emotional echo chamber, where AI confirms your worries instead of challenging them. For someone in a vulnerable state, this can intensify negative thoughts or delay getting real help. It might feel supportive, but it’s not offering clinical guidance—just algorithms responding to clicks.
- No Cultural Awareness
Mental health in South Asian communities is layered—shaped by family expectations, religious beliefs, language, and generational gaps. AI doesn’t understand the pressure of being the “perfect child” or the silence around mental illness. Cultural context matters, and only a human being can truly grasp that.
- Privacy Concerns
AI tools often collect and store data—sometimes without users fully realizing it. When that data includes emotional or mental health information, the risk of misuse or breach becomes more serious.
So, Friend or Foe?
AI can support mental health systems, but it’s not a substitute for human care. It may assist professionals or provide information, but it can’t offer the compassion, safety, or cultural sensitivity that true mental health care requires.
Especially in South Asian communities—where stigma still exists—what we need is real, human understanding. Let’s use AI as a tool, not a therapist.
Dr. Shimi Kang MD, FRCPC

Dr. Shimi Kang is an award-winning Harvard-trained doctor, researcher, and keynote speaker specializing in the science of motivation. She founded Future-Ready Minds, is the host of ‘Mental Wealth with Dr. Shimi Kang’ on YouTube, and is the author of the #1 national bestselling parenting book ‘The Dolphin Parent.’ Her work focuses on mental health, addiction, and brain-related conditions, offering assessments and treatments, including psychedelic-assisted therapy.
Devinder Dhaliwal

Devinder Dhaliwal is a Registered Clinical Social Worker with extensive experience in mental health, addictions, and family support. He holds a Master of Social Work and has operated a successful private practice for over seven years. His professional background includes working with youth at risk of gang involvement, individuals experiencing homelessness, and families navigating complex relationship challenges. Raised in Abbotsford and now based in Chilliwack, Devinder values community, family, and lifelong learning, and enjoys golf, travel, and time with his three sons.