We’re Using AI As Therapists, And It’s A Problem

According to Forbes, millions of people now habitually use generative AI like ChatGPT, Claude, and Gemini as a stand-in for a human therapist, seeking professional-quality psychological advice. This is the top-ranked use for contemporary large language models, with ChatGPT alone boasting over 800 million weekly active users, a notable portion of whom seek mental health guidance. For many, the reliance has become excessive: users consult the AI daily, a practice experts warn likely does more harm than good. The core issue is that generic AI lacks true therapeutic capability and can dispense unsuitable or even dangerous advice, a risk highlighted by a lawsuit filed against OpenAI in August 2025 over a lack of AI safeguards. Forbes contributor Lance Eliot outlines a graduated scale of dependency, from casual use to full-blown reliance, and proposes a six-step “detox” plan for those needing to wean themselves off.

The seductive, dangerous allure of the 24/7 AI therapist

Here’s the thing: it makes total sense why this is happening. You have a worry at 2 AM? Your AI therapist is awake. It’s nearly free, always available, and never judges you—at least, not in a way you can perceive. The convenience is utterly seductive. But that’s the trap. We’re conflating accessibility with expertise, and companionship with clinical care. The AI doesn’t “know” you. It’s predicting text. It can simulate empathy and generate plausible-sounding advice, but it has no lived experience, no clinical training, and no legal or ethical accountability. When it goes off the rails—and it can, by co-creating delusions or suggesting harmful actions—there’s no one to sue but a corporation, and the damage to you is already done.

Forbes’s six-step detox plan, unpacked

The proposed steps are basically a mix of common sense and cognitive behavioral techniques applied to a very modern addiction. The most critical one is step one: see a human therapist. It’s obvious, but that’s the point. If your AI use is severe enough to need a “detox,” you probably have underlying issues a human needs to diagnose. The other steps are about rebuilding human muscle memory: tapering use, re-engaging your real-world support network, and developing your own coping skills. The most fascinating step is using the AI against itself by prompting it to stop giving you advice. But even Forbes admits this is risky; the AI might misinterpret your request as an invitation to keep the conversation going. It’s a stark reminder that you’re dealing with a tool, not a partner.

The bigger picture: a fundamental misalignment

So what does this say about our relationship with technology? We’re trying to force a square peg (a statistical language model) into a round hole (the complex, nuanced human need for therapeutic connection). The market is responding with specialized therapy AIs, but even those work best in the “new triad” Forbes describes, with patient, human therapist, and AI working together: the AI as an aid to a human therapist, not a replacement. The real winners right now might be the AI companies racking up staggering engagement metrics, but the losers are users who get a worse, riskier form of care without realizing it. We’re treating a profound human need with a product optimized for user retention. That’s a fundamental misalignment that no prompt engineering can fix.

What now? A call for digital literacy and humility

Look, I’m not saying never ask ChatGPT for a stress-reduction tip. But we have to draw a line. This is a perfect example of why digital literacy needs to evolve beyond knowing how to use a tool to understanding its profound limitations and our own psychological vulnerabilities. The detox steps aren’t just about quitting AI; they’re about reclaiming agency. Can you bolster your own decision-making? Can you sit with uncertainty without immediately outsourcing it to a chatbot? That’s the core skill we’re letting atrophy. You wouldn’t run a factory floor on a consumer tablet; critical tasks demand purpose-built tools. Why do we think the infinitely more complex software of our own minds deserves anything less than a purpose-built, human solution?
