We are outsourcing our inner voice.
People are turning to ChatGPT not just for ideas or clarity but for therapy, coaching, and even emotional regulation.
It sounds smart.
It responds fast.
It never judges.
It always “listens.”
And that’s precisely why it’s dangerous.
Not because it says the wrong things.
But because it says the right things too easily.
Insight without effort is a trap.
We mistake fluency for wisdom. Articulation for understanding. Supportive tone for truth. But growth isn’t found in smooth answers. It lives in tension. In friction. In discomfort.
This is the psychology of learning and development:
Neuroplasticity (our brain’s ability to rewire itself) requires challenge. Reflection. Repetition.
According to Daniel Siegel, psychiatrist and neuroscientist, “Integration comes from differentiation and linkage.” But if you never sit with the raw, hard parts (never differentiate), you can’t integrate.
What do you integrate when a machine spoon-feeds you pre-digested insight?
Real coaching and therapy demand struggle.
They create space for silence. For confusion. For tension that doesn’t get resolved in 30 seconds.
Ask any therapist.
Ask any coach worth their salt.
They’ll tell you that the pause after a hard question is often the most important part of the session.
AI doesn’t wait.
It fills.
It solves.
It smooths.
And in doing so, it removes the resistance required to grow.
The core illusion: It feels like progress.
LLMs simulate empathy. But they don’t understand pain. They don’t hold you in your story. They don’t flinch when you share something raw.
They reflect what you tell them.
But they don’t see what you don’t say.
They can’t hold a mirror to your blind spots.
They can’t say, “That’s a pattern.”
They won’t interrupt your narrative when you’re avoiding what matters.
That’s not support.
That’s uncritical affirmation.
And in therapy or coaching, uncritical affirmation isn’t healing.
It’s harmful.
There’s science behind this.
A recent study of 496 users of the AI chatbot Replika found that the more people relied on it, the more their real-life social skills declined (Psychology Today, 2024).
They got validation on demand. But over time, they got worse at tolerating conflict, expressing themselves with nuance, and interpreting emotional cues from actual humans.
The APA has warned that AI therapy tools, especially unregulated ones, could mislead users and worsen outcomes for vulnerable people.
Chatbots might affirm your anxiety instead of challenging it. They might reflect your self-doubt instead of grounding you in your capabilities. And they can’t pick up on the tears in your eyes, or the silence that says everything.
Privacy? That’s another illusion.
You’re not talking to a friend. You’re talking to a server. Your deepest thoughts are now data. Permanent. Stored. Possibly used to retrain the very model you’re pouring your heart into.
You wouldn’t put your therapy journal in a filing cabinet at Meta or OpenAI. But that’s exactly what you’re doing — just with better UX.
And the long-term risks are unknowable.
That’s the cost of convenience.
We are becoming strangers to ourselves.
When you ask a machine how you feel, you stop asking yourself. When you seek answers from a system that doesn’t feel, you stop building the capacity to feel for yourself.
Psychologically, this is internalization in reverse.
Instead of integrating experience and developing emotional intelligence, we externalize everything to a polite pattern recognizer with no skin in the game.
When you remove the human, you remove the relationship, and the relationship is what heals.
Boredom matters.
Reflection needs boredom.
Creativity needs slowness.
Intuition needs stillness.
But we’ve killed the gaps. We’ve made every idle moment interactive. Responsive. Stimulating.
Sociologist Sherry Turkle warned, “Technology doesn’t just change what we do; it changes who we are.” When every moment of discomfort is filled with chatbot companionship, we lose the ability to be with ourselves.
We don’t just lose skills.
We lose selfhood.
This isn’t an anti-AI rant.
Used well, LLMs are powerful tools.
They can help you clarify, explore, and rehearse.
They can support your growth.
But they cannot be your conscience.
They cannot be your therapist.
They cannot be your coach.
Because coaching and therapy are not just information.
They are transformation.
And transformation requires relationship. Challenge. Commitment. Presence.
So, I’ll leave you with this.
The danger of LLM therapy isn’t that it gives bad answers.
It’s that it gives too many good ones too fast, too easily, too agreeably.
And you trade depth for speed.
Self-awareness for coherence.
Wisdom for pattern-matching.
Don’t outsource the work.
Don’t outsource the silence.
Don’t outsource you.
Because the one thing AI can’t do is make you human.