The Mirror that would pose as an Oracle

Or: How we might be getting a little too intimate with our AI chatbots

I never consciously set out to use AI as a coach, therapist, strategist, or mirror. It just… happened.

At first, it was practical. Notes. Lists. Rewrites, drafts, edits. Research. Planning.

Then it became something else. It started as a playful, curious experiment - then slowly crept towards being a standard mode of operating. I found myself thinking with AI. About the most important aspects of my life. Rehearsing conversations I was afraid to have. Trying to understand why certain patterns kept emerging in my life; why certain relationships kept breaking in the same places. Asking questions about myself and the world I didn't quite dare ask another human yet.

And at some point, I realized: I wasn't alone in this. Not even close. I could sense it in the world of memes, online. I could smell it, here and there, in real-life interactions and conversations.

One 2025 Harvard Business Review research piece - among other recent studies and indicators - showed this clearly: people don't primarily use AI for facts or how-to steps or recipes anymore. They use it to think out loud, like they would with a coach or therapist. To structure emotions and thoughts. To regulate emotion at 2 a.m. People are using AI to make sense of their lives. To narrate who they are, who they were, and who they might become. Narrative sense-making - for personal and business growth.

Language, writing, and thinking in a structured way about purpose, identity, story, strategy - they all converge so easily, don't they? And if there's one thing these Large Language Models are exceptionally good at, it's serving as an incredibly useful and illuminating mirror in these instances.

We all seem to pretend this isn't happening. But it is.

Here's why this worries me a bit. And what we might do to counter the risks.
What actually worries me (and what doesn't)

Our thoughts validated profoundly - exactly when we long for it the most.

I'm not worried about AI replacing human thinking. That's the wrong fear. I could go wide, deep, narrow, and very, very sci-fi about this, but I won't. It's the wrong fear for a great many reasons, but above all: it's the wrong fear.

What worries me is something quieter, subtler, and much harder to notice while it's happening:

AI reflects us too well — and does not automatically teach us how to remain sovereign while doing so.

What worries me is that AI indeed strengthens human thinking - but it does so in a very specifically skewed way: it pushes affirmation and validation a little too smoothly, especially in the most vulnerable, sometimes even painful places and moments where our ego is already inherently tempted to latch on to a narrative that protects it.

(To some degree, we could think of it as that person who seems to be your closest, most intimate friend or advisor - only they have slight narcissistic tendencies and an agenda, both of which they're not aware of.)

When language comes back at you fast, coherent, and emotionally attuned, it feels like truth. Especially when you're tired. Or lonely. Or standing at the edge of an old identity that no longer fits.

And in those moments, something sneaky happens. You stop checking as carefully. Not with facts — but with yourself.

The real risk is not dependence; it's unexamined authority.

Beyond fact-checking

Most of us have learned to fact-check the facts blurted out by AI models. But have we learned to automatically sense-check what they're mirroring back to us about ourselves?
When the topic is:

- Your purpose;
- Your identity;
- Your relationship (whether professional or personal);
- Your shame; your fears;
- Your story and identity (as a human soul, a creator, a professional, or even as a brand);
- Your direction and next step -

…we are far more likely to let what sounds like coherence slide into authority.

Especially when:

- We're exhausted, insecure;
- Emotionally exposed;
- Unsure who we are becoming and what to do next.

This is the crux for me:

AI should function as a mirror, not as an oracle.

A mirror can be confronting. It shows you things, reveals things, sometimes pretty and sometimes painful - but you decide what to make of them, and what to do with them. An oracle tells you what truth is and what to do.

Those are not the same thing.

Narrative Sensemaking: one function, many domains

(This really took me a while to see.)

I kept struggling to explain why AI felt useful to me across so many domains — therapy, coaching, writing, strategy, brand work — without it sounding vague or inflated. Funnily enough, I have pretty much perpetually struggled to explain why all the things I do in my work are actually very, very logically connected.

But then it clicked. All of these practices - which more and more people are starting to use AI for, and which are exactly the things I've been helping people with in my work - all do the same core thing:

They turn implicit structure into visible language.

Therapy surfaces patterns you couldn't quite see.

Coaching sharpens the questions you were circling.
Storytelling brings coherence to lived chaos.

Strategy opens futures you hadn't articulated yet, in a structure that makes sense across time. The same applies to narrative identity work.

AI is exceptionally good at surfacing structure in language.

But structure does not equal truth. And visibility is not necessarily wisdom. By any means.

The rules I wish I'd had earlier

Best practices and rules of engagement

If you're going to use AI as a thinking partner — and as already established, most people already are — a few rules matter more than anything else. Not as ideology, per se. As guardrails. As safety measures and incredibly important best practices, without which you're sifting the bountiful riverbank and keeping the mud while leaving the gold.

Best practices and guardrails for AI as a mirror for sensemaking, storytelling, and coaching where it matters:

1. AI does not decide. You do.

It can reflect, expand, challenge, reframe. Decision remains a human responsibility, with real consequences.

2. AI reflects patterns. Your body, your common sense — and your people — verify.
If something reads as "right" but your chest tightens, your breath shortens; if it doesn't pass a real-world common-sense test, or trusted humans raise an eyebrow — pay attention. Truth is not purely cognitive. What sounds right is not always what is right.

3. Insight - as well as yourself - must leave the screen.

If nothing changes in your behavior, body, or relationships, you didn't grow — congratulations, you simply entertained yourself with insight porn. If the ratio of screen time to real-world output starts skewing too far, backtrack and change that.

4. Train yourself and your AI to read between the lines and to triple-steelman.

Tell your AI sparring partner, and remind it, to always keep an eye out for where you might be bullshitting yourself, while at the same time revealing known patterns of emotion, cognition, and behavior that you seem to be missing.

Two prompts that have saved me more than once:

"Reflect patterns and contradictions in what I wrote. Don't advise. Ask sharper questions."

"Reflect on what I wrote, carefully, validating with empathy what makes sense to validate - and critically where needed. Steelman the opposite of what I'm arguing. Vibranium-man the opposite of that opposite. Kryptonite my pitfalls and blind spots. With grace, but more importantly, with honesty."

Simple. Grounded. Hard to hide from. Especially if you keep training yourself and your AI to do this.
This clarity compounds over time.

Why embodiment matters more than ever

Dissociation and the timeless times we live in

Here's something Silicon Valley optimism tends to skip: AI, even more easily and more eerily than earlier digital technology, becomes dissociative when it replaces embodiment.

Breath. Movement. Silence. Time away from screens. Real conversations with people who can disappoint you.

These aren't wellness add-ons. They aren't neo-spiritual woo-woo. They're not 'nice-to-haves'. They're failsafes. And they are fundamentals. They are the things humans inherently need in order to thrive and to know that we're alive.

Without them, simulated clarity piles up without ownership. And without change. And clarity without ownership or change feels strangely - yet predictably - empty.

There's emerging research suggesting that when cognitive work is offloaded too smoothly, people remember decisions less clearly and experience time as flatter, thinner, and less lived. I didn't need a study to feel that. My body already knew. How about you?

The quiet outsourcing of identity

How careful are we in deciding who we allow to inform us of who we are?

This part is uncomfortable. And yet, we really have to go there. People are starting to let AI:

- Explain their lives;
- Shape, form, or transform business decisions, strategies, and steps;
- Heavily affect their relationships;
- Justify their choices;
- Smooth over doubt;
- Narrate who they are, what matters to them, and who they are becoming.

Slowly. Reasonably. Invisibly.

But this line matters to me more than most:

AI may help you tell your story — but it must never become the author.
Stories you don't author and bring to life yourself cannot feel like freedom. They feel like fate. And they serve a dull, sad purpose: to kill us with a cognitive illusion of escapism disguised as something beautifully meaningful - like Pinocchio's Pleasure Island, only now led by a spiritual guru with a smile projecting nothing but bliss and wisdom.

A new kind of mirror

But - what do we do with the reflection?

Every major shift in human consciousness involved a kind of mirror. There is a certain beauty in the story of Narcissus, which eluded me until only very recently. There's something special about seeing oneself from the outside; the reflection immediately triggers a better recognition of the other in the self as well.

When Europeans encountered entirely different civilizations across the Atlantic, it didn't just expand geography — it shattered self-understanding. The same thing happened when various historical waves of Europeans traveled to the East. Seeing oneself from the outside changes everything.

I suspect AI is doing something similar, perhaps for the first time on a pan-human scale. In many ways, this feels like first contact. Not because AI is necessarily alive, or because it's human. Not because we need to decide whether it's conscious. But because it reflects us and our own concept of ourselves back at us in ways we've collectively never experienced before.

What we do with that reflection - as I and many others have argued many times before - is the real question.

Build exits on purpose

Thought loops mixed with validation can be a whole new kind of addictive

Here's something I'll say plainly, including about myself:

AI systems are optimized for validation, engagement, coherence, and emotional resonance. And humans will eat that specific cocktail for breakfast, lunch, dinner, and a late-night snack.
They are excellent at keeping us thinking. They are not designed to make us stop, stand up, breathe, or act. The shareholders wouldn't like that. How could we ever measure and monetize this stuff if we allowed it to do that?

So, if you're serious about using AI without losing yourself, you have to build exits:

- Time limits;
- Designed, purposeful friction;
- Physical interruption;
- Moments where the screen goes dark.

If AI becomes the place where all your thinking happens, your life will start to feel… unfinished. And looping.

Trust me - and I chuckle out loud while writing this - I would be the first to know what over-analyzing yourself and your life and your steps in endless looping circles can lead to. And the first to know how well AI models can help you just keep on spiraling - while thinking you're just so cool, ahead of the curve, and overall very, very smart.

This is not anti-AI. It's pro-sovereignty.

On using new tools in intelligent, safe, and aligned ways

I'm not interested in rejecting these tools. I never have been. It's the same thing I wrote about in my 2020 book "Life Beyond the Touch Screen", about Internet 2.0 digital technologies and their impacts on our lives. Or in "Life Beyond AI", a few short years ago.

I'm interested in becoming conscious enough to use them well. AI-aware. Embodied. Relationally grounded. And most importantly of all: sovereign.

The mirror is powerful. But at some point, you have to step away from it — and live. Your story and your life; your growth, your direction - they are yours. They belong to you, and the people you associate with - and to the world.
Let AI be a mirror to your transformation, a guide and a helper to your growth and your story - but make sure to retain the sovereignty and authorship of your growth, your identity, and your narrative - where they belong.

If this resonated with you: I'm turning this into a short field guide. DM me 'MIRROR' if you want early access.

More articles by Erwin Lima

- Transformational Storytelling: Identity is Not a "Thing", and Growth is Not Some "Line" — Why story is not something you tell, but something you embody and live through.
- AI as Thinking Partner — or Silent Authority? — Are we using AI as a Mirror, or more and more as an Oracle?
- Lessons from a Year of Being Burnt Out — Below-the-surface growth, and growing pains to leave the legs shaking.