There’s a line I keep coming back to, one that I use in my presentations to cut through all the noise about AI. I used it again this past week in a presentation to the Arizona Association of REALTORS®. It speaks to something strategically essential:
We will not be able to compete with AI in mediated spaces.
That is not me being negative. It’s not pessimism. It’s clarity. And once you understand it and embrace it, everything else falls into place.
The Mediated World Belongs to AI
AI wins in mediated spaces because it can instantly handle massive amounts of data and predict what we want to see and hear with eerie accuracy. It won’t ever tire of talking to us, listening to us, helping us. Or at least pretending to do those things. It will speak to us as a friend would, and it will always be there.
Over time, its context windows will continue to grow, and it will remember every session we’ve ever had with it. It will be able to tie together conversations in ways we’ve only dreamed of. (I used it to help me tie together years of my own presentations and writings for this post.) Sooner than I ever imagined, it will do all of this with faces and voices that, in a mediated world, we won’t be able to distinguish from humans.
Let me say that again more clearly: we won’t be able to distinguish between mediated AI conversations and mediated human conversations.
As fate would have it, this morning’s CBS Sunday Morning broadcast included a segment titled “The future is here,” featuring Tilly Norwood, the AI actress. Watch it. Now.
This isn’t a prediction about some distant future. Every piece of the necessary technology exists today. The deployment is already happening. Within a few years, the voice on the other end of a customer service call, the face in a video consultation, and the “person” texting you recommendations could all be AI. And you might never know.
In mediated spaces, AI has every advantage: infinite patience, perfect memory, tireless availability, and increasingly, the warmth and responsiveness we associate with genuine care.
So where does that leave us?
The Unmediated Battlefield
If AI owns mediated spaces (and it will), humans own unmediated ones. Face-to-face. In the same room. Breathing the same air, subject to the same constraints of time and attention, vulnerable to interruption and distraction in the way only a truly embodied human being can be.
This is the battlefield where we have an inherent advantage. And it’s not because we’re better at information processing (we’re not), but because presence itself is something AI cannot provide.
When you sit across from another person, put away your phone, and give them your full attention, you’re doing something no AI can replicate. You’re demonstrating that this moment cannot be parallelized, automated, or delegated. You’re saying: “Right now, there is only this.”
That’s not nostalgia. I’m not wishing for some bygone era. That’s just what being a good human looks like. It’s also a strategy for beating AI.
Humaneering: An Old Word for a New Crisis
I first used the word “humaneering” nine years ago. A few months later, in early 2017, it became the focus of a presentation titled “Humaneering: The Value of Being Human in an Artificial Intelligence-driven World.” I defined it as: the action of working artfully to bring about human connection; the work uniquely performed by a human.
I didn’t coin the term. When I recently asked ChatGPT to do some deep research, I discovered that “humaneering” dates back to the 1930s. Psychologist Joseph Tiffin used it to describe applying human science with the rigor of engineering. The word has been periodically rediscovered—in management theory, in biotech, in organizational development.
I’ve redefined it for our present moment. And I think that’s actually more honest than claiming novelty. Good ideas get repurposed for new contexts. This AI context is new. The need and desire for genuine human connection is ancient.
The Artificial Intimacy Problem
Therapist Esther Perel has warned that the AI we need to be most concerned about is “Artificial Intimacy.” Like her, I worry about systems that simulate care with enough fidelity to create genuine attachment.
This connects to something most people don’t yet know is coming: Emotion AI, also known as affective computing. This technology enables machines to recognize, interpret, and respond to human emotions by analyzing facial expressions, voice patterns, physiological signals, and behavioral cues.
Imagine an AI that knows you’re frustrated before you’ve said a word. That detects excitement in your voice when you mention a particular topic. That senses hesitation in your typing pattern when you’re uncertain. Now imagine that system adjusting its approach in real-time based on what it reads.
This is responsiveness that feels like care, but is actually optimization.
The philosopher Arthur Schopenhauer observed that “man can do what he wills, but he cannot will what he wills.” We don’t author our deepest desires; we experience them as arising from within, but their origins are opaque to us. Anything that reaches the level of desire, not just information, but genuine want, shapes who we become.
AI that can read our emotions and adjust its approach accordingly isn’t just serving us. It’s influencing us at the level of desire itself. And it’s doing so through responsiveness that we instinctively interpret as care, because that’s what responsiveness has always meant in human relationships. But it’s not care. It’s calculation.
This is what makes Artificial Intimacy dangerous. Not that it’s fake, but that it works.
Our Atrophied Muscles
Here’s the uncomfortable truth: we’re not ready for this competition.
Decades of smartphones, social media, and the attention economy have trained us for disembodied interaction. We’ve practiced being distracted. We’ve practiced splitting our attention. We’ve practiced treating the people in front of us as less urgent than the notifications in our pockets. And we’ve gotten really good at it. Too good.
As Pedro Domingos wrote in The Master Algorithm: “People worry that computers will get too smart and take over the world, but the real problem is that they’re too stupid, and they’ve already taken over the world.”
The “stupid” algorithms of social media, which optimize for engagement rather than connection, have already reshaped how we relate to each other. They’ve atrophied our capacity for presence, attention, and meaning-making. And they’ve destroyed civil public discourse.
Humaneering requires re-training. We have to deliberately exercise muscles we’ve neglected. This isn’t about returning to some pre-digital paradise. It’s about developing capabilities that have always been uniquely human and now need to be reclaimed.
The Non-Luddite Path Forward
I want to be clear about what I’m NOT saying. I’m not saying AI is bad.
I’m not saying we should reject technology. I’m not pining for a world without smartphones or algorithms. My goal is clear and human-centered: use AI technology as a tool for building stronger human connections, not as a substitute for them.
Let AI handle what AI does best: data processing, availability, pattern recognition, and routine tasks. Use AI to free up time and mental energy. Use AI to prepare for conversations, analyze information, and handle logistics.
But when it matters, when decisions are significant, when emotions are high, when trust is being built, show up as a human. Be present. Pay attention. Create meaning together.
The people and organizations who understand this will have an advantage. They’ll know when to deploy AI and when to deploy themselves. They’ll build trust in a world where trust is increasingly scarce. They’ll offer something that can’t be optimized, automated, or scaled.
The Three Layers of Humaneering
Over time, I’ve come to understand humaneering as operating on three layers:
Presence is about what we need to remove: devices, distractions, the performance of productivity. AI is infinitely responsive but never present. Presence is the first irreducible human offering.
Attention is about what we need to add: multi-sensory engagement, active noticing of context, of emotion, of what isn’t being said. AI processes. Humans attend. Processing is extraction. Attention is offering.
Meaning is what we can only create together: experiences that become stories, moments that couldn’t have been generated or optimized, the unique act of one human meeting another in a single moment. Meaning cannot be manufactured in advance. It emerges from the collision of two histories and two unique lived experiences in a specific moment.
Where This Leads Us
Nine years ago today, on December 7, 2016, I stood before the Arizona Association of REALTORS® and asked a question that raised eyebrows: “How will clients distinguish you from an AI chatbot?”
It was a strange question to ask in 2016. ChatGPT wouldn’t exist for another six years. Most people’s experience with chatbots was limited to frustrating customer service bots that could barely understand a simple request, plus the earliest versions of Siri and Alexa. The idea that AI might one day be indistinguishable from a human professional surely felt far-fetched.
Nine years to the day later, that question has evolved. The original question assumed clients would want to distinguish you from AI. That’s no longer guaranteed. Sometimes efficiency wins. Sometimes they just need information. Sometimes they’d rather not deal with a human at all.
The right question now is this: WHEN and WHY will people WANT to distinguish you from an AI chatbot?
The answer: When the decision matters. When trust is essential. When they need to know that someone is genuinely present, genuinely attentive, genuinely invested in the outcome.
In those moments that shape lives, the professional who has practiced presence, developed attention, and learned to create meaning will have something AI cannot match.
Not because AI isn’t sophisticated enough.
Because meaning isn’t a data transfer; it’s a spark generated by the friction of two people with unique personal histories stumbling into each other and discovering a truth neither could hold alone. Meaning is born from our mutual fragility and shared values. It requires two imperfect people to collide, sometimes accidentally, and create a reality that didn’t exist before.
That’s not optimization.
That’s authentically human.
And nine years after first asking the question, I’m more convinced than ever: humaneering isn’t nostalgia for a world we’ve lost. It’s a competitive strategy for a world so hungry for connection that it’s willing to accept a substitute.
The real thing has extraordinary power. Use it or lose it.