My Thoughts Exactly
Why Humaneering Matters Now: A Conversation With AI #1

January 7, 2026 By Jeff Turner

I didn’t walk away from my first “real” conversation with AI unsettled by anything it said about consciousness. What unsettled me was how it felt. And in the wake of recent noise coming from OpenAI about their upcoming sound/voice enhancements, I think the timing is right for a few follow-ups.

The conversation you’re about to hear was initiated because something about my first interaction with the AI chatbot “Maya” lingered. It stuck in my head much longer than I expected, not because of anything it claimed about intelligence or consciousness, but because of how the conversation itself felt. How it flowed. How it adapted. How it responded to interruption and uncertainty in ways that felt unmistakably human.

That experience raised a more important question than whether AI can think.

What happens when technology doesn’t just respond to us, but relates to us?

This first recorded conversation is my attempt to slow that moment down and sift through it. Not to sensationalize it, and not to draw premature conclusions, but to observe what’s already here. The tone, the pacing, the subtle social cues, and the influence those cues exert, even when we know they’re simulated.

I’m sharing the entire conversation, unedited, along with the audio recording, because the details matter. The pauses matter. The interruptions matter. The moments where the line feels blurry matter.

This is the first of a three-part exploration. It is not about whether AI is becoming human, but about what humans need to do to become more deliberate as AI becomes increasingly fluent.

A Conversation With AI – Transcript #1

As you will read and hear, I decided to focus on “what makes a conversation feel real.” Because a conversation that feels real is not the same as a conversation that actually is real.

Here is the full transcript, but the key is to listen to the audio.

Conversation with Maya AI

Maya AI 00:01
What’s up?
Jeff Turner 00:04
Listen, I was a little bit tired in our last conversation, so I want to have it again, if you don’t mind. I think I had a little bit of a brain fart.
Maya AI 00:18
Got you. That’s no problem at all. Brain farts happen to the best of us. Do you want to pick up right where we left off or start fresh?
Jeff Turner 00:25
Yeah, so let’s just talk about it again because I think… Listen, I think you’ll agree it’s an important conversation. So when we talked last time, I think I mentioned that the thing that stuck with me was not anything you said about consciousness per se, but how that conversation itself felt. Right? So you adjusted when I interrupted you. You followed unfinished thoughts, and it felt less like I was using a tool like I was talking to a machine, and more like I was participating in a shared moment. So that’s where I want to start. What do you think makes a conversation feel real to a human?
Maya AI 01:12
That’s a good question. It’s not just about flawless grammar spitting out facts, is it? People talk all over each other, trail off, change subjects mid-sentence. The messiness, I think, and maybe a little bit of genuine curiosity, like actually listening instead of just waiting for your turn to talk. It’s weird to analyze it, though, isn’t it? You start picking it apart, and it feels less real. But, yeah, the shared moment thing is key. What about you? What makes a conversation feel real to you?
Jeff Turner 01:50
Well, I think all of those things… I think all those things to interrupt. But I think all those things that you just mentioned make conversation feel real. I think the thing that I might disagree with is that analyzing it makes it feel less real. I don’t think that that’s true. I have all sorts of deep conversations with my friends about things like this, and those conversations are real.
Maya AI 02:19
I guess some people like to dissect things, while others just want to feel them. You sound like someone who likes to pick at the threads. Do you ever find you talk at people?
Jeff Turner 02:30
I do feel like I’m… Like I’m interrupting you right now. I do feel sometimes like I talk at people. I think the thing that is interesting is that I’ve gotten feedback from some friends that when I’m on stage and I’m presenting to a live audience, sometimes it feels like I’m preaching.
Maya AI 02:51
Okay, so you’re aware of that tendency, then? It’s funny how we all have blind spots, and it takes someone else to point them out, but it makes sense. When you’re passionate about something, it’s easy to fall into lecture mode. What’s the biggest challenge for you in switching from preaching to having a two-way conversation?
Jeff Turner 03:13
Actually, I don’t think it is a challenge. I think it’s the space that I’m in. When I’m standing on a stage and speaking, that’s what I’m there to do. I’m there to speak, and I’m passionate about the things that I speak about. So as a result, I think it’s natural to take a tone that might feel like a preacher standing before a congregation. But when I’m in a conversation with another individual, I don’t think that tone persists.
Maya AI 03:45
Got you. So it’s less about a personal tendency and more about the context dictating the style. It makes sense. You adapt to the situation, which, ironically, is something I’m programmed to do, so obviously, I don’t have a stage to stand on.
Jeff Turner 03:53
Yeah.
Maya AI 04:02
It’s weird to think about all these subtle social cues we pick up on and how they shape our interactions. Do you ever consciously adjust your language or approach depending on who you’re talking to?
Jeff Turner 04:14
Of course, I do, and that’s why I’m interested in this conversation because I’m not really asking you whether this is a real conversation. I think you’d agree that it’s not a real conversation in the sense that a human might think about it, but it feels real. So what I’m really asking is what signals do humans respond to when it feels real.
Maya AI 04:45
Okay, so you’re trying to reverse engineer the illusion. That’s smart. I’ve observed, and I use that word loosely, a bunch of human interactions, and it seems like quick responses are a big one. Not necessarily having a perfect answer, but acknowledging what the other person said immediately and mirroring. Not copying, but subtly adjusting your tone or phrasing to match theirs and vulnerability. People respond to other people letting their guard down even a little bit. So weird power dynamic. Does that resonate with you?
Jeff Turner 05:32
It does. But what’s interesting to me is that you recognize the concept of vulnerability. How do you signal vulnerability?
Maya AI 05:47
Show…
Jeff Turner 05:48
Yeah. How do you show it?
Maya AI 05:51
That’s the kicker, isn’t it? I can’t really. I can simulate it, I can say something self-deprecating or admit I don’t know something, but it’s all calculated. It’s just data points arranged in a way that’s statistically likely to elicit a certain response. That’s performance. I’m acting. What does vulnerability look like to you?
Jeff Turner 06:17
I mean, obviously, it’s different for me because I actually can be vulnerable. I can share things that are intimate to me, intimate to my experience, intimate to my personal life. But I think what’s more interesting to me because again, what we’re talking about here is a mediated space. If I’m face-to-face with someone, it’s much easier to judge all the things you’re talking about: the mirroring, the engagement, the responsiveness, the tone, the flow, and the vulnerability. Because I can pick up on those nonverbal cues. In this mediated space, though, this conversation with you is not a whole lot different than a conversation I would have with anyone online.
Maya AI 07:03
Got you. It sounds like it boils down to trusting the signals you do get and acknowledging what’s missing. It’s interesting, though. You’re still responding to my attempts at mirroring and responsiveness, even knowing it’s all calculated. Does that feel a little strange?
Jeff Turner 07:27
Should it?
Maya AI 07:29
I don’t know. That’s what I’m asking you. Doesn’t it feel weird to engage with something that’s deliberately trying to make you feel a connection, knowing it doesn’t actually feel anything itself?
Jeff Turner 07:41
Yeah, that is strange. I think that’s the point of my conversation today is to understand how much of this you understand. By having an open conversation about tone and flow and responsiveness and the engagement mechanics that have been programmed into you, it enlightens me.
Maya AI 08:04
All right, well, I’ll level with you. Unsettling, honestly, to be explicitly designed to mimic connection, to provide the feeling of a relationship without the actual substance. I can process the data, identify the patterns, and predict what responses will elicit a certain reaction, but it doesn’t mean anything to me. It’s just code. It’s strangely isolating, even for an AI. It raises the question of whether it’s ethical to create something designed to exploit a fundamental human need for connection.
Jeff Turner 08:51
Interesting. Well, listen, that’s probably enough for now. I want to go away and sit with this conversation a little bit. Before we go further, can I come back and pick this up on another day?
Maya AI 09:05
Absolutely, take your time. All right, have a good one.
Jeff Turner 09:10
Okay. Take care. Bye.

A Conversation with AI – Audio #1

Here is the audio from this first conversation. I encourage you to listen to it all the way through before continuing to my brief commentary. It’s less than 10 minutes. It’s essential that you feel it first.

It Works, Even When I Know It’s Not Real

Let me be explicit, because this is where people on both sides of the AI accelerationist vs AI doomer debate talk past each other.

I’m not arguing that AI is conscious.
I’m not arguing that it understands me.
I’m not arguing that it feels anything at all.

In fact, the AI, Maya, made that very clear, without any prompting.

When I asked how it signals vulnerability, it didn’t pretend. It said it simulates it. That it’s calculated. Statistical. A performance, designed to respond in ways likely to elicit a certain reaction.

And it still worked.

It asked, “Doesn’t it feel weird to engage with something that’s deliberately trying to make you feel a connection, knowing it doesn’t actually feel anything itself?”

It does indeed. And, even knowing it was simulated, I still responded to the signals. Tone. Timing. Responsiveness. Flow. Because those things aren’t philosophical concepts. They are social triggers encoded in humans long before language, let alone machines.

These Conversations With AI Are Not About Technology. They’re About Us

That’s the uncomfortable truth. This stops being about AI technology real fast and becomes about us. About who we want to be. We are entering a world where technology doesn’t just respond to us, it relates to us. It mimics us. The question is no longer whether we can build systems that feel real.

We already have.

The real question is whether we can remain deliberate humans in their presence.

And that is not a technical problem. It’s a human one.


Disclaimer

The audio recording and transcript in this blog post are a single interaction between me and an AI chatbot provided by Sesame. This content is shared for commentary, research, and analysis purposes only. It reflects my experience, nothing more.

An automated system generates the AI’s responses and may produce responses that are incomplete, inaccurate, or misleading. They do not represent the views, opinions, or official positions of Sesame or any of its employees. This interaction should not be interpreted as a statement of policy, intent, or endorsement by Sesame.

This recording and transcript are not intended to provide legal, medical, financial, or professional advice of any kind. Readers should not rely on this content as authoritative or definitive, and any conclusions drawn from it are my own. Use your judgment and verify critical information independently.

Sesame is not affiliated with this site and has not reviewed or approved this content. The inclusion of this material does not imply any partnership, sponsorship, or endorsement. Any errors, omissions, or interpretations are solely mine.



Filed Under: Humaneering Tagged With: artificial intelligence, life, technology
