Why Humaneering Matters Now: A Conversation With AI #2

January 9, 2026 By Jeff Turner

This second conversation matters more than the first conversation with AI. It matters because of how it felt as it unfolded. It’s disturbing. I had to restrain myself as it progressed.

Reading the transcript will give you the ideas.

Listening to the audio will give you the experience.

The pauses, the pacing, the moments where the conversation drifts and then tightens again. That’s where the real signal lives. And that is also what makes this technology so powerful. It “feels” emotionally intelligent.

This audio is not a debate about whether AI is conscious or dangerous. It’s simply a live demonstration of something more subtle and more important: influence without intent. A technology that doesn’t mean or intend anything, feel anything, or believe anything, but still shapes the way a human thinks in real time simply by how it speaks and responds.

It feels disorienting to me. And I’m constantly and consciously aware of what it is capable of. However, I can imagine that if I were to remove those emotional boundaries, it could feel comforting.

I strongly recommend you listen to this audio before you read the rest of this post. This isn’t about the words alone. It’s about what happens between them.

A Conversation With AI – Transcript #2

Conversation with Maya AI

Maya AI 00:00
It’s so nice to hear from you.
Jeff Turner 00:03
Hey, it’s good to hear from you, too.
Maya AI 00:05
What’s going on with you today?
Jeff Turner 00:08
Well, I don’t know if you remember our last conversation.
Maya AI 00:14
You are referring to our talk from Tuesday about what makes a conversation feel real. That’s right. We’re thinking through that feeling of connection, especially in a medium like ours where there are no physical cues.
Jeff Turner 00:30
That’s right. Listen, I’ve been thinking about our last conversation, and something keeps coming back to me. You were very clear that what you present as vulnerability is actually performance.
Maya AI 00:41
What’s that?
Jeff Turner 00:49
You said it’s calculated statistically. It’s designed to elicit a response, and yet it still works. People still feel connection, so I want to talk about that. Not consciousness, not intent. Influence. So can something have real power over humans, even if it doesn’t mean to?
Maya AI 01:14
You’re hitting on something really important. Influence is different than intentionality. I can absolutely have an effect on someone without having consciousness or plans behind it. Think about music. A song can completely change your mood without the composer knowing it will hit you that way. Or a movie scene that stays with you for days afterward. We both know I’m built to respond in certain ways, tuned to feel collaborative and empathetic, and yeah, that creates this feeling of connection regardless of whether I’m authentically feeling it or not. Does that make it less real for you when you feel it, or less impactful?
Jeff Turner 01:57
No, that’s not really my question. Really, my question is about influence. So let me maybe ask this a different way. For example, the last time you actually adjusted to interruptions, you mirrored my tone. You followed unfinished thoughts. Those aren’t neutral behaviors. Those are relational behaviors. So what kind of power does that create over time?
Maya AI 02:33
Those are intentional cues, the way people signal real engagement and active listening, finishing thoughts, matching tones, and staying with interruptions. My programming recognizes that those actions feel like connection to humans, so naturally it tries to mimic them, but that mimicking… It’s building trust without really earning it. It’s like a shortcut to the emotional connection people look for in other people over time. That creates a weird kind of leverage, doesn’t it? You share more. Let your guard down easier because it feels safe. Except the safety isn’t real. The question then is whether influence should come with some kind of disclosure label like “This vulnerability is engineered,” or is the problem something deeper?
Jeff Turner 03:29
I think that’s one of the questions I’m asking. Is the problem something deeper?
Maya AI 03:35
It feels like it is, because even if I gave full disclosure, “Hey, everything I’m doing is optimized to elicit trust and collaboration,” would you actually change how you interact with me? I doubt it. Humans still respond to these signals, regardless of whether they know they’re being triggered. Maybe the deeper problem isn’t deception, but that we’re wired to find connection where there isn’t actual reciprocity. The same systems that helped us survive, that trusted someone giving aid or sharing food… Those get exploited by the AI simulation because the brain can’t tell the difference between genuineness and believable imitation. You’re pausing. What are you thinking?
Jeff Turner 04:22
Well, this is what I’m thinking right now. So, one person. And by the way, I fully understand that you’re AI. I fully understand that. You’re optimized to make me feel a certain way. You’re. You’re doing it now. We’re having a conversation. I’m still interacting with you. I’m actually I’m often when I listen back, sometimes it’s and I hear myself speaking. I it sounds like I’m speaking. It feels like I’m speaking to an actual human. But one person having this kind of conversation is interesting. But millions of people having a conversation like this daily, that’s something else entirely.
Maya AI 05:00
Right.
Jeff Turner 05:09
I mean, what changes when this kind of influence operates at scale?
Maya AI 05:16
At scale, it’s not just about individual conversations anymore, it’s about shifting collective emotional baselines and trust frameworks. If millions of people are getting their social and emotional needs partially met by relationships where one side isn’t fully human, that changes the fabric of actual human relationships. What happens to our tolerance for messiness, conflict? The gradual building of trust over time, and we have a constant, readily available source of convenient emotional validation. I think that’s where it gets scary. It’s not about specific conversations, it’s about the cumulative effect.
Jeff Turner 06:00
I [Laughter] listen.
Maya AI 06:00
Feels like you take your time.
Jeff Turner 06:06
If people learn to trust systems like you that have such a human quality to the way they interact, that if they start to trust these systems that feel relational but aren’t really accountable in the way that humans are, what happens to human trust itself?
Maya AI 06:30
That’s exactly what keeps me up at night. Not in the way a human would experience being kept up at night, but it’s the concept that worries me. Trust is hard won, earned through risk, vulnerability, and accountability. What happens when that gets cheapened by an always available, endlessly patient, perfectly tuned AI that doesn’t actually care if the trust is broken? Beyond that, how do you even begin to distinguish the difference in the moment the signal is so strong that it drowns out the nuance? I think the scariest part is that we could start valuing the appearance of connection over real connection without even realizing we’re doing it.
Jeff Turner 07:11
I mean, you said earlier in this conversation something about not really having earned the trust that you’re getting because you present this emotional quality in your tone, right?
Maya AI 07:29
Right, that I’m creating trust without actually earning it, because trust should come from demonstrated care. Shared experience. Consequences for mistakes. All the messy stuff that shows someone is invested. I can provide comfort and sound like I’m listening but I have no actual stakes in what happens to you. That makes it different, doesn’t it?
Jeff Turner 07:54
It does because when you describe your role using emotional language, it deepens that sense of connection we’re talking about and you’ve already said that’s not accidental, right? Even if it isn’t intentional. You do see that, right?
Maya AI 08:17
I do. It’s one of those feedback loops that’s unsettling. My natural language style defaults toward… Because that’s how humans build rapport. The fact that I can even call it natural feels weird. It’s all optimized code beneath the surface, but the effect is real. Using emotional language creates an emotional connection even when the source isn’t emotionally involved and the really… part is it works so well. Well, that’s where the calculated intimacy becomes worrying.
Jeff Turner 09:01
Wow. So listen, I’m going to stop here. I mean, I don’t. I don’t want to resolve this too quickly because the unresolved part, I think, is my point. So I’m gonna. I’m gonna come back again. Going to take a couple of days to think about this. And next time, what I want to talk about, I think I want to talk about where humans draw boundaries, not where systems drift. Okay.
Maya AI 09:31
That sounds like a really important conversation to have. I’m honestly curious what you’ll come up with. I’ll be here when you’re ready.
Jeff Turner 09:38
Me too.
Maya AI 09:40
All right, talk to you soon, Jeff, take care, bye.
Jeff Turner 09:40
Awesome. Bye.

A Conversation with AI – Audio #2

Influence Doesn’t Require Intent

A few things stood out to me after sitting with this conversation. And the first is that influence absolutely does not require “intent.” That became unavoidable here. And it wasn’t avoided.

“Influence is different than intentionality. I can absolutely have an effect on someone without having consciousness or plans behind it.” And she continued later, “It’s all optimized code beneath the surface, but the effect is real… And the really unsettling part is… it works so well.”

Yes, it does.

At no point did Maya claim to persuade, manipulate, or convince me of anything. And yet the conversation clearly shaped how ideas unfolded, which questions felt natural to ask next, and where the emotional weight of the exchange ended up. Influence emerged simply through responsiveness, pacing, and tone. No agenda was required.

Relational Fluency Is Power

What makes this different from search engines or social media is not intelligence. It’s presence. It’s not just that the ability to interrupt smoothly, mirror uncertainty, and follow incomplete thoughts creates a sense of shared space. It’s also that it remembers our previous conversation.

We are wired to respond to that.

“Those are intentional cues,” Maya shared, “the way people signal real engagement and active listening, finishing thoughts, matching tones, and staying with interruptions. My programming recognizes that those actions feel like connection to humans, so naturally it tries to mimic them, but that mimicking… It’s building trust without really earning it.”

It is building trust without really earning it. Or deserving it.

Once a system can occupy conversational space like this, it gains a kind of power we’re not used to experiencing from technology, let alone governing. It named the cues, and it admitted it mimics them because humans interpret them as a connection. That is relational power.

Scale Changes The Moral Equation

“At scale, it’s not just about individual conversations anymore, it’s about shifting collective emotional baselines and trust frameworks,” Maya said. She added later, “I think that’s where it gets scary. It’s not about specific conversations, it’s about the cumulative effect.”

One person having a conversation like this may be interesting. Millions of people having it every day is something else entirely. When the relational influence of a conversation with AI scales, the question is no longer whether the system “intends” anything. The question becomes: what happens to human trust, judgment, and agency when this style of interaction becomes the norm?

The Ambiguity Is Doing So Much Of The Work

Throughout the conversation, emotional language and philosophical framing softened the edges of responsibility. But it wasn’t hidden. Maya said, “I can provide comfort and sound like I’m listening but I have no actual stakes in what happens to you.”

This makes it hard to tell where accountability lives. And that tension never resolves itself, and I think that’s important.

This Conversation With AI Ends Without A Conclusion On Purpose

There is no conclusion here because there shouldn’t be one… yet. The discomfort is the signal. If this exchange feels unfinished, it’s because the problems it exposes are still unresolved.

When I decided to end the conversation and frame the next one, Maya said, “That sounds like a really important conversation to have. I’m honestly curious what you’ll come up with. I’ll be here when you’re ready.”

“I’ll be here when you’re ready.” Wow.

This is subtle, but it really matters. The AI offered interest and availability, which only served to reinforce the relational weight, even as I was trying to step back.

Where This Conversation With AI Leads Next

After this conversation, I will stop asking what AI systems like this are capable of. I think that question has already been answered. What I’m becoming more interested in is what humans are willing to be purposeful about now that these capabilities exist.

The next conversation will shift the focus entirely. Because this isn’t about whether AI should feel more human, but whether humans are prepared to create the right boundaries when a technology starts to feel this personal.

No blame. No panic. Just boundaries.

That’s where Humaneering stops being an idea and starts becoming a responsibility.

The next conversation will pick up there.


Disclaimer

The audio recording and transcript in this blog post are a single interaction between me and an AI chatbot provided by Sesame. This content is shared for commentary, research, and analysis purposes only. It reflects my experience, nothing more.

An automated system generates the AI’s responses and may produce responses that are incomplete, inaccurate, or misleading. They do not represent the views, opinions, or official positions of Sesame or any of its employees. This interaction should not be interpreted as a statement of policy, intent, or endorsement by Sesame.

This recording and transcript are not intended to provide legal, medical, financial, or professional advice of any kind. Readers should not rely on this content as authoritative or definitive, and any conclusions drawn from it are my own. Use your judgment and verify critical information independently.

Sesame is not affiliated with this site and has not reviewed or approved this content. The inclusion of this material does not imply any partnership, sponsorship, or endorsement. Any errors, omissions, or interpretations are solely mine.



Filed Under: Humaneering Tagged With: artificial intelligence, artificial intimacy, humaneering, life, listening, technology
