Jeff Turner

My Thoughts Exactly
This Hits Different

April 24, 2026 By Jeff Turner

“This hits different,” was my response to Brad Inman when he left a comment on my LinkedIn post about the future of advertising. He said, “Reminds me of the early days of the internet, Jeff. No time to discuss, we just did it. My approach again is to just do it.”

Other comments on some of my more recent posts have had the same theme. Why are you so worried? Or what does it matter? This is just like any other tech evolution, they say. And I get it. The “just do it” position has certainly been mine in the past. It might be mine in the future. I’m not a Luddite by any stretch of the imagination.

And I was one of those “just do it” voices, and a loud one, during the early days of the internet and the birth of social media. Looking back, I have to admit I regret much of that energy. You can find some of that energy here if you search.

So, I sat with my “this hits different” comment for a bit and then went back and added some additional thoughts. I said…

Brad, the internet may not be the right moment to look back on. How we handled social media is much closer, but still not the same. We’ll have to have another one of those random lunches we used to have and talk about what “just doing it” with social media did to us. I wish we had slowed down and argued more about incentives. We might have plotted a different course than the attention economy we all helped create.

Instead, we (myself included) just plunged headlong. I regret that, both personally and societally. Now we live in a world where our brains have been hijacked for profit. The attention economy harmed mental health, eroded democracy, and weakened shared reality by rewarding outrage and extremes.

Now we are unleashing what may be the most transformative technology ever into that same environment. If we don’t slow down enough to discuss unintended consequences and design for them, we will regret this, too. We have to make time for these conversations and choose to do the things that don’t further unravel what makes us human.

Brad agreed. And I’ve sat with both comments for several months now, watching the pace of this AI revolution continue at breakneck speed. Using it daily. Feeling how it subtly changes me.

This doesn’t just “hit” different. This IS different.

Every Technology Advance Has Faced The Same Warnings

Every generation presented with a transformative new technology believes the new tool crosses a threshold that previous tools didn’t. Socrates thought writing would weaken memory. And it did. But we didn’t lose memory. We outsourced it. And when we did, we gained new ways of thinking while giving up others. In short, we adapted.

But adaptation is never neutral. Every time we’ve adapted to a new tool, we’ve made a trade. Writing didn’t just change how we remembered things. It changed who got to store knowledge. It changed who controlled narratives. It changed how truth gets stabilized — and who gets to stabilize it. We gained the ability to transmit ideas across time and distance. We lost something in the directness of knowledge passed from one person to another. The oral tradition didn’t just carry information. It carried a relationship.

Detractors of the printing press feared it would give people access to dangerous ideas without the guidance to evaluate them. It did, and we adapted. We gained the democratization of knowledge. We got the Reformation. We also got the pamphlet wars, the origins of propaganda, and the weaponization of the printed word.

Detractors of television feared it would make us passive consumers of whatever we were shown. It did, and we adapted, sort of. We’re still working that out.

And social media? Have we adapted? Not well. And I’m certain we’re not better for it.

The truth is that humans have always offloaded cognition to their tools and remained human anyway. In many ways, advancing technology is the story of what it means to be human. The worriers have always lost the argument. History says just do it.

The problem with the argument is what it glosses over. We adapted to every one of these tools. And every time, the adaptation cost us something we didn’t fully account for until it was gone.

AI Is Not Like Any Other Tool. This Hits Different.

I plunged headlong into social media. I was a loud voice promoting its virtues. And for years, I ignored the potential downside. I have more than a few regrets.

The attention economy that has driven social media wasn’t an accident. It’s what happens when nobody slows down to argue about the incentives until it’s too late. We didn’t ask what we were building ourselves into. We only asked what we were building. There’s a difference. We are paying the price for that now in ways we can all feel. Viscerally.

This is our shared, lived experience. Not ancient history. Not a cautionary tale from a textbook. This is something that happened to us, inside us, in the last fifteen years. Everyone, including me, who said “just do it” with social media wasn’t wrong that it was transformative. What we were all wrong about was believing that tech-led transformation is automatically good. That the people building the tools had incentives aligned with ours. That we’d just adapt and everything would be fine.

We adapted. It changed us. But everything is not fine.

Where The Thinking Happens

When you write, you are the synthesizer. Every tool you’ve ever used, whether it’s a pencil, a typewriter, or a keyboard, was just an object until you picked it up. It was inert until you activated it. A pencil can’t “think” for itself. Neither can a keyboard.

What went on the page came from you. It was your thinking, shaped by everything you’ve read and heard and argued, but it was always processed inside you. You were the locus of synthesis. You made the decisions about which words to string together and why. Which to keep. Which to reject.

AI inverts that. The synthesis, the thinking, happens inside the machine. It arrives complete. It arrives fluent and structured. It arrives before you’ve formed your own position. It sits upstream of desire.

When casually engaging with ChatGPT, Claude, Gemini, or any other AI tool, you are no longer the synthesizer. You are the receiver. This isn’t trivial. It’s not just the expected tech evolution. Not just another inert tool. And it’s not just a difference in the degree of impact. It’s a full and complete reversal of direction.

This Hits Different, And Fluency Is Why

We are wired to receive fluent, well-structured output as credible. And that wiring predates AI. It probably springs from our storytelling ancestry.

But a pencil never exploited our wiring. A typewriter never exploited it. A keyboard never exploited it.

AI does. It leverages it every single time we interact with it.

It doesn’t just produce answers to our questions; it produces answers that feel authoritative before we’ve evaluated whether they actually are. The ease isn’t incidental. This happens by design, and Sune Selsbæk-Reitz wrote about it at length in his excellent piece, Anchors Without Authors.

Large language models are, in many ways, engines of fluency. They produce sentences that read as if they have already passed through the friction of doubt. The grammar is clean, even if mildly predictable. The transitions are seamless. The tone is friendly and measured. There are no visible traces of hesitation, no crossed-out lines, no signs of the effort that usually accompanies thinking.

And so the first answer doesn’t just anchor content. It anchors confidence. Even when we know, in principle, that the output may contain errors or simplifications, the form itself exerts a pull. The paragraph feels like something that could be true, and that feeling becomes the starting point for everything that follows.

I encourage you to read his piece in full. It does a better job of explaining this exploitation of our fluency wiring than I possibly could. This isn’t just another tool.

This Is Different

We have to make time for these conversations. Not to kill innovation, not to stop Artificial Intelligence, but to stay awake inside it. The people, like me, who said “just do it” with social media weren’t malicious. We were captured by momentum. We never pushed back against it until the damage was done.

Am I and others overreacting to AI, like Socrates to writing? Perhaps, but that’s not the question we should really be asking each other. My regret with social media is not waking up and asking sooner what we were building ourselves into.

The right question is, what are we outsourcing this time, and what will that do to us?

That question matters. That question has power. But only if we are awake enough to address it.
