The Rip Current with Jacob Ward

AI Has Us Lying to One Another (and It's Changing How We Think)

Update: 2026-01-02

Description

Okay, honest admission here: I don’t fully know what I think about this topic yet. A podcast producer (thanks Nancy!) once told me to “let them watch you think out loud,” and I’m taking that advice to heart, because the thing I’m worried about is already happening to me.

Lately, I’ve been leaning hard on AI tools, God help me. Not to write for me (a little, sure, but for the most part I still do that myself) but to help me quickly get acclimated to unfamiliar worlds. The latest unfamiliar world is online marketing, which I do not understand AT ALL but now need to master to survive as an independent journalist. And here’s the problem: the advice these systems give isn’t neutral. First, it’s not really “advice” at all; it’s just statistically relevant language regurgitated as advice. Second, because these systems vacuum up language wherever they can find it, their suggestions come with online values baked in. I know this (I wrote a whole fucking book about it), but I lose track of it in my desperation to learn quickly.

I’m currently trying to analyze who it is that follows me on TikTok, and why, so I can try to port some of those people (or at least those types of people) over to Substack and YouTube, where one can actually make a living filing analysis like this. One of the metrics I was told to prioritize? Disagreement in the comments. Not understanding, learning, clarity, the stuff I’m after in my everyday work. Fighting. Comments in which people want to argue with me are “good,” according to ChatGPT. Thoughtful consensus? Statistically irrelevant.

Here’s the added trouble: it’s one thing to read that advice and filter out what’s unhelpful. It’s another thing entirely to do so in a world where all of us are supposed to pretend we had the thought ourselves.

AI isn’t just helping us work faster. It’s quietly training us to behave differently — and to hide how that training happens. We’re all pretending this output is “ours,” because the unspoken promise of AI right now is that you can get help and still take the credit. (I believe this is a fundamental piece of the marketing that no one’s saying out loud, but everyone is implying.) And the danger isn’t just dishonesty toward others. It’s that we start believing our own act.

There’s a huge body of scientific literature showing that lying about a thing causes us to internalize the lie over time. The Harvard psychologist Daniel Schacter wrote a sweeping review of the science in 1999 entitled “The Seven Sins of Memory,” in which he synthesized a range of studies showing that memory is reconstructive (we build each belief on top of prior beliefs rather than drawing on a perfect replay of reality) and that repetition and suggestion can implant or strengthen false beliefs that feel subjectively true. Throw us enough ideas and culturally condition us to hide where we got them, and eventually we’ll come to believe they were our own. (And to be clear, I knew a little about the reconstructive nature of memory, but ChatGPT brought me Schacter’s paper. So there you go.)

What am I suggesting here? I know we’re creating a culture where machine advice is passed off as human judgment. I don’t know whether the answer is transparency, labeling, norms, regulation, or something else entirely. So I guess I’m starting with transparency.

In any event, I do know this: lying about how we did or learned something makes us less discerning thinkers. And AI’s current role in our lives is built on that lie.

Thinking out loud. Feedback welcome. Thanks!


Jacob Ward