When AI Becomes Your Thinking Partner
Description
AI becomes a thinking partner, not a replacement, as Dan Sullivan and Dean Jackson compare their distinct approaches to working with artificial intelligence.
In this episode of Welcome to Cloudlandia, we explore how Dan uses Perplexity to compress his book chapter creation from 150 minutes to 45 minutes while maintaining his unique voice. Dean shares his personalized relationship with Charlotte, his AI assistant, demonstrating how she helps craft emails and acts as a curiosity multiplier for instant research. We discover that while AI tools are widely available, only 1-2% of the global population actively uses them for creative and profitable work.
The conversation shifts to examining how most human interactions follow predictable patterns, like large language models themselves. We discuss the massive energy requirements for AI expansion, with 40% of AI capacity needed just to generate power for future growth. Nuclear energy emerges as the only viable solution, with one gram of uranium containing the energy of 27 tons of coal.
Dan's observation about people making claims without caring if you're interested provides a refreshing perspective on conversation dynamics. Rather than viewing AI as taking over, we see it becoming as essential and invisible as electricity - a layer that enhances rather than replaces human creativity.
SHOW HIGHLIGHTS
Links:
WelcomeToCloudlandia.com
StrategicCoach.com
DeanJackson.com
ListingAgentLifestyle.com
TRANSCRIPT
(AI transcript provided as supporting material and may contain errors)
Speaker 1:
Welcome to Cloudlandia,
Speaker 2:
Mr. Sullivan?
Speaker 1:
Yes, Mr. Jackson.
Speaker 2:
Welcome to Cloudlandia.
Speaker 1:
Yes. Yeah. I find it's a workable place, Cloudlandia.
Speaker 2:
Very, yep. Very friendly. It's easy to navigate.
Speaker 1:
Yeah. Where would you say you are? You're inland now. You're not on
Speaker 2:
The beach. I'm on the mainland at the Four Seasons of Valhalla.
Speaker 1:
Yes. It's hot. I am adopting the sport that you were at one time really interested in. Yeah. That's my approach to AI: I hit the ball over the net, the ball comes back over the net, and then I hit the ball back over the net. It's very interesting to be in this thing where you get a return back over, it's in a different form, and then you put your creativity back on it. But I find that it's really making me into a better thinker.
Speaker 3:
Yeah.
Speaker 1:
Yeah. I've noticed, what is it now? I started in February of '24, and it's really making me more thoughtful. AI.
Speaker 2:
Well, it's interesting. I find you're absolutely right that the ability to rally back and forth with someone who knows everything is very directionally advantageous. I heard someone talking this week about how most of our conversations with other people are basically what he called large language model conversations. They're all essentially the same thing that you are saying to somebody, all guessing the next appropriate word. Right. Oh, hey, how are you? I'm doing great. How was your weekend? Fantastic. We went up to the cottage. Oh, wow. How was the weather? Oh, the weather was great. They're such predictable, LLM-type conversations and interactions that humans have with each other on a surface level. And I remember you highlighted that at certain levels, people talk about things, and then they talk about people, and at a certain level people talk about ideas, but it's very rare. And so most of society is based on communicating within a large language model that we've been trained on through popular events, through whatever media, whatever we've been trained or indoctrinated to think.
Speaker 1:
Yeah, it's a form of picking fleas off each other.
Speaker 2:
Yes, exactly. You can imagine that. That's the perfect imagery, Dan. That's the perfect imagery. Oh, man. We're just, yes.
Speaker 1:
Well, it's got us through a million years of survival. Yeah, yeah. But the big thing is that, I mean, my approach is a richer approach because there's so much computing power coming back over, but it's more of an organizational form. It's not just trying to find the right set of words here. The biggest impact on me is that somebody will give me a fact about something. They read about something, they watch something, they listen to something, and they give me the thought. And what I find is, rather than immediately engaging with the thought, I say, I wonder what the nine thoughts are that are missing from this.
Speaker 3:
Right?
Speaker 1:
Because I've trained myself on this 10 things, my 10 things approach. It's very useful, but it just puts a pause in, and what I'm doing is I'm creating a series of comebacks. One of them is, in my mind anyway, I don't always say this because it can be a bit insulting: you haven't asked the most important question here. And the person says, well, what's the most important question? I say, you didn't ask me whether I care about what you just said. Do you care? Yeah. And I think it's important to establish that when you're talking to someone, that something you say to them, do they actually care? Do they actually care?
Speaker 1:
I don't mean this in that they would dismiss it, but the question is, have I spent any time actually focused on what you just told me? And the answer is usually, if you traced me, if you observed me, if you had a complete surveillance video of my last year of how I spent my time, can you find even five minutes in the last year where I actually spent any time on the subject that you just brought up? And the answer is usually no. It's not that I've rejected it, it's just that I only had time for what I was focused on over the last year, and that didn't include any time spent on the thing that you're talking about. And I think about the saying on the wall at Strategic Coach: our eyes only see, and our ears only hear, what our brain is looking for.
Speaker 2:
That's exactly right.
Speaker 1:
Yeah. And that's true of everybody. That's just true of every single human being that their brain is focused on something and they've trained their ears and they've trained their eyes to pick up any information on this particular subject.
Speaker 2:
The more I think about this idea that we are all basically living in society as large language models, part of the reason that we gather in affinity groups, if you take Strategic Coach, we're attracting people who are entrepreneurs at the top of their game, who are growth oriented, ambitious, all of the things. And so in gatherings of those, we're all working from a very similar large language model, because we've all been seeking the same kind of things. And so you get a higher likelihood that you're going to have a meaningful conversation with someone, and meaningful only to you. If you look at that, yeah, it's very interesting. I just watched a series, I think it was on Netflix, no, it was on Apple TV, with Seth Rogen, and he was running a studio in Hollywood, took over at a large film studio, and he started
Speaker 1:
Dating. Oh yeah, they're really available these days.
Speaker 2:
He started dating a doctor, and so he got invited to these award events or charity-type events with this girl he was dating. And so he was an odd man out in this medical crowd where all these doctors were talking about what's interesting to them, and he had no frame of reference. So he was like an odd duck in this. He wasn't tuned in to the LLM of these medical docs. And so I think it's very interesting, these conversations that we're having by questioning AI like this, by questioning Charlotte, or you questioning Perplexity or whatever, that we are having a conversation where, I don't want to say this, we're not the smartest person in the conversation kind of thing. Often you can be in a conversation where you don't feel like the person is open to, or has even been exposed to, a lot of the ideas and things that we talk about when we're at Strategic Coach in a workshop or whatever. But to have the conversation with Charlotte, who's been exposed at a doctoral level to everything, it's very rewarding.
Speaker 1:
She's only really been exposed to what Dean is interested in.