Futuristic #39 – Chapter 3: The AI Revolution Will Not Be Televised
Description
After a six-week hiatus, Cameron and Steve return for a sprawling, charged conversation about AI, politics, ethics, and the future of civilization. Steve reveals he’s been 3D printing buildings for TV, while Cam unveils his bold new concept: _Chapter 3_, a movement to engineer the next phase of humanity before AI and robots rewrite society by default. They dig into Mirror World drift, political alignment tools, and why Australia isn’t even remotely ready for the revolution already underway. There’s talk of AI-led political parties, the death of Google search, capitalist collapse, and even starting a cult. Welcome to the next chapter.
FULL TRANSCRIPT
[00:00:00]
Cameron: This is Futuristic, episode 39, recorded on the 16th of May, 2025. Our first show in six weeks. Steve Sammartino.
Steve: I’m so sorry. I didn’t know it was that long, but we’re back and Cameron’s in the house, ready to learn us good, including English and grammar.
Cameron: Well, look, there’s been a whole lot of things going on, um, in the world of tech and AI in the last six weeks since we’ve been busy doing other stuff. Steve, do you wanna gimme a quick burst of, uh, what you are proudest of tech-wise in this period, but since we last spoke?
Steve: Yes, so I have, uh, been doing 3D printing for a national TV show. Printed five buildings in five days. I can’t say who it is, but its initials are The Block. So that is
Cameron: So it’s not your TV show. I thought this was your TV show. [00:01:00] You’re,
Steve: Not mine.
Cameron: doing it for them?
Steve: Yeah. Look, I, I think I can tell people, I just can’t show anyone anything, but, uh,
Cameron: Five buildings.
Steve: Yep. In five days.
Cameron: This is with, uh, what’s the name of your
Steve: Ah,
Cameron: building company?
Steve: Macro 3D, with Tommy.
Cameron: That’s right.
Steve: Named after him, because I’m not an egocentric guy. And, uh, this could be the breakthrough we’ve been looking for, ’cause we’ve, uh, we, uh,
Cameron: Sammo 3D doesn’t sound as good. Sammo 3D isn’t as good as Macro 3D.
Steve: It sounds real good.
Cameron: It does, yeah. Yeah, yeah.
Steve: so that’s that. And the other thing is I’ve been thinking a lot about Mirror World drift, and I just posted, uh, a blog on that and I had a
Cameron: Explain.
Steve: was awesome.
Well, I think that we’ve created this mirror world, which has been explored by people like Kevin Kelly, where we create a proxy for the world that we live in. But increasingly this proxy, which used to be just the digital version of us, increasingly it’s not us. It starts out with us using AI as tools, and then agents, and then proxies, and then the AIs talk to the AIs, and then they [00:02:00] develop language and conversations where we just drift out of this mirror world, because it’s no longer relevant to us or for us, and it becomes almost a new sphere.
Uh, which was something that was popularized in the early, uh, 20th century, where we kind of opt out and it becomes almost a, a new species, like an ocean where we just dip our toes in. But there’s a whole lot of species in there. We don’t understand what’s spawned them, we can’t talk to them, we don’t know them.
But like another big ecosystem, it has a huge impact on our lives, and yet it becomes this other world that we are not really associated with, even though we built it.
Cameron: Yeah, I, I, look, I think that’s kind of inevitable, um, not just Kevin Kelly, but I know that, um, Eric, um, fuck, what was his name?
Steve: Ler.
Cameron: No, no, no. The former CEO of Google for a long time,
Steve: Schmidt.
Cameron: Eric Schmidt’s been talking a lot about this for the last year or two, [00:03:00 ] how will start to develop their own language that’s more efficient, and then they’ll start talking to each other and he says, that’s when we need to pull the plug on the whole thing.
But that’s not gonna happen.
Steve: No.
Cameron: Um, yeah, I think that’s inevitable, and it’s very Philip K. Dick-ian. Uh, just this whole idea of human intelligence spawning a new kind of intelligence which becomes so vastly different to our own intelligence. Actually, one of the things in my show notes: a couple of weeks ago I watched a YouTube interview, mostly between Ben Goertzel and Hugo de Garis, guys I know a little bit. Hugo and I were on stage together at a Singularity conference about nine or 10 years ago down in Melbourne. Um, they’ve both been AI researchers for decades, and they were talking about where things are at, and, uh, Hugo was talking about alignment.
You know, you hear the AI researchers talk about alignment, which is to make [00:04:00] sure that the AI’s values are aligned to human values. And Ben, I think, said it’s kinda like squirrels at Yellowstone National Park. Like, are human values aligned with squirrels’ values? I guess at some level, you know, we both rely on oxygen, we both rely on the climate not getting too hot. We, we value certain things, but really, you know, we look at squirrels, we find them cute and interesting, and generally speaking we don’t wanna harm them. We don’t wanna hurt them. We want them to run around and do their thing, but we don’t really think about them on a day-to-day basis unless you’re a park ranger.
Steve: They’re outside of the consideration set unless you’re specifically working on ecosystems and their maintenance and importance. And I think Bostrom talked about that too when he did his first, uh, artificial superintelligence thing. He said, if, if we want to build a highway, look, we don’t want to hurt the ants, but if they’re in the way, the highway’s going in, it doesn’t matter. [00:05:00]
Cameron: Mm-hmm. Yeah, so they were basically saying, and I think this is right, that if we have a superintelligence, its relationship to us will be like our relationship to squirrels or ants. Anyway, listen, I wanna tell you what I’ve been thinking a lot about since you and I last spoke, and I’ve been dying to speak to you, because you are the guy I want to talk to about these sorts of things, right?
Steve: Thank you.
Cameron: You are the, you’re the only
Steve: Someone wants to talk to me?
Cameron: Someone wants to talk to you. You are the only guy I can have a serious conversation about this stuff with. It is politics. So we just had a federal election here, and of course my number one political issue right now, apart from the legalization of cannabis, is: what are we doing to prepare our society for the AI robot revolution that’s gonna hit in the next couple of years?
Uh, Steve’s doing a selfie. I’ve gotta put my gang sign up,
Steve: We got a, yeah,
Cameron: gang sign.
Steve: There we go.
Cam