Carl Feynman, AI Engineer & Son of Richard Feynman, Says Building AGI Likely Means Human EXTINCTION!

Update: 2025-07-04

Description

Carl Feynman earned his Master’s in Computer Science and B.S. in Philosophy from MIT, then spent four decades working as an AI engineer.

He’s known Eliezer Yudkowsky since the ’90s and watched Eliezer’s AI doom argument take shape before most of us were paying any attention!

He agreed to come on the show because he supports Doom Debates’ mission of raising awareness of imminent existential risk from superintelligent AI.

00:00 - Teaser

00:34 - Carl Feynman’s Background

02:40 - Early Concerns About AI Doom

03:46 - Eliezer Yudkowsky and the Early AGI Community

05:10 - Accelerationist vs. Doomer Perspectives

06:03 - Mainline Doom Scenarios: Gradual Disempowerment vs. Foom

07:47 - Timeline to Doom: Point of No Return

08:45 - What’s Your P(Doom)™

09:44 - Public Perception and Political Awareness of AI Risk

11:09 - AI Morality, Alignment, and Chatbots Today

13:05 - The Alignment Problem and Competing Values

15:03 - Can AI Truly Understand and Value Morality?

16:43 - Multiple Competing AIs and Resource Competition

18:42 - Alignment: Wanting vs. Being Able to Help Humanity

19:24 - Scenarios of Doom and Odds of Success

19:53 - Mainline Good Scenario: Non-Doom Outcomes

20:27 - Heaven, Utopia, and Post-Human Vision

22:19 - Gradual Disempowerment Paper and Economic Displacement

23:31 - How Humans Get Edged Out by AIs

25:07 - Can We Gaslight Superintelligent AIs?

26:38 - AI Persuasion & Social Influence as Doom Pathways

27:44 - Riding the Doom Train: Headroom Above Human Intelligence

29:46 - Orthogonality Thesis and AI Motivation

32:48 - Alignment Difficulties and Deception in AIs

34:46 - Elon Musk, Maximal Curiosity & Mike Israetel’s Arguments

36:26 - Beauty and Value in a Post-Human Universe

38:12 - Multiple AIs Competing

39:31 - Space Colonization, Dyson Spheres & Hanson’s “Alien Descendants”

41:13 - What Counts as Doom vs. Not Doom?

43:29 - Post-Human Civilizations and Value Function

44:49 - Expertise, Rationality, and Doomer Credibility

46:09 - Communicating Doom: Missing Mood & Public Receptiveness

47:41 - Personal Preparation vs. Belief in Imminent Doom

48:56 - Why Can't We Just Hit the Off Switch?

50:26 - The Treacherous Turn and Redundancy in AI

51:56 - Doom by Persuasion or Entertainment

53:43 - Differences with Eliezer Yudkowsky: Singleton vs. Multipolar Doom

55:22 - Why Carl Chose Doom Debates

56:18 - Liron’s Outro

Show Notes

Carl’s Twitter — https://x.com/carl_feynman

Carl’s LessWrong — https://www.lesswrong.com/users/carl-feynman

Gradual Disempowerment — https://gradual-disempowerment.ai

The Intelligence Curse — https://intelligence-curse.ai

AI 2027 — https://ai-2027.com

Alcor cryonics — https://www.alcor.org

The LessOnline Conference — https://less.online

Watch the Lethal Intelligence Guide, the ultimate introduction to AI x-risk!

PauseAI, the volunteer organization I’m part of: https://pauseai.info

Join the PauseAI Discord — https://discord.gg/2XXWXvErfA — and say hi to me in the #doom-debates-podcast channel!

Doom Debates’ mission is to raise mainstream awareness of imminent extinction from AGI and build the social infrastructure for high-quality debate.

Support the mission by subscribing to my Substack at DoomDebates.com and to youtube.com/@DoomDebates



Get full access to Doom Debates at lironshapira.substack.com/subscribe
