“Nuclear Non-Proliferation Is the Wrong Framework for AI Governance” by Michael C. Horowitz, Lauren A. Kahn
The views in this article are those of the authors alone and do not represent those of the Department of Defense, its components, or any part of the US government.
In a recent interview, Demis Hassabis — co-founder and CEO of Google DeepMind, a leading AI lab — was asked if he worried about ending up like Robert Oppenheimer, the scientist who unleashed the atomic bomb and was later haunted by his creation. While Hassabis didn’t explicitly endorse the comparison, he responded by advocating for an international institution to govern AI, holding up the International Atomic Energy Agency (IAEA) as a guiding example.
Hassabis isn’t alone in comparing AI and nuclear technology. Sam Altman and others at OpenAI have also argued that artificial intelligence will have such a global impact that it requires an international regulatory agency on the scale of the IAEA. Back in 2019, Bill Gates, for example [...]
---
Outline:
(01:57) How AI Differs from Nuclear Technology
(02:31) AI is much more widely applicable than nuclear technology
(04:18) AI is less excludable than nuclear technology
(07:37) AI's strategic value is continuous, not binary
(09:22) Nuclear Non-Proliferation is the Wrong Framework for AI Governance
(11:44) Approaches to AI Governance that Are More Likely to Succeed
---
First published:
June 30th, 2025
Source:
https://aifrontiersmedia.substack.com/p/nuclear-non-proliferation-is-the
---
Narrated by TYPE III AUDIO.