“New Statement Calls For Not Building Superintelligence For Now” by Zvi
Description
Building superintelligence poses large existential risks. Also known as: If Anyone Builds It, Everyone Dies, where ‘it’ is superintelligence and ‘dies’ means that probably everyone on the planet literally dies.
We should not build superintelligence until that changes, and until the risk of everyone dying as a result, along with the risk of losing control over the future, is very low. Not zero, but far lower than it is now or soon will be.
Thus, the Statement on Superintelligence from FLI, which I have signed.
Context: Innovative AI tools may bring unprecedented health and prosperity. However, alongside tools, many leading AI companies have the stated goal of building superintelligence in the coming decade that can significantly outperform all humans on essentially all cognitive tasks. This has raised concerns, ranging from human economic obsolescence and disempowerment, losses of freedom, civil liberties [...]
---
Outline:
(02:02) A Brief History Of Prior Statements
(03:51) This Third Statement
(05:08) Who Signed It
(07:27) Pushback Against the Statement
(09:05) Responses To The Pushback
(12:32) Avoid Negative Polarization But Speak The Truth As You See It
---
First published:
October 24th, 2025
---
Narrated by TYPE III AUDIO.