Joshua Clymer - Where Human Civilization Might Crumble First (Early Experience of AGI - Episode 2)

Update: 2025-07-04

Description

This is an interview with Joshua Clymer, an AI safety researcher at Redwood Research and former researcher at METR.

Joshua has spent years focused on institutional readiness for AGI, especially the kinds of governance bottlenecks that could become breaking points. His thinking is less about far-off futures and more about near-term institutional failure modes - the brittle places that might shatter first.

In this episode, Joshua and I discuss where AGI pressure might rupture our systems: intelligence agencies, the military, tech labs, and the veil of classification that surrounds them. What struck me most in this conversation with Joshua was his grounded honesty. He doesn’t offer easy predictions - just hard-won insight from years near the edge.

This is the second episode in our new “Early Experience of AGI” series - where we explore the early impacts of AGI on our work and personal lives.

This episode refers to the following related resources:
-- Josh's X thread titled "How AI Might Take Over in 2 Years": 
-- Closing the Human Reward Circuit: https://danfaggella.com/reward

Listen to this episode on The Trajectory Podcast: https://podcasts.apple.com/us/podcast/the-trajectory/id1739255954

Watch the full episode on YouTube: https://youtu.be/yMPJKjutm7M

See the full article from this episode: https://danfaggella.com/clymer1

...

There are three main questions we cover here on The Trajectory:

1. Who are the power players in AGI and what are their incentives?
2. What kind of posthuman future are we moving towards, or should we be moving towards?
3. What should we do about it?

If this sounds like it's up your alley, then be sure to stick around and connect:

-- Blog: danfaggella.com/trajectory
-- X: x.com/danfaggella
-- LinkedIn: linkedin.com/in/danfaggella
-- Newsletter: bit.ly/TrajectoryTw
-- YouTube: https://www.youtube.com/@trajectoryai

