
Episode #47 Trailer: “Can AI Be Controlled?” For Humanity: An AI Risk Podcast

Update: 2024-09-25

Description

In the Episode #47 Trailer, host John Sherman talks with Buck Shlegeris, CEO of Redwood Research, a nonprofit working on technical AI risk challenges. The discussion includes Buck’s thoughts on the new OpenAI o1-preview model, but centers on two questions: is there a way to control AI models before alignment is achieved, if it can be achieved at all, and how would the system that is supposed to save the world actually work if an AI lab caught one of its models scheming? Links to Buck’s writing on these topics are below:


https://redwoodresearch.substack.com/p/the-case-for-ensuring-that-powerful


https://redwoodresearch.substack.com/p/would-catching-your-ais-trying-to


Senate Hearing:


https://www.judiciary.senate.gov/committee-activity/hearings/oversight-of-ai-insiders-perspectives


Harry Mack’s YouTube Channel


https://www.youtube.com/channel/UC59ZRYCHev_IqjUhremZ8Tg


LEARN HOW TO HELP RAISE AI RISK AWARENESS IN YOUR COMMUNITY HERE


https://pauseai.info/local-organizing


Please Donate Here To Help Promote For Humanity


https://www.paypal.com/paypalme/forhumanitypodcast


EMAIL JOHN: forhumanitypodcast@gmail.com


This podcast is not journalism. But it’s not opinion either. It is a long-form public service announcement. The show simply strings together the existing facts and underscores the unthinkable but probable outcome: the end of all life on Earth.


For Humanity: An AI Safety Podcast is the accessible AI safety podcast for all humans, no tech background required. Our show focuses solely on the threat of human extinction from AI.


Peabody Award-winning former journalist John Sherman explores the shocking worst-case scenario of artificial intelligence: human extinction. The makers of AI openly admit that their work could kill all humans, possibly in as little as two years. This podcast is solely about the threat of human extinction from AGI. We’ll meet the heroes and villains, explore the issues and ideas, and learn what you can do to help save humanity.


RESOURCES:


JOIN THE FIGHT, help Pause AI!!!!


Pause AI


SUBSCRIBE TO LIRON SHAPIRA’S DOOM DEBATES on YOUTUBE!!


https://www.youtube.com/@DoomDebates


Join the Pause AI Weekly Discord Thursdays at 2pm EST




https://discord.com/invite/pVMWjddaW7


Max Winga’s “A Stark Warning About Extinction”


https://youtu.be/kDcPW5WtD58?si=i6IRy82xZ2PUOp22


For Humanity Theme Music by Josef Ebner


Youtube: https://www.youtube.com/channel/UCveruX8E-Il5A9VMC-N4vlg


Website: https://josef.pictures


BUY STEPHEN HANSON’S BEAUTIFUL AI RISK BOOK!!!


https://stephenhansonart.bigcartel.com/product/the-entity-i-couldn-t-fathom


22 Word Statement from Center for AI Safety


Statement on AI Risk | CAIS


https://www.safe.ai/work/statement-on-ai-risk


Best Account on Twitter: AI Notkilleveryoneism Memes 


https://twitter.com/AISafetyMemes


