Discover"The Cognitive Revolution" | AI Builders, Researchers, and Live Player AnalysisThe Case for Cautious AI Optimism, from the Consistently Candid podcast
The Case for Cautious AI Optimism, from the Consistently Candid podcast

Updated: 2024-06-19

Description

Dive into an accessible discussion of AI safety and philosophy, progress in technical AI safety, and why catastrophic outcomes aren't inevitable. The conversation offers practical advice for AI newcomers and hope for a positive future.


Consistently Candid podcast: https://open.spotify.com/show/1EX89qABpb4pGYP1JLZ3BB


SPONSORS:

Oracle Cloud Infrastructure (OCI) is a single platform for your infrastructure, database, application development, and AI needs. OCI has four to eight times the bandwidth of other clouds, offers one consistent price, and nobody does data better than Oracle. If you want to do more and spend less, take a free test drive of OCI at https://oracle.com/cognitive

The Brave Search API can be used to assemble a data set to train your AI models and help with retrieval augmentation at the time of inference, all while remaining affordable with developer-first pricing. Integrating the Brave Search API into your workflow translates to more ethical data sourcing and more human-representative data sets. Try the Brave Search API for free for up to 2,000 queries per month at https://bit.ly/BraveTCR

Omneky is an omnichannel creative generation platform that lets you launch hundreds of thousands of ad iterations that actually work, customized across all platforms, with the click of a button. Omneky combines generative AI and real-time advertising data. Mention "Cog Rev" for 10% off: https://www.omneky.com/

Squad gives you access to global engineering without the headache and at a fraction of the cost: head to https://choosesquad.com/ and mention "Turpentine" to skip the waitlist.


Recommended Podcast:

Byrne Hobart, the writer of The Diff, is revered in Silicon Valley. You can get an hour with him each week. See for yourself how his thinking can upgrade yours.

Spotify: https://open.spotify.com/show/6rANlV54GCARLgMOtpkzKt

Apple: https://podcasts.apple.com/us/podcast/the-riff-with-byrne-hobart-and-erik-torenberg/id1716646486


CHAPTERS:

(00:00:00) About the Show

(00:03:50) Intro

(00:08:13) AI Scouting

(00:14:42) Why aren't people adopting AI more quickly?

(00:18:25) Why don't people take advantage of AI?

(00:22:35) Sponsors: Oracle | Brave

(00:24:42) How to get a better understanding of AI

(00:31:16) How to handle the public discourse around AI

(00:34:02) Scaling and research

(00:43:18) Sponsors: Omneky | Squad

(00:45:03) The pause

(00:47:29) Algorithmic efficiency

(00:52:52) Red Teaming in Public

(00:55:41) Deepfakes

(01:01:02) AI safety

(01:04:00) AI moderation

(01:07:03) Why not a doomer

(01:09:10) AI understanding human values

(01:15:00) Interpretability research

(01:18:30) AI safety leadership

(01:21:55) AI safety respectability politics

(01:33:42) China

(01:37:22) Radical uncertainty

(01:39:53) P(doom)

(01:42:30) Where to find the guest

(01:44:48) Outro


Erik Torenberg, Nathan Labenz