OpenAI Researcher Dan Roberts on What Physics Can Teach Us About AI

Update: 2024-10-22

Description

In recent years there’s been an influx of theoretical physicists into the leading AI labs. Do they have unique capabilities suited to studying large models or is it just herd behavior? To find out, we talked to our former AI Fellow (and now OpenAI researcher) Dan Roberts.


Roberts, co-author of The Principles of Deep Learning Theory, is at the forefront of research that applies the tools of theoretical physics to another type of large, complex system: deep neural networks. Dan believes that deep neural networks, and eventually LLMs, are interpretable in the same way a large collection of atoms is: at the system level. He also thinks that the current emphasis on scaling laws will, over time, give way to a balance with new ideas and architectures as the economic returns to scaling level off.


Hosted by: Sonya Huang and Pat Grady, Sequoia Capital 


Mentioned in this episode:

AI Math Olympiad: Dan is on the prize committee

