The Embodied AI Podcast

#5 Felix Hill: Grounded Language, Transformers, and DeepMind

Updated: 2022-07-06

Description

Felix is a research scientist at DeepMind. He is interested in grounded language understanding and natural language processing (NLP). After learning about Felix's background, we bring up compositionality and explore why natural language is non-compositional (also the name of Felix's blog). Then, Felix tells us a bit about his work in Cambridge on abstract vs concrete concepts and gives us a quick crash course on the role of recurrent neural networks (RNNs), long short-term memory (LSTM) networks, and transformers in language models. Next, we talk about Jeff Elman's landmark paper 'Finding Structure in Time' and how neural networks can learn to understand analogies. Afterwards, we discuss the core of Felix's work: training language agents in 3D simulations. Here we raise some questions about language learning as an embodied agent in space and time, and about how Allan Paivio's dual coding theory might be implemented in the memory of a language model. Sticking with the theme of memory retrieval, we then discuss Felix and Andrew Lampinen's work on 'mental time travel' in language models. Finally, I ask Felix about good strategies for getting into DeepMind and the best way to learn NLP.




Timestamps:

(00:00) - Intro
(07:57) - Compositionality in natural language
(16:42) - Abstract vs concrete concepts
(24:03) - RNNs, LSTMs, Transformers
(34:12) - Prediction, time and Jeff Elman
(48:04) - Neural networks & analogies
(56:32) - Grounded language, 3D simulations, babies
(01:05:20) - Keeping vision and language data separate
(01:13:51) - NeuroAI and mental time travel
(01:21:47) - Getting into DeepMind and learning NLP

Felix's website (a good overview of his papers)

Papers

Abstract vs concrete concepts paper
Jeff Elman (1990): 'Finding Structure in Time' paper
Analogies paper
Dual coding theory paper
Mental time travel paper

My Twitter
My LinkedIn

Akseli Ilmanen