NLP Highlights
138 - Compositional Generalization in Neural Networks, with Najoung Kim

Update: 2023-01-20
Description

Compositional generalization refers to the capability of models to generalize to out-of-distribution instances by composing information obtained from the training data. In this episode we chatted with Najoung Kim about how to explicitly evaluate specific kinds of compositional generalization in neural network models of language. Najoung described COGS, a dataset she built for this purpose, some recent results in the space, and why we should be careful about interpreting those results given the current practice of pretraining models on lots of unlabeled text.

Najoung's webpage: https://najoungkim.github.io/

Papers we discussed:
1. COGS: A Compositional Generalization Challenge Based on Semantic Interpretation (Kim et al., 2020): https://www.semanticscholar.org/paper/b20ddcbd239f3fa9acc603736ac2e4416302d074
2. Compositional Generalization Requires Compositional Parsers (Weissenhorn et al., 2022): https://www.semanticscholar.org/paper/557ebd17b7c7ac4e09bd167d7b8909b8d74d1153
3. Uncontrolled Lexical Exposure Leads to Overestimation of Compositional Generalization in Pretrained Models (Kim et al., 2022): https://www.semanticscholar.org/paper/8969ea3d254e149aebcfd1ffc8f46910d7cb160e

Note that we referred to the final paper by an earlier name in the discussion.
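To make the COGS-style evaluation concrete, here is a minimal illustrative sketch (the sentences and logical forms are invented for illustration, not taken from the actual COGS data files): a noun such as "hedgehog" appears only as a subject during training, and the generalization set tests it in a novel role as an object, so the model must compose familiar word meanings in a structural position it has never observed.

```python
# Toy COGS-style split: "hedgehog" occurs only pre-verbally (as a subject)
# in training, but post-verbally (as an object) in the generalization set.
train = [
    ("A hedgehog smiled.", "hedgehog(x_1) AND smile.agent(x_2, x_1)"),
    ("Emma saw a cat.", "see.agent(x_1, Emma) AND see.theme(x_1, x_3) AND cat(x_3)"),
]
generalization = [
    # Novel structural position for "hedgehog": direct object.
    ("Emma saw a hedgehog.", "see.agent(x_1, Emma) AND see.theme(x_1, x_3) AND hedgehog(x_3)"),
]

def appears_post_verbally(noun, sentence):
    """Crude positional check (a toy heuristic, good enough for this sketch):
    treat any occurrence after the second word as post-verbal."""
    words = sentence.rstrip(".").lower().split()
    return noun in words and words.index(noun) > 1

# In training, "hedgehog" is never post-verbal; in the generalization set it
# always is -- solving the example requires composing known pieces in a new role.
assert not any(appears_post_verbally("hedgehog", s) for s, _ in train)
assert all(appears_post_verbally("hedgehog", s) for s, _ in generalization)
```

Holding out a word-role combination like this (rather than whole unseen words) is what lets the benchmark attribute failures to a lack of compositional generalization rather than to unknown vocabulary, which is also why uncontrolled lexical exposure during pretraining (paper 3 above) can inflate the scores.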
Allen Institute for Artificial Intelligence