OLMoE and the hidden simplicity in training better foundation models

Updated: 2024-09-04

Description

Ai2 released OLMoE, which is probably our "best" model yet relative to its peers, but not much has changed in the process.
This is AI-generated audio produced with Python and 11Labs.
Source code: https://github.com/natolambert/interconnects-tools
Original post: https://www.interconnects.ai/p/olmoe-and-building-better-llms
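
For the curious, below is a minimal sketch of how narration like this could be generated with the ElevenLabs (11Labs) Python SDK. It is not the actual pipeline from the linked interconnects-tools repo; the API key, voice ID, and episode text are placeholders.

    # Minimal sketch: narrate a post's text with the ElevenLabs Python SDK.
    # Not the actual interconnects-tools pipeline; api_key and voice_id
    # below are placeholder values.
    from elevenlabs.client import ElevenLabs

    client = ElevenLabs(api_key="YOUR_API_KEY")  # placeholder key

    text = "OLMoE and the hidden simplicity in training better foundation models..."
    audio = client.text_to_speech.convert(
        voice_id="YOUR_VOICE_ID",           # placeholder voice
        model_id="eleven_multilingual_v2",  # an ElevenLabs TTS model
        text=text,
    )

    # convert() yields audio chunks; concatenate them into an mp3 file.
    with open("episode.mp3", "wb") as f:
        for chunk in audio:
            f.write(chunk)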

00:00 OLMoE and the hidden simplicity in training better foundation models
02:04 Frontier model team compute allocations
04:19 De-risking training complexity
06:40 On organizational complexity
09:05 Compounding improvements -- the key to building better language models

Fig 1: https://huggingface.co/datasets/natolambert/interconnects-figures/resolve/main/olmoe/img_005.png
Fig 2: https://huggingface.co/datasets/natolambert/interconnects-figures/resolve/main/olmoe/img_007.png
Fig 3: https://huggingface.co/datasets/natolambert/interconnects-figures/resolve/main/olmoe/img_009.png
Fig 4: https://huggingface.co/datasets/natolambert/interconnects-figures/resolve/main/olmoe/img_011.png
Fig 5: https://huggingface.co/datasets/natolambert/interconnects-figures/resolve/main/olmoe/img_028.png
Fig 6: https://huggingface.co/datasets/natolambert/interconnects-figures/resolve/main/olmoe/img_030.png
Fig 7: https://huggingface.co/datasets/natolambert/interconnects-figures/resolve/main/olmoe/img_032.png


Nathan Lambert