Vanishing Gradients
Episode 33: What We Learned Teaching LLMs to 1,000s of Data Scientists

Updated: 2024-08-12

Description

Hugo speaks with Dan Becker and Hamel Husain, two veterans in the world of data science, machine learning, and AI education. Between them, they've worked at Google, DataRobot, Airbnb, and GitHub (where Hamel built a precursor to Copilot, among other projects), and both currently work as independent LLM and generative AI consultants.



Dan and Hamel recently taught a course on fine-tuning large language models that evolved into a full-fledged conference, attracting over 2,000 participants. This experience gave them unique insights into the current state and future of AI education and application.



In this episode, we dive into:




  • The evolution of their course from fine-tuning to a comprehensive AI conference

  • The unexpected challenges and insights gained from teaching LLMs to data scientists

  • The current state of AI tooling and accessibility compared to a decade ago

  • The role of playful experimentation in driving innovation in the field

  • Thoughts on the economic impact and ROI of generative AI in various industries

  • The importance of proper evaluation in machine learning projects

  • Future predictions for AI education and application in the next five years

We also touch on the challenges of using AI tools effectively, the potential for AI in physical world applications, and the need for a more nuanced understanding of AI capabilities in the workplace.



During our conversation, Dan mentions an exciting project he's been working on, which we couldn't showcase live due to technical difficulties. However, I've included a link to a video demonstration in the show notes that you won't want to miss. In this demo, Dan showcases his innovative AI-powered 3D modeling tool that allows users to create 3D printable objects simply by describing them in natural language.







Hugo Bowne-Anderson