Episode 397 – Local LLMs: Why Every Microsoft 365 & Azure Pro Should Explore Them


Update: 2025-03-13

Description

Welcome to Episode 397 of the Microsoft Cloud IT Pro Podcast. In this episode, Scott and Ben dive into the world of local LLMs—large language models that run entirely on your device. They explore why more IT pros and developers are experimenting with them, the kinds of models you can run, and how you can integrate them directly into your workflow, including in Visual Studio Code for AI-assisted coding.
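As a small taste of the workflow discussed in the episode, here is a minimal sketch of calling a locally running model from Python via Ollama's REST API. It assumes you have Ollama running (`ollama serve`) on its default port 11434 and have already pulled a model (e.g. `ollama pull llama3`); the model name and prompt are illustrative.

```python
import json
import urllib.request

# Ollama's default local endpoint for single-shot (non-streaming) generation.
# Assumes `ollama serve` is running and a model such as `llama3` is pulled.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for the local Ollama server."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_request("llama3", "Summarize what a local LLM is in one sentence.")

# Uncomment to actually query the local server:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

Because everything stays on `localhost`, no prompt or response ever leaves your machine—one of the main draws of local LLMs the hosts discuss.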

Your support makes this show possible! Please consider becoming a premium member for access to live shows and more. Check out our membership options.
Show Notes

Ollama
Running LLMs Locally: A Beginner's Guide to Using Ollama
open-webui/open-webui
LM Studio
LM Studio Model Catalog
Why do people like Ollama more than LM Studio?
A Starter Guide for Playing with Your Own Local AI!
host ALL your AI locally
Run your own AI (but private)

About the sponsors




Would you like to become the irreplaceable Microsoft 365 resource for your organization? Let us know!
