The combination of the right software and commodity hardware will prove capable of handling most machine learning tasks
Update: 2020-01-09
Description
In this episode of the Data Exchange, I speak with Nir Shavit, Professor of EECS at MIT, and cofounder and CEO of Neural Magic, a startup creating software that enables deep neural networks to run on commodity CPUs (at GPU speeds or faster). Their initial products focus on model inference, but they are also working on similar software for model training.
Our conversation spanned many topics, including:
- Connectomics, a branch of neurobiology, and how it intersects with Nir’s other research area, multicore software.
- Why he believes the combination of the right software and CPUs will prove capable of handling many deep learning tasks.
- Speed is not the only factor: the “unlimited memory” of CPUs can unlock larger problems and model architectures.
- Neural Magic’s initial offering is in inference; model training on CPUs is also on the horizon.
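To make the CPU-inference point concrete, here is a minimal sketch of one technique commonly used in this space: magnitude pruning, which zeroes out small weights so a sparse kernel can skip most of the arithmetic. This is an illustrative example using NumPy and SciPy, not Neural Magic's actual engine; all names and the 90% sparsity level are assumptions chosen for the demo.

```python
# Illustrative sketch (not Neural Magic's engine): prune a dense layer's
# weights, then run inference with a sparse CPU kernel that only touches
# the surviving nonzeros.
import numpy as np
from scipy import sparse

rng = np.random.default_rng(0)

# A dense fully connected layer: y = relu(W @ x)
W = rng.standard_normal((1024, 1024)).astype(np.float32)
x = rng.standard_normal(1024).astype(np.float32)

# Magnitude pruning: zero out the 90% smallest-magnitude weights.
threshold = np.quantile(np.abs(W), 0.9)
W_pruned = np.where(np.abs(W) >= threshold, W, 0.0).astype(np.float32)

# Store the pruned matrix in CSR form so the matvec skips the zeros.
W_sparse = sparse.csr_matrix(W_pruned)

y_dense = np.maximum(W_pruned @ x, 0.0)
y_sparse = np.maximum(W_sparse @ x, 0.0)

# Same result, but the sparse kernel does ~10% of the multiply-adds.
print(np.allclose(y_dense, y_sparse, atol=1e-4))
print(round(W_sparse.nnz / W.size, 2))  # fraction of weights kept, ~0.1
```

The design point being illustrated: once a network is heavily sparsified, the bottleneck shifts from raw FLOPs (where GPUs dominate) toward memory access patterns, which well-written CPU code can exploit.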
Detailed show notes can be found on The Data Exchange web site.