The combination of the right software and commodity hardware will prove capable of handling most machine learning tasks
In this episode of the Data Exchange I speak with Nir Shavit, Professor of EECS at MIT, and cofounder and CEO of Neural Magic, a startup that is creating software to enable deep neural networks to run on commodity CPUs (at GPU speeds or faster). Their initial products are focused on model inference, but they are also working on similar software for model training.
Our conversation spanned many topics, including:
- The intersection of Nir’s two research areas: multicore software and connectomics, a branch of neurobiology.
- Why he believes the combination of the right software and CPUs will prove capable of handling many deep learning tasks.
- Speed is not the only factor: the effectively unlimited memory of CPUs can unlock larger problems and architectures.
- Neural Magic’s initial offering focuses on inference; model training on CPUs is also on the horizon.