Chasing Silicon: The Race for GPUs
With the world generating ever more data, unlocking the full potential of AI demands ever faster and more resilient hardware.
In this episode – the second in our three-part series – we explore the challenges founders face when building AI companies. We dive into the delta between supply and demand, whether to own or rent, where moats can be found, and even where open source comes into play.
Look out for the rest of our series, where we dive into the terminology and technology that form the backbone of AI, and what compute truly costs!
00:00 – Supply and demand
02:44 – Competition for AI hardware
04:32 – Who gets access to the supply available
06:16 – How to select which hardware to use
08:39 – Cloud versus bringing infrastructure in house
12:43 – What role does open source play?
15:47 – Cheaper and decentralized compute
19:04 – Rebuilding the stack
20:29 – Upcoming episodes on cost of compute
- Find Guido on LinkedIn: https://www.linkedin.com/in/appenz/
- Find Guido on Twitter: https://twitter.com/appenz
- Find a16z on Twitter: https://twitter.com/a16z
- Find a16z on LinkedIn: https://www.linkedin.com/company/a16z
- Subscribe on your favorite podcast app: https://a16z.simplecast.com/
- Follow our host: https://twitter.com/stephsmithio
Please note that the content here is for informational purposes only; should NOT be taken as legal, business, tax, or investment advice or be used to evaluate any investment or security; and is not directed at any investors or potential investors in any a16z fund. a16z and its affiliates may maintain investments in the companies discussed. For more details please see a16z.com/disclosures.