“Musings on Reported Cost of Compute (Oct 2025)” by Vladimir_Nesov
Description
There are many ways in which costs of compute get reported. A 1 GW datacenter site costs $10-15bn for the infrastructure (buildings, cooling, power), plus $30-35bn for the compute hardware (servers, networking, labor), assuming Nvidia GPUs. The useful life of the infrastructure is about 10-15 years, and with debt financing a developer only needs to ensure it's paid off over those 10-15 years, which comes out to $1-2bn per year. For the compute hardware, the useful life is taken to be about 5 years, which gives $6-7bn per year. Operational expenses (electricity, maintenance) add about $2.0-2.5bn per year.
In total, 1 GW of compute costs about $9-11bn per year. But whoever paid for the compute hardware capex needs to ensure the payments continue for 5 years, so a contract for 1 GW of compute will be 5 years long, making it a single contract for at least $45-55bn, which might become [...]
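
For concreteness, here is a minimal Python sketch of the amortization arithmetic above, using assumed midpoint figures within the post's stated ranges (the exact inputs are illustrative, not the author's):

```python
# Sketch of the annualized-cost arithmetic, with midpoint figures
# assumed from the post's ranges; all amounts are in $bn per 1 GW site.

infra_capex = 12.5        # $10-15bn: buildings, cooling, power
infra_life_years = 12.5   # infrastructure useful life: ~10-15 years

hardware_capex = 32.5     # $30-35bn: servers, networking, labor
hardware_life_years = 5   # compute hardware useful life: ~5 years

opex_per_year = 2.25      # $2.0-2.5bn/year: electricity, maintenance

# Straight-line amortization; debt-financing costs would push the
# infrastructure figure toward the top of the post's $1-2bn/year range.
infra_per_year = infra_capex / infra_life_years           # ~1.0
hardware_per_year = hardware_capex / hardware_life_years  # ~6.5

total_per_year = infra_per_year + hardware_per_year + opex_per_year
contract_total = total_per_year * hardware_life_years     # 5-year contract

print(f"Annual cost: ~${total_per_year:.2f}bn per GW-year")  # ~$9.75bn
print(f"5-year contract: ~${contract_total:.1f}bn per GW")   # ~$48.8bn
```

With these midpoints, the annual cost lands at roughly $9.75bn and the 5-year contract at roughly $49bn, consistent with the $9-11bn/year and $45-55bn ranges above.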
---
Outline:
(02:22) Non-Nvidia Hardware
(03:29) Model Sizes in 2026
---
First published:
October 24th, 2025
Source:
https://www.lesswrong.com/posts/oPWB7SBn5j6Nw8RSX/musings-on-reported-cost-of-compute-oct-2025
---
Narrated by TYPE III AUDIO.



