The GPU depreciation debate is fascinating because it gets at the heart of capital intensity in AI infrastructure. Burry's point about earnings overstatement is valid, but the counterargument around actual utilization patterns feels more grounded in operational reality. What strikes me most is Nadella's comments about scaffolding vs model value capture. The auto-routing feature in Copilot he described could fundamentally reshape where margins accrue if models truly become commoditized through open source checkpoints.
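To make the earnings-overstatement mechanism concrete, here's a rough straight-line depreciation sketch. All figures are hypothetical (the $10B capex number and the schedule lengths are assumptions for illustration, not anyone's actual filings):

```python
# Rough back-of-envelope: how stretching the assumed GPU useful life
# changes annual depreciation expense (all figures hypothetical).
gpu_capex = 10_000_000_000  # $10B of GPU purchases (made-up number)

for useful_life_years in (3, 4, 5, 6):
    annual_depreciation = gpu_capex / useful_life_years  # straight-line
    print(f"{useful_life_years}-year schedule: "
          f"${annual_depreciation / 1e9:.2f}B expense per year")

# Moving from a 3-year to a 6-year schedule halves the annual expense.
# If the hardware is functionally obsolete sooner than the schedule
# assumes, reported earnings run ahead of economic reality, which is
# the core of the overstatement argument.
```

Obviously a toy model, but it shows why the useful-life assumption is where the whole debate concentrates.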
Fascinating. The shift to longer GPU lifecycles makes sense, but does the sheer pace of AI innovation still make older hardware functionally 'depreciated' much faster?
On the question of GPU depreciation, the broader issue I see is a kind of duration arbitrage failure. The AI stack is committing long-duration capital (data centers, power generation infra, transmission infra) against short technological half-lives (GPUs, models, etc.). Not sure how this ends, but the fiber overbuild of the late '90s/early 2000s seems like an approximate model, though in that case the tech (fiber) didn't become obsolete as quickly as GPUs do. When demand for data came along you could just light up unlit fiber and satisfy it. Not necessarily the case with unused GPUs.
may i thank you for explaining the gpu depreciation debate
You think you’ll keep doing these?
Yes in the new year for sure
Super helpful
thank you!