7 Comments
Ray Zhou:

May I thank you for explaining the GPU depreciation debate.

The Curious LP:

You think you’ll keep doing these?

Sophie:

Yes, in the new year for sure.

Rainbow Roxy:

Fascinating. The shift to longer GPU lifecycles makes sense, but does the sheer pace of AI innovation still make older tech functionally 'depreciated' much faster?

Dave Friedman:

On the question of GPU depreciation, the broader issue I see is a kind of duration arbitrage failure. The AI stack is committing long-duration capital (data centers, power generation infra, transmission infra) against short technological half-lives (GPUs, models, etc.). Not sure how this ends, but the fiber overbuild of the late '90s/early 2000s seems like an approximate model, though in that case the tech (fiber) didn't become obsolete as quickly as GPUs do. When demand for data came along, you could just light up unlit fiber and satisfy it. Not necessarily the case with unused GPUs.

Anon:

Super helpful