Budget as if the chips will be redundant in 3 years? 2 years? 1 year?
If it's 1-2 years, everyone's books are cooked and a massive write-off is coming at some point, which is what the article implies with its back-of-envelope calcs (rough sketch of that math below).
So folks, what's your call? How many years is the real useful life of the NVIDIA chips they've all currently bought and filled their server farms with?
State years or months and why you think that timeframe :)
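For concreteness, here's a rough sketch of the back-of-envelope math being gestured at. The capex figure and the straight-line schedule are assumptions for illustration, not the article's numbers; the point is just how sharply a shorter assumed useful life pulls depreciation expense forward.

```python
# Illustrative only: hypothetical capex figure and straight-line depreciation,
# not the article's actual numbers.
capex = 100e9  # assumed spend on AI servers, in dollars

for useful_life_years in (6, 3, 2, 1):
    annual_expense = capex / useful_life_years
    print(f"{useful_life_years}-year life -> "
          f"${annual_expense / 1e9:.1f}B/yr depreciation expense")
```

Going from a 6-year schedule to a 2-year one triples the annual expense hitting the income statement, which is where the "books are cooked" worry comes from.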
Right now, any workable AI hardware is valuable because the market is not presently saturated, and people are in a "buy what you can get" mindset.
Once the market is reasonably saturated, people will get more selective. Older parts will be less desirable-- less efficient, less featureful-- and if you have trucks full of them to dump on the market, it's going to depress the price.
It's like the PC market of 30 years ago. You got a new Pentium-100, but you could still sell on the old 386/16 for a fair amount of cash because for someone else, it beats "no computer". That market doesn't really exist anymore-- today, you may as well just leave a Haswell or Ryzen 1000 box at the curb unless you want to spend 6 weeks dealing with Craigslist flakes for $30.
dehrmann•4mo ago
Maybe it's a nit, but the advances are in how the chips access memory and are networked together.
> In reality, not all of the AI quintet’s servers would be useless after three years, let alone 12 months. They can keep performing oodles of non-AI work.
Not really. The AI servers are essentially useless for non-AI work.
JackSlateur•4mo ago
Can I run k8s on a GPU? Yes, why not. Will it be efficient? No.
(Replace k8s with "whatever random code you are mostly running".)