Maybe I’m missing something, but isn’t this just a standard American put option with a strike of $100 and expiry of Dec 31st?
He's answering the question "How should options be priced?"
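To make the "standard American put" framing concrete, here's a minimal Cox-Ross-Rubinstein binomial tree sketch, which handles early exercise. All parameters (spot, volatility, rate, expiry) are invented for illustration:

```python
# Minimal CRR binomial tree for an American put.
# All inputs below are illustrative assumptions, not real market data.
import math

def american_put(spot, strike, rate, vol, years, steps=500):
    dt = years / steps
    u = math.exp(vol * math.sqrt(dt))        # up-move factor
    d = 1 / u                                # down-move factor
    p = (math.exp(rate * dt) - d) / (u - d)  # risk-neutral up probability
    disc = math.exp(-rate * dt)

    # payoff at expiry for each terminal node
    values = [max(strike - spot * u**j * d**(steps - j), 0.0)
              for j in range(steps + 1)]

    # step backwards, allowing early exercise at every node
    for i in range(steps - 1, -1, -1):
        for j in range(i + 1):
            cont = disc * (p * values[j + 1] + (1 - p) * values[j])
            exercise = max(strike - spot * u**j * d**(i - j), 0.0)
            values[j] = max(cont, exercise)
    return values[0]

price = american_put(spot=100, strike=100, rate=0.04, vol=0.5, years=0.25)
```

The early-exercise `max(cont, exercise)` at every node is what distinguishes the American put from its European counterpart.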
Sure, it's possible for a big crash in Nvidia just due to volatility. But in that case, the market as a whole would likely be affected.
Whether Nvidia specifically takes a big dive depends much more on whether they continue to meet growth estimates than general volatility. If they miss earnings estimates in a meaningful way the market is going to take the stock behind the shed and shoot it. If they continue to exceed estimates the stock will probably go up or at least keep its present valuation.
Other way around: if NVidia sinks, it likely takes a bunch of dependent companies with it, because the likely causes of NVidia sinking all tell us that there was indeed an AI bubble and it is popping.
They are maintaining this astronomical growth through data-center margins enabled by the design of their chips, and all of that started from graphics for video games.
No? That’s why they have almost no competition. Hardware starting costs are astronomical.
Are they already "too big to fail"? For better or worse, they are 'all in' on AI.
My 30k ft view is that the stock will inevitably slide as AI datacenter spending goes down. Right now Nvidia is flying high because datacenters are breaking ground everywhere but eventually that will come to an end as the supply of compute goes up.
The counterargument to this is that the "economic lifespan" of an Nvidia GPU is 1-3 years depending on where it's used so there's a case to be made that Nvidia will always have customers coming back for the latest and greatest chips. The problem I have with this argument is that it's simply unsustainable to be spending that much every 2-3 years and we're already seeing this as Google and others are extending their depreciation of GPU's to something like 5-7 years.
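The depreciation point is simple arithmetic: stretching the schedule cuts the annual expense hitting earnings, even though the cash already went out the door. A sketch with an invented fleet cost:

```python
# Straight-line annual depreciation for a hypothetical $10B GPU fleet.
# Stretching the schedule from 3 to 6 years halves the yearly expense,
# which flatters reported earnings even though the cash was already spent.
fleet_cost = 10_000_000_000  # illustrative purchase price, dollars

def annual_depreciation(cost, useful_life_years):
    return cost / useful_life_years

short = annual_depreciation(fleet_cost, 3)  # ~$3.33B/year expensed
long = annual_depreciation(fleet_cost, 6)   # ~$1.67B/year expensed
savings = short - long                      # ~$1.67B/year less expense
```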
Which is absolutely the right move when your latest datacenter's power draw is literally measured in gigawatts. Power-efficient training/inference hardware simply does not look like a GPU at a hardware design level (though admittedly, it looks even less like an ordinary CPU); it's more like something that should run dog slow wrt. max design frequency but then more than make up for that with extreme throughput per watt/low energy expense per elementary operation.
The whole sector of "neuromorphic" hardware design has long shown the broad feasibility of this (and TPUs are already a partial step in that direction), so it looks like this should be an obvious response to current trends in the power and cooling demands of big AI workloads.
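The metric that comment is pointing at is operations per joule, where a slow-clocked but wide design can beat a fast one. A sketch with invented throughput and power figures:

```python
# Efficiency metric for AI hardware: operations per joule.
# Since watts = joules/second, ops/joule = (ops/second) / watts.
# All numbers below are invented for illustration.
def ops_per_joule(throughput_ops_per_s, power_watts):
    return throughput_ops_per_s / power_watts

gpu_like = ops_per_joule(2e15, 700)           # fast clocks, hot chip
neuromorphic_like = ops_per_joule(5e14, 30)   # slow clocks, wide datapath

# The slower design wins on energy per op despite far lower raw throughput.
```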
However, it’s beyond my comprehension how anyone would think that we will see a decline in demand growth for compute.
AI will conquer the world like software or the smartphone did. It’ll get implemented everywhere, more people will use it. We’re super early in the penetration so far.
While thinking computers will replace human brains soon is rabid fanaticism, this statement...
> AI will conquer the world like software or the smartphone did.
Also displays a healthy amount of fanaticism.
As far as AI conquering the world goes, it needs a "killer app". I don't think we'll really see that until AR glasses that happen to include AI. If it can have context about your day, take action on your behalf, and have the same battery life as a smartphone...
But yes. Cisco's value dropped when there wasn't the same amount to spend on networking gear. Nvidia's value will drop when there isn't the same amount of spend on their gear.
Other impacted players in actual economic downturn could be Amazon with AWS, MS with Azure. And even more so those now betting on AI computing. At least general purpose computing can run web servers.
If it had given me the right, easy-to-understand answer right away, I would have spent 2 minutes of both MY time and ITS time. My point is that if AI improves, we will need less of it to get our questions answered. Or perhaps AI usage goes up if it improves its answers?
The data very strongly shows that the quality of AI answers is rapidly improving. If you want a good example, check out the Sixty Symbols video by Brady Haran, where they revisited getting AI to answer a quantum physics exam after trying the same thing 3 years ago. The improvement is IMMENSE and undeniable.
Doesn't mean that crypto is not being used, of course. Plenty of people do use things like USDT, gamble on bitcoin, or try to scam people with new meme coins, but this is far from what crypto enthusiasts and NFT moguls promised us in their feverish posts back in the mid-2010s.
So imagine that AI is here to stay, but the absolutely unhinged hype train will slow down and we will settle in some kind of equilibrium of practical use.
AI is different and businesses are already using it a lot. Of course there is hype, it’s not doing all the things the talking heads said but it does not mean immense value is not being generated.
It's like if your taxi company bought taxis that were more fuel efficient every year.
You kind of have to.
Replacing cars every 3 years vs a couple % in efficiency is not an obvious trade off. Especially if you can do it in 5 years instead of 3.
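A toy break-even check makes the point; all figures are invented for illustration:

```python
# Toy break-even: is replacing hardware worth a couple percent efficiency?
# All numbers are made up for illustration.
replacement_cost = 30_000    # new unit, dollars
annual_power_cost = 8_000    # electricity for the old unit, dollars/year
efficiency_gain = 0.03       # new unit uses 3% less energy

annual_savings = annual_power_cost * efficiency_gain  # $240/year
payback_years = replacement_cost / annual_savings     # ~125 years

# A couple of percent clearly doesn't pay for early replacement;
# the gain has to be large (a full process-node jump) before it matters.
```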
Isn't that precisely how leasing works? Also, don't companies prefer not to own hardware for tax purposes? I've worked for several places where they leased compute equipment with upgrades coming at the end of each lease.
That's where the analogy breaks. There are massive efficiency gains from new process nodes, which new GPUs use. Efficiency improvements for cars are glacial, aside from "breakthroughs" like hybrid/EV cars.
It's not like the CUDA advantage is going anywhere overnight, either.
Also, if Nvidia invests in its users and in the infrastructure layouts, it gets to see upside no matter what happens.
I have not seen hard data, so this could be an oft-repeated but false claim.
If this was anywhere close to a common failure mode, I'm pretty sure we'd know that already, given how crypto mining GPUs were usually run to the max in makeshift settings with woefully inadequate cooling and environmental control. The overwhelming anecdotal evidence from people who have bought them is that even a "worn" crypto GPU is absolutely fine.
Another commonly forgotten issue is that many electrical components are rated by hours of operation. And cheaper boards tend to use components with smaller margins. And that rated time is actually a graph, where rated hours decrease with higher temperature. There were instances of batches of cards failing due to failing MOSFETs, for example.
This doesn't mean much for inference, but for training, it is going to be huge.
(1) We simply don't know what the useful life is going to be, because AI-focused GPUs used for training and inference are so new.
(2) Warranties and service. Most enterprise hardware has service contracts tied to purchases. I haven't seen anything publicly disclosed about what these contracts look like, but the speculation is that they are much more aggressive (3 years or less) than typical enterprise hardware contracts (Dell, HP, etc.). Once hardware ages past those contracts, extended support typically gets really pricey.
(3) Power efficiency. If new GPUs are more power efficient, the energy savings could be large enough to justify upgrades.
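A back-of-envelope check on point (3), with invented power draws, electricity rates, and overhead factor:

```python
# Back-of-envelope: can power savings alone justify a GPU upgrade?
# All numbers below are illustrative assumptions.
old_power_kw = 0.7         # older accelerator's draw per card
new_power_kw = 0.5         # newer, more efficient card
cost_per_kwh = 0.08        # industrial electricity rate, dollars
hours_per_year = 24 * 365  # running flat out
pue = 1.3                  # datacenter overhead factor (cooling etc.)

saved_kwh = (old_power_kw - new_power_kw) * hours_per_year * pue
annual_savings = saved_kwh * cost_per_kwh  # dollars/year per card

# Versus a card costing tens of thousands of dollars, power savings alone
# rarely drive the upgrade; performance per dollar does.
```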
Their stock trajectory started with one boom (cryptocurrencies) and then seamlessly progressed to another (AI). You're basically looking at a decade of "number goes up". So yeah, it will probably come down eventually (or the inflation will catch up), but it's a poor argument for betting against them right now.
Meanwhile, the investors who were "wrong" in anticipating a cryptocurrency revolution and who bought NVDA don't have much to complain about today.
Technical analysis fails completely when there's an underlying shift that moves the line. You can't look at the past and say "nvidia is clearly overvalued at $10 because it was $3 for years earlier" when they suddenly and repeatedly 10x earnings over many quarters.
I couldn't get through to the idiots on reddit.com/r/stocks about this when there was non-stop negativity on Nvidia based on technical analysis and very narrowly scoped fundamental analysis. They showed a 12x gain in quarterly earnings at the time, but the P/E (which looks at past quarters only) was 260x due to this sudden change in earnings, and pretty much all of reddit couldn't get past this.
I did well on this yet there were endless posts of "Nvidia is the easiest short ever" when it was ~$40 pre-split.
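The trailing-P/E distortion that comment describes is pure arithmetic: the trailing twelve months still contain three "old" quarters after a sudden earnings jump. A sketch with hypothetical numbers shaped like the example:

```python
# How a trailing P/E misleads right after earnings suddenly jump.
# Numbers are hypothetical, shaped like the comment's 12x example.
price = 40.0                                  # pre-split share price
old_quarterly_eps = 0.0128                    # before the jump
new_quarterly_eps = old_quarterly_eps * 12    # earnings 12x in one quarter

# Trailing twelve months still contain three "old" quarters:
ttm_eps = 3 * old_quarterly_eps + new_quarterly_eps
trailing_pe = price / ttm_eps                 # looks absurdly expensive

# Annualizing the new run-rate instead:
run_rate_pe = price / (4 * new_quarterly_eps)  # far more reasonable
```

With these inputs the trailing multiple is about 3.2x the run-rate multiple, which is the gap the reddit crowd couldn't see past.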
Once the money dries up, a new bubble will be invented to capture middle-class income, like NFTs and crypto before that, and commission-free stock trading, etc. etc.
It’s not all pump-and-dump. Again, this is a pretty reductive take on market forces. I’m just saying I don’t think it’s quite as unsustainable as you might think.
All hypothetical, of course, but to me that's the most convincing bear case I've heard for NVIDIA.
Still, it's interesting the probability is so high while ignoring real-world factors. I'd expect it to be much higher due to:

- another adjacent company dipping
- some earnings target not being met
- China/Taiwan
- just the AI craze slowing down
Everything that can't go on forever will eventually stop. But when?
I do hope they crash so that I can buy as much as possible at a discount.
Nvidia stock crash will happen when the vendor financing bubble bursts.
They are engaged in a dangerous game of circular financing. So it is case of when, not if the chickens come home to roost.
It is simply not sustainable.
The only way the stock could remain at its current price or grow (which is why you'd hold it) is if demand just keeps going up (with the same lifecycle as current GPUs) and there is no competition, and the latter, to me, is just never going to be a thing.
Investors are convinced that Nvidia can maintain its lead because they have the "software" side, i.e. CUDA, which to me is so ridiculous: with the kind of capital being deployed into these datacenters, you could fit your models into other software stacks by hiring people... assuming LLM coding agents are good. But if they aren't any good, then what is the value of the CUDA code?
My personal opinion, having witnessed first hand nearly 40 years of tech evolution, is that this AI revolution is different. We're at the very beginning of a true paradigm shift: the commoditization of intelligence. If that's not enough to make people think twice before betting against it, I don't know what is. And it's not just computing that is going to change. Everything is about to change, for better or worse.
There would be a supply crunch but a lot of dollars will be shuffled VERY fast to ramp up production.
If something even more drastic happens, China might even attempt unification, with reasoning like protecting Taiwan from the USA or other nations.