Edit:
- I wonder what's stopping Nvidia from releasing an AI phone
- An LLM competitor service (Hey, how about you guys make your own chips?)
- They are already releasing an AI PC
- Their own self driving cars
- Their own robots
If you mess with them, why won't they just compete with you?
Just wanted to say one more thing: Warren Buffett famously said he regretted not investing in both Google and Apple. I think something like this is happening again, especially as there are lulls that the mainstream public perceives but enthusiasts don't. To maintain the hyperbole: if you are not a full believer as a developer, then you are simply out of your mind.
I've tried explaining that one or two AI data center clients for Nvidia dwarf the entire gaming GPU market, but he just doesn't get it.
> I've tried explaining that one or two AI data center clients for Nvidia dwarf the entire gaming GPU market, but he just doesn't get it.
I have a feeling that the different judgements come from the fact that the coworker thinks the AI bubble will soon burst; thus, in his judgement, the AI data center sector of Nvidia is insanely overvalued and will collapse. What will "save" Nvidia then will be the gaming GPUs. Thus, in his opinion, this is the sector that matters most for Nvidia, since it will become Nvidia's lifeline when (not "if"!, in your coworker's judgement) things go wrong in AI.
You, on the other hand, believe AI data centers are here to stay (which is a bold assumption: it could happen that AI will move more to the edge) and that no big competition will arise for Nvidia in "big AI ASICs" (another bold assumption). Your judgement is based on these two strong assumptions about the future, while your coworker's is based on different (possibly similarly bold) ones.
Currently. :-)
You're silly if you think otherwise.
Before the current AI hype, except for some rather specialized applications, people had rather little use for GPU acceleration (GPGPU) in data centers.
The market is so tiny that their capex investments into AI stuff would saddle them with massive debt that the gaming revenue couldn't support, and they would have to go through bankruptcy.
EDIT: a brief search says that last year Apple sold 300M iOS devices vs. 20M macOS devices.
B2C is a hellish headache with marginal returns if you are not B2C-first, and the amount of investment needed to be B2C-competent just isn't worth it when there are alternative options to invest in.
> An LLM competitor service (Hey, how about you guys make your own chips?)
Already exists: AI Foundry.
> They are already releasing an AI PC
It's just an OEM rehash
> Their own self driving cars
Not worth the headache, and it would also mean losing customers like Google or Amazon due to competitive pressure.
------
Cannot reply: releasing their own "Nvidia Car" means they will lose their existing automotive partners, because those partners will not spend on a competitor. Same reason Walmart stipulates EVERY tech vendor must decouple from AWS when selling to them.
I'm curious to know more about this if you (or anyone else) can elaborate on it.
What constitutes a tech vendor? Are you talking about Walmart buying PCs from Dell, buying/renting a SaaS from someone, or IT consultants coming in to do a one-time service for them (even if that service takes years)?
You're not talking about stuff like "Apple wants to sell iPhones to Walmart customers", I assume - yes?
WRT your edit: The answer to all of this is that it's very hard and requires a huge amount of investment to produce good vertical solutions in each of these spaces. You cannot build a good AI phone without first building a good phone. You cannot build a self-driving car without starting with a good car, etc. For robots, I'll point you to someone using Nvidia chips: the Matic is a complete ground-up rethink of how robot vacuums should work. It's taken them 7 years to get to early adopter phase.
More like you cannot build a self-driving car without starting with a good phone. See Huawei.
Agreed, and for all the "price crash" I still can't just whip out my credit card and purchase an hour or two on an H100/B100.
It's still multi-year contracts and "Contact Sales".
It's a low margin business and would hurt the balance sheet more than the completely irrelevant revenue from a project like that.
I've been investing in semi for decades and what strikes me about this recent cycle is that so many don't seem to understand that semi is a highly cyclical business that is prone to commoditization waves and inventory/capacity overbuild.
And speaking as a trader, instead of reinforcing your firmly held base case, I'd strongly consider painting out the bear cases. Look at the roadmaps of the hyperscalers that are designing their own chips for internal use, etc. And never use the word faith when it comes to markets.
You could easily see NVIDIA's margins get chopped down, and see the multiple re-rate lower from here. Actually, I'd argue the name is well on the way down this path already.
It's almost guaranteed to happen sooner or later. Semi down cycles are usually brutal for semi equities.
That's not to say it isn't a great company. It's certainly not a Buffett name though.
The cyclical stuff was the argument made for semis during the 2010s, when no one really gave a shit about semis. I think the game changed, but again, I do operate on faith, or in investor terms, conviction. The main evidence to me that the game has changed (well, other than AI being the most incredible piece of tech we've ever built) is that companies with no business making chip hardware are now interested in making chip hardware. That's not usually part of the cycle.
Anyway, maybe Marvell should focus more on the consumer side, since hobbyists seem to be building crazy AI rigs and likely need drives, at the very least, for models. It sort of seems like hobbyists are devouring any worthy GPU that gets produced.
Gaming is only 7% of Nvidia's revenue.
They are losing their biggest customers to custom in-house silicon, and smaller orders will have to compete with a market being flooded by superfluous hardware from companies that either went bust due to the AI bubble shrinking, or went bust because they weren't able to compete with the big fish.
That's not a trend yet. We're about to enter an era where most media is generated. Demand is only going to go up, and margins may not matter if volume goes up.
> The open question is long-term (>6yrs) durability. Hyperscalers (Google, Microsoft, Amazon, and Meta) are aggressively consolidating AI demand to become the dominant consumers of AI accelerators; while developing competitive, highly-credible chip efforts.
Hyperscalers aren't the only players building large GPU farms. There are large foundation model companies doing it too, and there are also new clouds that offer compute outside of the hyperscaler offerings (CoreWeave, Lambda, and dozens of others). Granted, these may be a drop in the bucket and hyperscalers may still win this trend.
But to your point, Disney is using GenAI in their new live-action Moana film. Presumably that'll drive lots of sales.
> Disney is using genAI in their new live action Moana film
If this is your bar then sure, but I think the interesting question is when/if we cross into a regime where mostly self-managed genAI is in true competition with the media you consume day to day, not something where hundreds of thousands of person-hours are being enhanced by genAI. I don't think there's any chance we see a total collapse of demand for this stuff, but I think the jury is still out on how valuable it truly is.
I'm sure it's non-zero, and I expect demand will continue to rise, but it may plateau or slow sooner than people think; I just don't think we can clearly say yet.
At some point one of these Nvidia doomers will be right but there is a long line of them who failed miserably.
The article explains that Nvidia's biggest customers (50% of datacenter revenue) are switching to their own hardware.
Google runs AI/HPC workloads on their own hardware and has been doing that for more than a decade. Google Gemini was trained on TPUs developed in-house. It does not run on Nvidia hardware.
https://www.tomshardware.com/tech-industry/artificial-intell...
https://finance.yahoo.com/news/apple-might-ai-game-1-1951003...
Before that, in a Wired article from 10 years ago about Siri and AI, one of the Apple higher-ups was quoted bragging about having one of the baddest GPU farms around (paraphrasing).
But their internal workloads and their frontier model (Gemini) run on TPUs.
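To make the TPU point concrete, here's a minimal sketch (my illustration, not from the thread; it assumes a JAX install with a TPU runtime attached, e.g. a Cloud TPU VM). The same JAX program gets compiled by XLA for whatever backend is present, with no CUDA anywhere in the path:

```python
# Minimal JAX sketch: XLA compiles the same program for the attached
# backend (TPU here, CPU/GPU elsewhere); no CUDA code path is involved.
# Assumes `pip install jax` plus a TPU runtime (e.g. a Cloud TPU VM).
import jax
import jax.numpy as jnp

print(jax.devices())  # e.g. [TpuDevice(id=0), ...] on a TPU host

@jax.jit
def matmul(a, b):
    return a @ b  # compiled by XLA for the available accelerator

a = jnp.ones((1024, 1024))
b = jnp.ones((1024, 1024))
print(matmul(a, b).sum())  # runs on TPU if attached, CPU otherwise
```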
The article seems focused more on stock price and where to bet, than the market for GPUs or generic hardware vendors.
There's a kernel of truth in it, but if I was McDonalds I'd care a lot more about what KFC is doing than the market trends of cast iron pans.
90% of Nvidia's revenue is from datacenters.
If the datacenters stop buying Nvidia's products, and use their own hardware instead, then Nvidia loses 90% of its revenue.
Nope; if, hypothetically, 100% left Nvidia, whether for their own hardware or to stop using GPUs at all, it'd be easy to say Nvidia would be last in the market.
NVIDIA is very strategic about building product to avoid commodification -- both by building out network effects where software is tied to their proprietary SDK libraries, and by always focusing on being at the cutting edge of product.
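As a hypothetical illustration of that lock-in (my sketch, not from the comment): an everyday training script is peppered with CUDA-only calls, so moving it to another vendor's accelerator means auditing every one of these lines.

```python
# Hypothetical sketch of CUDA lock-in in a typical PyTorch script.
# Each marked call assumes Nvidia's stack (CUDA/cuDNN); porting to a
# different accelerator means touching all of them. Requires an
# Nvidia GPU to actually run.
import torch

device = torch.device("cuda")  # CUDA-specific device selection
model = torch.nn.Linear(1024, 1024).to(device)
x = torch.randn(8, 1024, device=device)

# Mixed precision via the CUDA autocast path
with torch.autocast(device_type="cuda", dtype=torch.float16):
    y = model(x)
print(y.dtype)  # float16: the autocast path above is CUDA-specific

torch.cuda.synchronize()              # CUDA-only synchronization API
print(torch.cuda.get_device_name(0))  # queries the Nvidia driver
```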
Both these things can be true: a large company should try to build their own hardware to reduce supplier risk, and a large company should be open to suppliers that have better product that delivers business value.
So far, these large companies' internal hardware has been useful internally but never a complete replacement for NVIDIA, which keeps staying at the cutting edge of new capabilities.
NVIDIA already faced existential risk when Intel was commodifying all the dedicated motherboard components (like sound cards, etc.) in the late '90s and 2000s, so they're hyper-aware of this.
I don't know much about the rest, though.
Google uses their own hardware for AI/HPC. Nvidia hardware is offered on Google Cloud Platform for external customers who demand it.
The AI datacenter explosion has only occurred within the past two years. We are talking about plans that will take years to implement. The hyperscalers are trying to cut out Nvidia as soon as possible.
Not ideal for them but hardly a death blow
I figure that, regardless of tariffs, competition, and other fluctuations, the demand for computing power will endlessly trend upwards, and as much as we can produce will be consumed.
Good luck to you.
Personally, I have found NVIDIA to be one of the most hyped stocks I have ever seen, and it feels weird to take a position that it is under-hyped. That said, people aren't capable of beating the market consistently, so I would never invest based on this, or on any intuition or information I had.
alephnerd•5h ago
This is what will help protect Nvidia now that DC and cluster spend is cooling.
They own the ecosystem thanks to CUDA, Infiniband, NGC, NVLink, and other key tools. Now they should add additional applications (the AI Foundry is a good way to do that) or make forays into adjacent spaces like white-labeled cluster management.
Working on custom designs and consulting on custom GPU projects would also help by monetizing their existing design practice during slower markets.
Of course, Nvidia is starting to do both: Nvidia AI Foundry for the former, and, for the latter, a GPU architecture and design consulting practice, as announced at GTC and under McKinney.
scrlk•5h ago
Apart from Nintendo, who has successfully partnered with Nvidia? Apple, Microsoft and Sony have all been burnt in the past.
alephnerd•5h ago
Nvidia has started formalizing that last year [0], but it's a new muscle for them.
[0] - https://www.reuters.com/technology/nvidia-chases-30-billion-...
voidspark•4h ago
No, they do not. The article explains that Google, Amazon, Microsoft, and Meta are developing their own hardware and software for AI/HPC.
Google Gemini was not trained using CUDA or Nvidia hardware.
liuliu•4h ago
Chinese CSPs are the only ones who can develop their own hardware/software for AI/HPC.
voidspark•4h ago
https://www.reuters.com/technology/artificial-intelligence/m...
https://azure.microsoft.com/en-us/blog/azure-maia-for-the-er...
https://aws.amazon.com/ai/machine-learning/trainium/
liuliu•4h ago
Meta will not be able to produce a chip that can run GenAI workloads in the next 2 years.
Microsoft is doing a side quest, and they haven't even proved themselves with their FPGA and ARM server adventures.
Amazon is legit; they have done well with ARM servers, but Trainium is TBD, and how much they will pull back in a recession, given Jassy is a numbers guy, will be a question mark.
No need to discuss; we can just see in 2 years, when everything will be crystal clear.
solidasparagus•3h ago
I suspect the closed nature of the ecosystem will preclude them from winning as much as they could.
bee_rider•2h ago
There’s maybe some wiggle room, in that these AI distributed systems might not (?) look like HPC/scientific computing systems—maybe they don’t need Infiniband style low latency. So these other funky networks might work.
But like, Nvidia has the good nodes and the good network. That’s a rough combination to compete against.