https://bsky.app/profile/anthonycr.bsky.social/post/3lz7qtjy...
(pencil in another loop between Nvidia and OpenAI now)
so, the money flow is:
nvidia -> openAI -> oracle -> nvidia
looks like OpenAI is the linchpin on which the entire AI ecosystem rests.
It does seem like Satya believes models will get commoditized, so there's no need to hitch Microsoft to OpenAI that strongly.
https://www.reuters.com/business/microsoft-use-some-ai-anthr...
(Quick, inaccurate googling) says there will be "well over 1 million GPUs" by end of the year. With ~800 million users, that's 1 NVIDIA GPU per 800 people. If you estimate people are actively using ChatGPT 5% of the day (1.2 hours a day), you could say there's 1 GPU per 40 people in active use. Assuming consistent and even usage patterns.
That back of the envelope math isn't accurate, but interesting in the context of understanding just how much compute ChatGPT requires to operate.
Edit: I asked ChatGPT how many GPUs per user, and it spit out a bunch of calculations that estimates 1 GPU per ~3 concurrent users. Would love to see a more thorough/accurate break down.
With varying consumption/TDP, could be significantly more, could be significantly less, but at least it gives a starting figure. This doesn't account for overhead like energy losses, burst/nominal/sustained, system overhead, and heat removal.
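The napkin math above can be sketched out; these are the thread's own rough figures ("well over 1 million GPUs", ~800 million users, 5% active use), not verified numbers:

```python
# Back-of-envelope GPU-per-user estimate using the figures quoted above.
gpus = 1_000_000               # "well over 1 million GPUs" by end of year
users = 800_000_000            # ~800 million users

users_per_gpu = users / gpus                      # overall ratio
active_fraction = 0.05                            # assume 5% of the day (1.2 h) actively using it
active_users_per_gpu = users_per_gpu * active_fraction

print(users_per_gpu)           # 800.0 -> 1 GPU per 800 people
print(active_users_per_gpu)    # 40.0  -> 1 GPU per 40 people in active use
```

This assumes perfectly even usage, no batching effects, and no overhead, so treat it as an order-of-magnitude figure only.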
To be clear, I am comparing power consumption only. In terms of mining power, all these GPUs could only mine a negligible fraction of what all specialized Bitcoin ASIC mine.
Edit: some math I did out of sheer curiosity: a modern top-of-the-line GPU would mine BTC at about 10 Ghash/s (I don't think anyone tried but I wrote GPU mining software back in the day, and that is my estimate). Nvidia is on track to sell 50 million GPUs in 2025. If they were all mining, their combined compute power would be 500 Phash/s, which is 0.05% of Bitcoin's global mining capacity.
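The curiosity math above works out as follows; the 10 GH/s per GPU is the commenter's own guess, and the ~1 ZH/s global Bitcoin hashrate is an order-of-magnitude assumption:

```python
# Combined hashrate if every 2025 Nvidia GPU mined Bitcoin.
gpu_hashrate = 10e9            # 10 GH/s per modern GPU (commenter's estimate)
gpus_sold = 50e6               # Nvidia on track to sell 50 million GPUs in 2025

combined = gpu_hashrate * gpus_sold     # hashes per second
network = 1e21                 # ~1 ZH/s global Bitcoin mining capacity (assumed)

print(combined / 1e15)         # 500.0 -> 500 PH/s
print(combined / network * 100)  # 0.05 -> 0.05% of the global network
```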
The NVL72 is 72 chips and 120 kW total for the rack. If you throw in ~25 kW for cooling, it's pretty much exactly 2 kW each.
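The per-chip figure follows directly (the ~25 kW cooling overhead is the commenter's assumption):

```python
# Per-chip power for an NVL72 rack, including assumed cooling overhead.
rack_power_w = 120_000         # 120 kW for the rack's 72 chips
cooling_w = 25_000             # ~25 kW assumed for cooling
chips = 72

per_chip_w = (rack_power_w + cooling_w) / chips
print(per_chip_w)              # ~2013.9 W, i.e. pretty much exactly 2 kW each
```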
Google is pretty useful.
It uses >15 TWh per year.
Theoretically, AI could be more useful than that.
Theoretically, in the future, it could be the same amount of useful (or much more) with substantially less power usage.
It could be a short-term crunch to pull-forward (slightly) AI advancements.
Additionally, I'm extremely skeptical they'll actually turn on this many chips using that much energy globally in a reasonable time-frame.
Saying that you're going to make that kind of investment is one thing. Actually getting the power for it is easier said than done.
VC "valuations" are already a joke. They're more like minimum valuations. If OpenAI is worth anywhere near its current "valuation", Nvidia would be criminally negligent NOT to invest at a 90% discount (the marginal profit on their chips).
30 TWh per year is equivalent to an average power consumption of 3.4 GW for everything Google does. This partnership is 3x more energy intensive.
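The conversion from annual energy to average power is just division by the hours in a year:

```python
# Average power draw implied by 30 TWh/year, vs. the 10 GW partnership.
twh_per_year = 30
hours_per_year = 365 * 24      # 8760

avg_gw = twh_per_year * 1e12 / hours_per_year / 1e9
print(avg_gw)                  # ~3.42 GW average for everything Google does
print(10 / avg_gw)             # ~2.9 -> the 10 GW plan is roughly 3x as energy intensive
```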
Ultimately the difference in `real value/MWh` between these two must be many orders of magnitude.
[1] https://sustainability.google/reports/google-2025-environmen...
AI that could find a cure for cancer isn't the driving economic factor in LLM expansion, I don't think. I doubt cancer researchers are holding their breath on this.
I imagine this as a subtractive process starting with the maximum energy window.
Because if some card with more FLOPS comes available, and the market will buy all your FLOPS regardless, you just swap it in at constant y, for no appreciable change in how much you're spending to operate.
(I have no idea if y is actually much larger than x)
Therefore, they are listing in terms of the critical limit: power.
Personally, I expect this to blow up first in the faces of normal people who find they can no longer keep their phones charged or their apartments lit at night, and only then will the current AI investment bubble pop.
> to invest up to
i.e. 0 to something something
I know watts, but I really can't quantify this. How many Nvidia chips go into servers that consume 10 GW? Do they all use the same chip? If a newer chip consumes less, does the deal imply more servers? Did GPT write this post?
Also, the idea of a newer Nvidia card using less power is très amusant.
In the actual shady version of this, Company B isn’t the hottest AI investment around, it’s a shell company created by your brother’s cousin that isn’t actually worth what you’re claiming on the balance sheet because it was only created for the round tripping shell game.
Every time HackerNews talks about anything in the legal or finance realm, people trip over themselves to make arguments for why something a big tech company is doing is illegal. This is definitively neither illegal nor shady. If Nvidia believes, for example, that OpenAI can use their GPUs to turn a profit, then this is inherently positive-sum economically for both sides: OpenAI gets capital in the form of GPUs, uses them to generate tokens which they sell above the cost of that capital, and then returns some of the excess value to Nvidia. This is done via equity. It's a way for Nvidia to get access to some of the excess value of their product.
https://www.cnbc.com/2025/09/17/ai-startup-nscale-from-uk-is...
Also, investing in OpenAI means they get equity in return, which is not a worthless asset. There is actual mutually beneficial trade occurring.
It's a good time to gently remind everyone that there are a whole pile of legal things one can do to change how a security looks "by the numbers" and this isn't even close to the shadiest. Heck some sell-side research makes what companies themselves do look benign.
Two economists are walking in a forest when they come across a pile of shit.
The first economist says to the other “I’ll pay you $100 to eat that pile of shit.” The second economist takes the $100 and eats the pile of shit.
They continue walking until they come across a second pile of shit. The second economist turns to the first and says “I’ll pay you $100 to eat that pile of shit.” The first economist takes the $100 and eats a pile of shit.
Walking a little more, the first economist looks at the second and says, "You know, I gave you $100 to eat shit, then you gave me back the same $100 to eat shit. I can't help but feel like we both just ate shit for nothing."
"That's not true", responded the second economist. "We increased total revenue by $200!"
This kind of corporate behavior is bad and will end up hurting somebody. If we're lucky the fallout will only hurt Nvidia. More likely it will end up hurting most taxpayers.
... and we've seen this before in previous bubbles ...
Microsoft and Google have been doing it for decades. Probably, MS started that practice.
In the end, Nvidia will have OpenAI shares, which are valuable, and OpenAI will have GPUs, which are also valuable. It is not fake revenue, the GPUs will be made, sold at market price, and used, they are not intended to be bought back and sold to another customer. And hopefully, these GPUs will be put to good use by OpenAI so that they can make a profit, which will give Nvidia some return on investment.
It doesn't look so different from a car loan, where the dealer lends you the money so that you can buy their car.
"Good news everybody, your power bills are going up and your creaking, chronically underfunded infrastructure is even closer to collapse!"
https://www.newstarget.com/2025-08-02-texas-ai-data-centers-...
Execs ask their employees to return to office, because they don't know how to measure good employee output.
Now OpenAI and Nvidia measure success by gigawatt input into AI instead of successful business outcomes from AI.
The electric bills are getting out of hand.
I think the eventual AI bust will lead to the same thing, as the costs for developing a domain-specific model have cratered over the past couple years.
AI/ML (and the infra around it) is overvalued at current multiples, but the value created is real, and as the market grows to understand the limitations but also the opportunities, a more realistic and durable boom will occur.
It appears that the answer is "more likely yes than not".
Counting some examples:
- self driving / autonomous vehicles (seeing real deployments now with Waymo, but 99% deployment still ahead; meanwhile, $$$ billions of value destroyed in the last 10-15 years with so many startups running out of money, getting acquihired, etc)
- Humanoid robots... (potential bubble?? I don't know of a single commercial deployment today that is worth any solid revenues, but companies keep getting funded left / right)
I think you make a very interesting observation about these bubbles potentially being an inherent part of new technology expansion.
It makes sense too from a human behavior perspective. Whenever there are massive wins to be had, speculation will run rampant. Everyone wants to be the winner, but only a small fraction will actually win.
Solar does compete economically with methane already, and it's only going to improve even more.
Firstly, there is no such thing as an infinitely scaling system.
Secondly, because power transmission isn't moving freight. The infrastructure to move electricity long distances is extremely complicated. Even moving past basic challenges like transmission line resistance and voltage drop, power grids have to be synchronized in both phase and frequency. Phase instability is a real problem for transmission within hundreds of miles, let alone thousands upon thousands.
Also, that infrastructure is quite a bit more expensive to build than rail or even roads, and it's very maintenance hungry. An express-built piece of power transmission that goes direct from a desert solar farm to one of the coasts is just fragile centralization. You have a long chain of high-maintenance infrastructure, and a single point of failure makes the whole thing useless. So instead you go through the national grid, and end up with nothing, because all of that power is getting sucked up by everyone between you and the solar farm. It probably doesn't even make it out of the state it's being generated in.
BTW the vast majority of the cost of electricity is in the infrastructure, not its generation. Even a nuclear reactor is cheap compared to a large grid. New York city's collection of transmission lines, transformers, etc. (not even any energy generation infrastructure, just transmission) ballparks a couple hundred billion dollars. Maintenance is complex and extremely dangerous, which means the labor is $$$$. That's what you're paying for. That's why as we continue to move towards renewables price/watt will continue to go up, even though we're not paying for the expensive fuel anymore. The actual ~$60 million worth of fuel an average natural gas plant burns in a year pales in comparison to the billions a city spends making sure the electrons are happy.
If the coast-to-coast railways hadn't been built in the past, I don't think the US could build them today. There are too many parties who can now block big projects altogether or force the project to spend another 18 months proving that it should be allowed to move forward.
67% of new grid capacity in the US was solar in 2024 (a further 18% was batteries, 9% wind, and 6% for everything else). In the first half of 2025 that dropped to 56% solar, 26% batteries, 10% wind, and 8% everything else (gas). Source for numbers: https://seia.org/research-resources/solar-market-insight-rep...
I'm confused because if I assume each rack takes up 1 square meter I get a much smaller footprint: around 12 hectares or 17 football fields.
And that assumes that the installation is one floor. I don't know much about data centers but I would have thought they'd stack them a bit.
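The footprint estimate above can be sketched out; the ~80 kW per square meter and 1 m² per rack are the thread's assumptions, and "football field" here means a 105 m × 68 m pitch:

```python
# Floor area needed for 10 GW of racks, one floor, 1 m^2 per rack.
total_power_w = 10e9           # 10 GW
rack_power_w = 80e3            # ~80 kW per rack (assumed, per the linked calculation)

racks = total_power_w / rack_power_w   # 125,000 racks
area_m2 = racks * 1.0                  # assume 1 m^2 per rack
pitch_m2 = 105 * 68                    # one football pitch, ~7140 m^2

print(area_m2 / 10_000)        # 12.5 hectares
print(area_m2 / pitch_m2)      # ~17.5 football fields
```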
Am I the only person who had to look up how big Monaco was?
[1]: https://en.wikipedia.org/wiki/Monaco
[2]: https://www.wolframalpha.com/input?i=10+GW+%2F+%2880kw+%2F+m...
I generally like what’s been happening with AI but man this is gonna crash hard when reality sets in. We’re reaching the scary stage of a bubble where folks are forced to throw more and more cash on the fire to keep it going with no clear path to ever get that cash back. If anyone slows down, even just a bit, the whole thing goes critical and implodes.
The current SOTA is going to pale in comparison to what we have 10 years from now.
What advancements?
We have done a fabulous job at lowering power consumption while exponentially increasing density of cores and to a lesser extent transistors.
Delivering power to data centers was becoming a problem 20-ish years ago. Today, power density and heat generation are off the charts. Most data center owners are lowering per-rack system density to deal with the "problem".
There are literal projects pushing not only water cooling but refrigerant in the rack systems, in an attempt to get cooling to keep up with everything else.
The dot com boom and then Web 2.0 were fueled by Moore's law, by clock doubling, and then the initial wave of core density. We have run out of all of those tricks. The new steppings we're putting out have increased core densities but not lowered costs (because yields have been abysmal). Look at Nvidia's latest cores: they simply are not that much better in terms of real performance compared to previous generations. If the 60 series shows the same slack gains, then hardware isn't going to come along to bail out AI — which continues to demand MORE compute cycles (tokens on thinking, anyone?) rather than fewer with each generation.
Collapse might look a little like the dot com bubble (stock crashes, bankruptcies, layoffs, etc)
These hype cycles aren't even bad per se. There is lots of capital to test out lots of useful ideas. But only a fraction of those will turn out to be both useful and currently viable, and the readjustment will be painful
Bubble collapsing looks like enshittification of OpenAI tools as they try to raise revenues. It’ll ripple all throughout tech as everyone is tied into LLMs, and capital will be harder to come by.
This is totally false; NVDA has not done any stock offerings. The money is coming from the ungodly number of GPUs they are selling. In fact they are doing the opposite: they are buying back their stock because they have more money than they know what to do with.
Meanwhile NVDA stock is mildly up on this news, so the current owners of NVDA seem to like this investment. Or at least not hate it.
Agreed that we’ll see ad-enabled ChatGPT in about five minutes. What’s not clear is how easily we’ll be able to identify the ads.
Then consider we are about to lower interest rates and kick off the growth cycle again. The only way these valuations are going is way up for the foreseeable future
Why does monetizing OpenAI tools lead to bubble collapse? People are clearly willing to pay for LLMs
[0] dyson spheres are a joke / Angela Collier https://youtu.be/fLzEX1TPBFM
How much investment and prioritization in scaling laws is justified?
What is the source of the cash in steps 3, 4, and 7?
Disclaimer: I also have a small amount of money in vanguard IRA
I’m not saying there isn’t a bubble, but I am saying if the researchers and strategists absolutely closest to the “metal” of realtime frontier models are correct that AGI is in reach, then this isn’t a bubble, it’s a highly rational race. One that large players seem to be winning right now.
Inference services are wildly profitable. Currently companies believe it’s economically sensible to plow that money into R&D / Investment in new models through training.
For reference, oAI’s monthly revs are reportedly between $1b and $2b right now. Monthly. I think if you do a little napkin math you’ll see that they could be cashflow positive any time they wanted to.
Then my selling 2 dollars for 1 dollar is a wildly profitable business as well! Can't sell them fast enough!
Why does it seem like so many people have ceased to think critically?
Even if we assume this is true, the downstream customers paying for that inference also need to profit from it on average in order for the upstream model training to be sustainable, otherwise the demand for inference will dry up when the music stops. There won't always be a parade of over-funded AI startups burning $10 worth of tokens to make $1 in revenue.
I can maybe digest the fact that it helped prototype and ship a bit more code in a shorter time frame... but does that bring in enough new customers, or a higher-value product, to justify $100k a month?!
Right now, I assume more the former than the latter. But if you're an optimistic investor, I can see why one might think a few hundred billion dollars more might get us an AI that's close enough to the latter to be worth it.
Me, I'm mostly hoping that the bubble pops soon in a way I can catch up with what the existing models can already provide real help with (which is well short of an entire project, but still cool and significant).
* e.g. the tokens are bad financial advice that might as well be a repeat of SBF
** how many tokens would get you the next Minecraft?
The real question is what are we gonna do with all this cheap GPU compute when the bubble pops! Will high def game streaming finally have its time to shine? Will VFX outsource all of its render to the cloud? Will it meet the VR/AR hardware improvements in time to finally push the tech mainstream? Will it all just get re-routed back to crypto? Will someone come up with a more useful application of GPU compute?
Even if AI somehow bucks the trend and stops advancing in leaps? It's still on track to be the most impactful technology since smartphones, if not since the Internet itself. And the likes of Nvidia? They're the Cisco of AI infrastructure.
AI is here to stay, but the question is whether the players can accurately forecast the growth rate, or get too far ahead of it and get financially burnt.
Is there some (tax?) efficiency where OpenAI could take money from another source, then pay it to Nvidia, and receive GPUs. But instead taking investment from Nvidia acts as a discount in some way.
(In addition to Nvidia being realistically the efficient/sole supplier of an input OpenAI currently needs. So this gives
1. Nvidia an incentive to prioritize OpenAI and induces a win/win pricing component on Nvidia's GPU profit margin so OpenAI can bet on more GPUs now
2. OpenAI some hedge on GPU pricing's effect on their valuations as the cost/margin fluctuates with new entrants
)?
But now that there is a new SEC, they are doing a bunch of these deals. There is this one, which is huge. They also invested in Lambda, which is deploying gigawatt-scale datacenters of NVIDIA GPUs. And they are doing smaller deals too.
I'm asking because it's not just OpenAI that they are apparently doing this with; it's multiple other major GPU providers, like CoreWeave.
And it's just being done all out in the open? How?
I'm just surprised that nobody is yelling from the rooftops about practices that are so out in the open right now.
As an investor you may decide that round-tripping is dumb but in that case your recourse is to sell the stock.
I (as an uninformed rando) think that there are a lot of research ideas that have not been fully explored because doing a small training run costs $100k. If that drops to $1,000, there are a lot more opportunities to try new techniques.
That's where the belief that we are in a bubble comes from.
Inference has extremely different unit economics from a typical SaaS like Salesforce or adtech like google or facebook.
But the "round tripping" kind of makes sense. OpenAI is not listed, but if it was, some of the AI investment money would flow to it. So now, if you are an AI believer, NVidia is allocating some of that money for you.
By many different measures, we are at record valuations (though must be said, not P/E however). Tends not to end well. And housing prices are based on when mortgages were at 3% and have not reset accordingly. We are in everything bubble territory and have been.
Always keep in mind the old saying: pessimists get to be right and optimists get to be rich.
Nice metaphor! Huge bubbles usually get a historical name like "Tulip Craze" or "Dot Com Crash" and when this bubble bursts "House of Cards" is a good candidate.
The biggest difference here though is that most of these moves seem to involve direct investment and the movement of equity, not debt. I think this is an important distinction, because if things take a downturn debt is highly explosive (see GE during the GFC), whereas equity is not.
Not to say anyone wants to take a huge markdown on their equity, and there are real costs associated with designing, building, and powering GPUs which needs to be paid for, but Nvidia is also generating real revenue which likely covers that, I don't think they're funding much through debt? Tech tends to be very high margin so there's a lot of room to play if you're willing to just reduce your revenue (as opposed to taking on debt) in the short term.
Of course this means asset prices in the industry are going to get really tightly coupled, so if one starts to deflate it's likely that the market is going to wipe out a lot of value quickly and while there isn't an obvious debt bomb that will explode, I'm sure there's a landmine lying around somewhere...
> intends to invest up to xxx progressively
> preferred strategic compute and networking partner
> work together to co-optimize their roadmaps
> look forward to finalizing the details of this new phase of strategic partnership
I don't think I have seen so much meaningless corporate speak and so many outs in a public statement. "Yeah we'll maybe eventually do something cool".
In a sense, it's just an ask to public investors for added capital to do a thing, and evidently a number of investors found the pitch compelling enough.
Adding 10GW of offtake to any grid is going to cause significant problems and likely require CAPEX intensive upgrades (try buy 525kV dc cable from an established player and you are waiting until 2030+), as well as new generation for the power!
How many more would Anthropic, xAI, Google, and Microsoft add?
Having around 5% of an entire country's power infrastructure running AI hardware seems excessive, no?
If each human brain consumes ~20W then 10 GW is like 500 M people, that sounds like a lot of thinking. Maybe LLMs are moving in the complete opposite direction and at some point something else will appear that vaporizes this inefficiency making all of this worthless.
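The brain comparison above is simple division, using the commonly cited ~20 W figure for the human brain:

```python
# How many human-brain power budgets fit in 10 GW?
brain_w = 20                   # ~20 W per human brain
deal_w = 10e9                  # 10 GW

print(deal_w / brain_w)        # 5e8 -> the energy budget of ~500 million brains
```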
I don't know, just looking at insects like flies and all the information they manage to process with what I assume is a ridiculously small amount of energy suggests to me there must be a more efficient way to 'think', lol.
https://en.m.wikipedia.org/wiki/List_of_countries_by_electri...
Seriously, is there anyone in the media keeping unbiased tabs on how much we're spending on summarizing emails and making creatives starve a little more?
> Google is pretty useful. It uses 15 TWh per year.
15 TWh per year is about 1.7 GW.
Assuming the above figures, that means OpenAI and Nvidia's new plan will consume about 5.8 Googles' worth of power, by itself.
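Both numbers follow from the 15 TWh/year figure:

```python
# Google's average power draw, and how many "Googles" 10 GW represents.
google_twh = 15
hours_per_year = 365 * 24      # 8760

google_gw = google_twh * 1e12 / hours_per_year / 1e9
print(google_gw)               # ~1.71 GW average
print(10 / google_gw)          # ~5.8 "Googles" for the 10 GW plan
```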
At that scale, there's a huge opportunity for ultra-low-power AI compute chips (compared with current GPUs), and right now there are several very promising technology pathways to it.
Sharing an example would be nice. Of how much power reduction are we talking here?
I think even Byrne Hobart would agree (from his interview with Ben):
> Bubbles are this weird financial phenomenon where asset prices move in a way that does not seem justified by economic fundamentals. A lot of money pours into some industry, a lot of stuff gets built, and usually too much of it gets built and a bunch of people lose their shirts. A lot of very smart, sophisticated people are involved at the beginning, a lot of those people are selling at the peak, and a lot of people who are buying at the peak are less smart, less sophisticated, but they've been kind of taken in by the vibe and they're buying at the wrong time and they lose their shirts, and that's really bad.
This is a classic bubble. It starts, builds, and ends the same way. The technology is valuable, but it gets overbought/overproduced. Still no telling when it may pop, but remember asset values across many categories are rich right now and this could hurt.