
OpenAI and Nvidia announce partnership to deploy 10GW of Nvidia systems

https://openai.com/index/openai-nvidia-systems-partnership/
258•meetpateltech•2h ago•300 comments

A collection of technical things every software developer should know about

https://github.com/mtdvio/every-programmer-should-know
19•redbell•39m ago•5 comments

PlanetScale for Postgres is now GA

https://planetscale.com/blog/planetscale-for-postgres-is-generally-available
194•munns•3h ago•98 comments

Cloudflare is sponsoring Ladybird and Omarchy

https://blog.cloudflare.com/supporting-the-future-of-the-open-web/
431•jgrahamc•6h ago•268 comments

A board member's perspective of the RubyGems controversy

https://apiguy.substack.com/p/a-board-members-perspective-of-the
53•janpio•2h ago•27 comments

SWE-Bench Pro

https://github.com/scaleapi/SWE-bench_Pro-os
58•tosh•2h ago•12 comments

Qwen3-Omni: Native Omni AI Model for Text, Image & Video

https://github.com/QwenLM/Qwen3-Omni
14•meetpateltech•1h ago•1 comment

A simple way to measure knots has come unraveled

https://www.quantamagazine.org/a-simple-way-to-measure-knots-has-come-unraveled-20250922/
80•baruchel•4h ago•35 comments

The Beginner's Textbook for Fully Homomorphic Encryption

https://arxiv.org/abs/2503.05136
128•Qision•1d ago•21 comments

Mentra (YC W25) Is Hiring to build smart glasses

1•caydenpiercehax•2h ago

Cap'n Web: a new RPC system for browsers and web servers

https://blog.cloudflare.com/capnweb-javascript-rpc-library/
189•jgrahamc•5h ago•82 comments

Choose Your Own Adventure

https://www.filfre.net/2025/09/choose-your-own-adventure/
9•naves•42m ago•1 comment

Morgan and Morgan takes Disney to court over 'Steamboat Willie' in ads

https://www.clickorlando.com/news/local/2025/09/17/morgan-morgan-takes-disney-to-court-over-right...
44•wrayjustin•2d ago•25 comments

Easy Forth (2015)

https://skilldrick.github.io/easyforth/
150•pkilgore•7h ago•80 comments

CompileBench: Can AI Compile 22-year-old Code?

https://quesma.com/blog/introducing-compilebench/
101•jakozaur•6h ago•37 comments

What is algebraic about algebraic effects?

https://interjectedfuture.com/what-is-algebraic-about-algebraic-effects/
56•iamwil•4h ago•22 comments

Show HN: Python Audio Transcription: Convert Speech to Text Locally

https://www.pavlinbg.com/posts/python-speech-to-text-guide
4•Pavlinbg•46m ago•3 comments

A New Internet Business Model?

https://blog.cloudflare.com/cloudflare-2025-annual-founders-letter/
167•mmaia•3h ago•162 comments

Testing is better than Data Structures and Algorithms

https://nedbatchelder.com/blog/202509/testing_is_better_than_dsa.html
20•rsyring•2h ago•7 comments

Appleii Air Attack.BAS

https://basic-code.bearblog.dev/applesoft-air-attackbas/
4•ibobev•3d ago•0 comments

Beyond the Front Page: A Personal Guide to Hacker News

https://hsu.cy/2025/09/how-to-read-hn/
143•firexcy•9h ago•64 comments

SGI demos from long ago in the browser via WASM

https://github.com/sgi-demos
203•yankcrime•11h ago•53 comments

AI-Generated "Workslop" Is Destroying Productivity

https://hbr.org/2025/09/ai-generated-workslop-is-destroying-productivity
35•McScrooge•57m ago•7 comments

The Strange Tale of the Hotchkiss

https://www.edrdg.org/~jwb/mondir/hotchkiss.html
17•rwmj•1d ago•2 comments

The American Nations regions across North America

https://colinwoodard.com/new-map-the-american-nations-regions-across-north-america/
61•loughnane•3h ago•78 comments

Diffusion Beats Autoregressive in Data-Constrained Settings

https://blog.ml.cmu.edu/2025/09/22/diffusion-beats-autoregressive-in-data-constrained-settings/
4•djoldman•43m ago•1 comment

California issues historic fine over lawyer's ChatGPT fabrications

https://calmatters.org/economy/technology/2025/09/chatgpt-lawyer-fine-ai-regulation/
74•geox•2h ago•41 comments

Human-Oriented Markup Language

https://huml.io/
34•vishnukvmd•3h ago•34 comments

Biconnected components

https://emi-h.com/articles/bcc.html
41•emih•19h ago•13 comments

Dear GitHub: no YAML anchors, please

https://blog.yossarian.net/2025/09/22/dear-github-no-yaml-anchors
148•woodruffw•4h ago•117 comments

OpenAI and Nvidia announce partnership to deploy 10GW of Nvidia systems

https://openai.com/index/openai-nvidia-systems-partnership/
256•meetpateltech•2h ago

Comments

eagerpace•2h ago
Where is Apple? Even from an investment perspective.
rubyfan•2h ago
Being rationale.
fancyfredbot•2h ago
Rational.
newfocogi•2h ago
Maybe we're not sure if they're being rational or rationalizing.
brcmthrowaway•2h ago
Losing the race
richwater•2h ago
This is not something that can be won. The LLM architecture has been reaching its limitations slowly but surely. New foundational models are now being tweaked for user engagement rather than productive output.
gpm•2h ago
Right, but is the race to the pot of gold, or the stoplight (in which case by "losing" they save on gas)?
bertili•2h ago
Apple is doing fine, and often spends the same $100B in a year buying back its own stock.
threetonesun•1h ago
My MacBook Pro runs local models better than anything else in the house and I have not yet needed to install a small nuclear reactor to run it, so, I feel like they're doing fine.
me551ah•2h ago
So OpenAI is breaking up with Microsoft and Azure?
freedomben•2h ago
They've been sleeping with Oracle too recently, so I don't think they're breaking up, just dipping a toe in the poly pool
jsheard•2h ago
It's starting to resemble a Habsburg family tree at this point

https://bsky.app/profile/anthonycr.bsky.social/post/3lz7qtjy...

(pencil in another loop between Nvidia and OpenAI now)

bwfan123•33m ago
OpenAI had a deal with Oracle for $300B.

so, the money flow is:

nvidia -> openAI -> oracle -> nvidia

Looks like OpenAI is the linchpin on which the entire AI ecosystem is based.

sekai•1h ago
In true Bay Area fashion?
Handy-Man•2h ago
It was more like Microsoft refused to build the capacity OpenAI was asking for, so they gave them their blessing to buy additional compute from others.

It does seem like Satya believes models will get commoditized, so there's no need to hitch themselves to OpenAI that strongly.

FinnKuhn•1h ago
I would say Microsoft cheated on OpenAI first ;)

https://www.reuters.com/business/microsoft-use-some-ai-anthr...

mmmllm•1h ago
Are Anthropic and Google breaking up with Nvidia?
ddtaylor•2h ago
For someone who doesn't know what a gigawatt's worth of Nvidia systems is, how many high-end H100s or whatever does this get you? My estimates, along with some poor-grade GPT research, lead me to think it could be nearly 10 million? That does seem insane.
thrtythreeforty•2h ago
Safely in "millions of devices." The exact number depends on assumptions you make regarding all the supporting stuff, because typically the accelerators consume only a fraction of total power requirement. Even so, millions.
cj•1h ago
"GPUs per user" would be an interesting metric.

(Quick, inaccurate googling) says there will be "well over 1 million GPUs" by end of the year. With ~800 million users, that's 1 NVIDIA GPU per 800 people. If you estimate people are actively using ChatGPT 5% of the day (1.2 hours a day), you could say there's 1 GPU per 40 people in active use. Assuming consistent and even usage patterns.

That back of the envelope math isn't accurate, but interesting in the context of understanding just how much compute ChatGPT requires to operate.

Edit: I asked ChatGPT how many GPUs per user, and it spit out a bunch of calculations that estimate 1 GPU per ~3 concurrent users. Would love to see a more thorough/accurate breakdown.
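
A minimal Python sketch of the per-user arithmetic above, using the same rough inputs (both are guesses from this comment thread, not official figures):

  gpus = 1_000_000          # "well over 1 million GPUs" by year end (estimate)
  users = 800_000_000       # ~800 million ChatGPT users (estimate)
  active_fraction = 0.05    # assume each user is active ~5% of the day (~1.2 h)

  print(users / gpus)                    # 800.0 users per GPU
  print(users * active_fraction / gpus)  # 40.0 concurrently active users per GPU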

skhameneh•2h ago
Before reading your comment I did some napkin math using 600W per GPU: 10,000,000,000 / 600 = 16,666,666.66...

With varying consumption/TDP, could be significantly more, could be significantly less, but at least it gives a starting figure. This doesn't account for overhead like energy losses, burst/nominal/sustained, system overhead, and heat removal.
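
For what it's worth, a short Python sketch of this napkin math with a few assumed per-GPU power figures (the wattages are illustrative assumptions, not announced numbers):

  TOTAL_POWER_W = 10e9  # the announced 10 GW

  for label, watts_per_gpu in [
      ("600 W (accelerator only)", 600),
      ("1 kW (B100-class TDP)", 1_000),
      ("2 kW (system level, incl. cooling)", 2_000),
  ]:
      gpus_millions = TOTAL_POWER_W / watts_per_gpu / 1e6
      print(f"{label}: ~{gpus_millions:.1f}M GPUs")

  # 600 W -> ~16.7M, 1 kW -> ~10.0M, 2 kW -> ~5.0M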

kristjansson•1h ago
B200 is 1kW+ TDP ;)
iamgopal•2h ago
And how much is that as a percentage of Bitcoin network capacity?
cedws•1h ago
I'm also wondering what kind of threat this could be to PoW blockchains.
mrb•1h ago
Bitcoin mining consumes about 25 GW: https://ccaf.io/cbnsi/cbeci so this single deal amounts to about 40% of that.

To be clear, I am comparing power consumption only. In terms of mining power, all these GPUs could only mine a negligible fraction of what all specialized Bitcoin ASIC mine.

Edit: some math I did out of sheer curiosity: a modern top-of-the-line GPU would mine BTC at about 10 Ghash/s (I don't think anyone tried but I wrote GPU mining software back in the day, and that is my estimate). Nvidia is on track to sell 50 million GPUs in 2025. If they were all mining, their combined compute power would be 500 Phash/s, which is 0.05% of Bitcoin's global mining capacity.
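
A quick Python restatement of that estimate (the per-GPU hashrate and 2025 sales volume are the parent's own rough numbers; the network hashrate is an assumed ~1,000 EH/s ballpark):

  gpu_hashrate = 10e9        # ~10 Ghash/s per modern GPU (rough estimate)
  gpus_sold_2025 = 50e6      # ~50 million GPUs sold in 2025 (rough estimate)
  network_hashrate = 1e21    # ~1,000 Ehash/s Bitcoin network (assumed ballpark)

  combined = gpu_hashrate * gpus_sold_2025      # 5e17 hash/s = 500 Phash/s
  print(f"{combined / 1e15:.0f} Phash/s")
  print(f"{combined / network_hashrate:.2%} of the network")  # 0.05%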

ProofHouse•2h ago
How much cable (and what kind) to connect them all? That number would be 100x the number of GPUs. I would have thought they just clip onto metal racks with no cables, but then I saw the xAI data center with blue wire cables everywhere.
hbarka•1h ago
It was announced last week that Nvidia acqui-hired a company that can connect more than 100,000 GPUs together as a cluster that can effectively serve as a single integrated system.
ddtaylor•58m ago
Do you have a link or info?
kingstnap•1h ago
It's a ridiculous amount claimed, for sure. If it's 2 kW per GPU, it's around 5 million, and 1 to 2 kW is definitely the right ballpark at the system level.

The NVL72 is 72 chips at 120 kW total for the rack. If you throw in ~25 kW for cooling, it's pretty much exactly 2 kW each.

sandworm101•1h ago
At this scale, I would suggest that these numbers are for the entire data center rather than a sum of the processor demands. Also, the "infrastructure partnership" language suggests more than just compute. So I would add cooling into the equation, which could be as much as half the power load, or more, depending on where they intend to locate these datacenters.
awertjlkjl•1h ago
You could think of it as "as much power as is used by NYC and Chicago combined". Which is fucking insanely wasteful.
onlyrealcuzzo•1h ago
I dunno.

Google is pretty useful.

It uses >15 TWh per year.

Theoretically, AI could be more useful than that.

Theoretically, in the future, it could be the same amount of useful (or much more) with substantially less power usage.

It could be a short-term crunch to pull-forward (slightly) AI advancements.

Additionally, I'm extremely skeptical they'll actually turn on this many chips using that much energy globally in a reasonable time-frame.

Saying that you're going to make that kind of investment is one thing. Actually getting the power for it is easier said than done.

VC "valuations" are already a joke. They're more like minimum valuations. If OpenAI is worth anywhere near it's current "valuations", Nvidia would be criminally negligent NOT to invest at a 90% discount (the marginal profit on their chips).

Capricorn2481•1h ago
Does Google not include AI?
dns_snek•1h ago
According to Google's latest environmental report[1], that number was 30 TWh per year in 2024, but as far as I can tell that's the total consumption of their datacenters, which would include everything from Google Search to Gmail, YouTube, and every Google Cloud customer. Is it broken down by product somewhere?

30 TWh per year is equivalent to an average power consumption of 3.4 GW for everything Google does. This partnership is 3x more energy intensive.

Ultimately the difference in `real value/MWh` between these two must be many orders of magnitude.

[1] https://sustainability.google/reports/google-2025-environmen...

tmiku•1h ago
For other readers: "15 TWh per year" is equivalent to 1.71 GW, 17.1% of the "10GW" number used to describe the deal.
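
The unit conversion behind both figures, as a short Python sketch:

  HOURS_PER_YEAR = 8760

  def twh_per_year_to_gw(twh):
      """Average continuous power (GW) implied by an annual energy use (TWh)."""
      return twh * 1e12 / HOURS_PER_YEAR / 1e9

  print(twh_per_year_to_gw(30))  # ~3.4 GW (Google's reported 2024 datacenter total)
  print(twh_per_year_to_gw(15))  # ~1.7 GW, i.e. ~17% of the 10 GW deal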
jazzyjackson•1h ago
I mean, if 10GW of GPUs gets us AGI and we cure cancer, then that's cool, but I do get the feeling we're just getting uncannier chatbots and fully automated TikTok influencers.
junon•43m ago
This is also my take. I think a lot of people miss the trees for the forest (intentionally backward).

AI that could find a cure for cancer isn't the driving economic factor in LLM expansion, I don't think. I doubt cancer researchers are holding their breath on this.

yard2010•28m ago
Current LLMs are just like farms. Instead of tomatoes by the pound, you buy tokens by the pound. So it depends on the customers.
alphabetag675•1h ago
Account for around 3 MW for every 1,000 GPUs. So 10 GW is around 3,333 blocks of 3 MW, i.e. 3,333 * 1,000 GPUs, so around 3.33M GPUs.
gmm1990•2h ago
Strange unit of measurement. Who would find that more useful than expected compute or even just the number of chips.
credit_guy•2h ago
A point of reference is that the recently announced OpenAI-Oracle deal mentioned 4.5 GW. So this deal is more than twice as big.
leetharris•2h ago
Probably because you can't reliably predict how much compute this will lead to. Power generation is probably the limiting factor in intelligence explosion.
skhameneh•2h ago
I wouldn't be surprised if power consumption is a starting point due to things like permitting and initial load planning.

I imagine this as a subtractive process starting with the maximum energy window.

zozbot234•2h ago
It's a very useful reference point actually, because once you hit 1.21 GW the AI model begins to learn at a geometric rate and we finally get to real AGI. Last I heard, this was rumored as a prediction for AI 2027, so we're almost there already.
jsnell•1h ago
1.21GW is an absurd level of precision for this kind of prediction.
leptons•1h ago
It's from the movie "Back to the Future"
outside2344•1h ago
Is this a crafty reference to Back to the Future? If so I applaud you.
aprdm•1h ago
At large scale, a lot of it is measured in power instead of compute, as power is the limiting factor.
isoprophlex•1h ago
If a card costs x money, and operating it every year/whatever costs y money in electricity, and y >> x, it makes sense to mostly talk about the amount of electricity you are burning.

Because if some card with more FLOPS becomes available, and the market will buy all your FLOPS regardless, you just swap it in at constant y, for no appreciable change in how much you're spending to operate.

(I have no idea if y is actually much larger than x)

ben_w•20m ago
For a while now, it's been increasingly clear that the current AI boom's growth curve rapidly hits the limits of the existing electricity supply.

Therefore, they are listing in terms of the critical limit: power.

Personally, I expect this to blow up first in the faces of normal people who find they can no longer keep their phones charged or their apartments lit at night, and only then will the current AI investment bubble pop.

xnx•2h ago
What does this mean? "To support the partnership, NVIDIA intends to invest up to $100 billion in OpenAI progressively as each gigawatt is deployed."
jstummbillig•2h ago
I am confused as to what the question is.
solarexplorer•2h ago
That they will invest $10 in OpenAI for each W of Nvidia chips that is deployed? EDIT: In steps of 1 GW, it seems.
re-thc•2h ago
> What does this mean?

> to invest up to

i.e. 0 to something something

losteric•2h ago
So Nvidia's value is supported by the value of AI companies, which Nvidia then supports?
patapong•2h ago
Perhaps it means OpenAI will pay for the graphics cards in stock? Nvidia would become an investor in OpenAI, thereby moving up the AI value chain as well as ensuring demand for GPUs, while OpenAI would get millions of GPUs to scale their infrastructure.
vlovich123•2h ago
Nvidia is buying their own chips and counting it as a sale. In exchange they’re maybe getting OpenAI stock that will be worth more in the future. Normally this would count as illegally cooking the books I think but if the OpenAI investment pays off no one will care.
toomuchtodo•2h ago
What if it doesn't?
vlovich123•1h ago
Still unlikely they’d get prosecuted because they’re not trying to hide how they’re doing this and there’s no reasonable expectation that OpenAI is likely to fold. I doubt they’d improperly record this in their accounting ledger either.
nutjob2•1h ago
It's a good question since it's probably the 99% case.
dtech•2h ago
They're investing in kind. They're paying with chips instead of money
dsr_•2h ago
It means this is a bubble, and Nvidia is hoping that their friends in the White House will keep them from being prosecuted, or at least from substantial penalties.
mmmllm•1h ago
They will transfer the money to buy their own chips right before each chip is purchased
isodev•2h ago
> Strategic partnership enables OpenAI to build and deploy at least 10 gigawatts of AI datacenters with NVIDIA systems representing millions of GPUs

I know watts, but I really can’t quantify this. How much of Nvidia is there in the amount of servers that consume 10GW? Do they all use the same chip? What if there is a newer chip that consumes less; does the deal imply more servers? Did GPT write this post?

mr_toad•2h ago
You don’t need AI to write vague waffly press releases. But to put this in perspective an H100 has a TDP of 700 watts, the newer B100s are 1000 watts I think?

Also, the idea of a newer Nvidia card using less power is très amusant.

nick__m•1h ago
A 72-GPU NVL72 rack consumes up to 130 kW, so it's a little more than 5,500,000 GPUs.
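
A quick Python check of that rack-level figure (same numbers as above; ignores any non-rack overhead):

  TOTAL_POWER_W = 10e9    # 10 GW
  RACK_POWER_W = 130e3    # NVL72 rack at up to ~130 kW
  GPUS_PER_RACK = 72

  racks = TOTAL_POWER_W / RACK_POWER_W   # ~76,900 racks
  gpus = racks * GPUS_PER_RACK           # ~5.54 million GPUs
  print(f"{racks:,.0f} racks, ~{gpus / 1e6:.2f}M GPUs")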
hooloovoo_zoo•2h ago
These $ figures based on compute credits or the investor's own hardware seem pretty sketchy.
fufxufxutc•2h ago
In accounting terms, this is a shady business practice known as "round tripping" where you invest in a company for the sole purpose of them buying your product. It allows you to count your revenue multiple times.
klysm•2h ago
Is it counting revenue multiple times? It's buying your own products, really, but I'm not sure how that counts as double-counting revenue.
fufxufxutc•2h ago
The "investment" came from their revenue, and will be immediately counted in their revenue again.
weego•2h ago
In this case, if we're being strict, it seems the investment could then also show up as fixed assets on the same balance sheet.
lumost•2h ago
It's real revenue, but you are operating a fractional-reserve revenue operation. If the company you're investing in has trouble, or you have trouble, the whole thing falls over very fast.
rsstack•2h ago
Customer A pays you $100 for goods that cost you $10. You invest $100-$10=$90 in customer B so that they'll pay you $90 for goods that cost you $9. Your reported revenue is now $100+$90=$190, but the only money that entered the system is the original $100.
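
That toy example in Python, to make the bookkeeping explicit (all numbers are from the hypothetical above, not real financials):

  cash_entering_system = 100   # only Customer A's $100 ever comes in from outside
  reported_revenue = 0
  cost_of_goods = 0

  # Sale to Customer A: $100 revenue, $10 cost
  reported_revenue += 100
  cost_of_goods += 10

  # Invest the $90 gross profit in Customer B, who spends it on your goods
  reported_revenue += 90       # another $90 of revenue
  cost_of_goods += 9

  print(reported_revenue)      # 190
  print(cash_entering_system)  # 100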
FinnKuhn•1h ago
And your valuation also rises as a consequence of your increased revenue.
Aurornis•1h ago
Yes, but you’ve also incurred a $90 expense in purchasing the stock of Company B and that stock is on the balance sheet.

In the actual shady version of this, Company B isn’t the hottest AI investment around, it’s a shell company created by your brother’s cousin that isn’t actually worth what you’re claiming on the balance sheet because it was only created for the round tripping shell game.

creddit•1h ago
Except that this isn't round-tripping at all. Round-tripping doesn't result in a company actually incurring expenses to create more product. Round-tripping is the term for schemes that enable you to double count assets/revenue without any economic effects taking place.

Every time Hacker News talks about anything in the legal or finance realm, people trip over themselves to make arguments for why something a big tech company is doing is illegal. This is definitively neither illegal nor shady. If Nvidia believes, for example, that OpenAI can use their GPUs to turn a profit, then this is inherently positive-sum economically for both sides: OpenAI gets capital in the form of GPUs, uses them to generate tokens which they sell above the cost of that capital, and then returns some of the excess value to Nvidia. This is done via equity. It's a way for Nvidia to get access to some of the excess value of their product.

bob1029•36m ago
At some point one might simply argue that the nature and timing of these wildly fantastical press releases is tantamount to a "scheme to defraud".
selectodude•2h ago
This is some Enron shit. Let's see NVDA mark these profits to market. Keep the spice flowing.
FinnKuhn•2h ago
They for example did a similar deal with Nscale just last week.

https://www.cnbc.com/2025/09/17/ai-startup-nscale-from-uk-is...

Aurornis•1h ago
This is being done out in the open (we’re reading the press announcement) and will be factored into valuations.

Also, investing in OpenAI means they get equity in return, which is not a worthless asset. There is actual mutually beneficial trade occurring.

landl0rd•1h ago
Nvidia has consistently done this with CoreWeave and Nscale; really, most of its balance-sheet investments are like this. On the one hand there's a vaguely cogent rationale that they're a strategic investor, and it sort of makes sense as a hardware-for-equity swap; on the other, it's obviously goosing revenue numbers. This is a bigger issue when it's $100B than with previous investments.

It's a good time to gently remind everyone that there are a whole pile of legal things one can do to change how a security looks "by the numbers" and this isn't even close to the shadiest. Heck some sell-side research makes what companies themselves do look benign.

yannyu•1h ago
A relevant joke, paraphrased from the internet:

Two economists are walking in a forest when they come across a pile of shit.

The first economist says to the other “I’ll pay you $100 to eat that pile of shit.” The second economist takes the $100 and eats the pile of shit.

They continue walking until they come across a second pile of shit. The second economist turns to the first and says “I’ll pay you $100 to eat that pile of shit.” The first economist takes the $100 and eats the pile of shit.

Walking a little more, the first economist looks at the second and says, "You know, I gave you $100 to eat shit, then you gave me back the same $100 to eat shit. I can't help but feel like we both just ate shit for nothing."

"That's not true", responded the second economist. "We increased total revenue by $200!"

paxys•1h ago
The punchline is supposed to be GDP, but yeah, same concept.
hoosieree•1h ago
This should go without saying but unfortunately it really doesn't these days:

This kind of corporate behavior is bad and will end up hurting somebody. If we're lucky the fallout will only hurt Nvidia. More likely it will end up hurting most taxpayers.

Mistletoe•1h ago
Isn’t our stock market basically propped up by this house of cards of AI credits etc. right now?
rzerowan•1h ago
It's the same loop-de-loop NVIDIA is doing with CoreWeave, as I understand it: 'investing' in CoreWeave, which then 'buys' NVIDIA merch for cloud rental, resulting in CoreWeave being one of the top 4 customers of NVIDIA chips.
vessenes•1h ago
Wait, why the quotes? NVDA sends cash, and CoreWeave spends it, no? I don’t think the quotes are accurate if they imply these transactions aren’t real and material. At the end of the day, NVDA owns CoreWeave stock, and actual, you know, physical hardware is put into data centers, and cash is wired.
rsync•1h ago
"In accounting terms, this is a shady business practice known as "round tripping" where you invest in a company for the sole purpose of them buying your product. It allows you to count your revenue multiple times."

... and we've seen this before in previous bubbles ...

mandeepj•1h ago
> this is a shady business practice known as "round tripping" where you invest in a company for the sole purpose of them buying your product.

Microsoft and Google have been doing it for decades. Probably, MS started that practice.

GuB-42•1h ago
I don't really understand how it is round tripping.

In the end, Nvidia will have OpenAI shares, which are valuable, and OpenAI will have GPUs, which are also valuable. It is not fake revenue: the GPUs will be made, sold at market price, and used; they are not intended to be bought back and sold to another customer. And hopefully, these GPUs will be put to good use by OpenAI so that they can make a profit, which will give Nvidia some return on investment.

It doesn't look so different from a car loan, where the dealer lends you the money so that you can buy their car.

treis•49m ago
If OpenAI doesn't pan out, then Nvidia has worthless OpenAI stock and OpenAI has a pile of mostly useless GPUs.
dwaltrip•22m ago
That’s still not round tripping?
udkl•9m ago
It looks like NVIDIA is looking to move up the value chain to get a stake in the even higher-margin part of the addressable market instead of simply selling the tools.
bertili•2h ago
What's in it for Nvidia? At the recent $300B valuation, 25% equity?
searine•2h ago
I look forward to subsidizing this effort with my skyrocketing home power bill.
DebtDeflation•2h ago
Wouldn't Nvidia be better served investing the $100B in expanding GPU manufacturing capacity?
ecshafer•2h ago
By investing in TSMC? By buying TSMC? I don't think $100B would buy them enough current generation capacity to make a difference from scratch.
paxys•1h ago
They don't have to pick just one.
vessenes•59m ago
They’re already spending as much money as they possibly can on growth, and have no further use for cash currently - they’ve been doing share buybacks this year.
TheRealGL•2h ago
Did I miss the part where they mention the 10 large nuclear plants needed to power this new operation? Where's all the power coming from for this?
HDThoreaun•2h ago
Build this thing in the middle of the desert and you would need around 100 sq miles of solar panels + a fuck load of batteries for it to be energy independent. The solar farm would be around $10 billion, which is probably far less than the GPUs cost.
boringg•1h ago
Won't get you the necessary four 9s of uptime and energy, sadly. I'm still 100% for this -- but you need another model for energy delivery.
xnx•1h ago
Dissipating 10GW of heat is also a challenge in a sunny, hot, dry environment.
newyankee•1h ago
100 sq km should suffice
udkl•6m ago
$10 billion is small change compared to the estimated all-inclusive cost of $10 billion for EACH 500MW data center ... $200 billion for 10GW.
catigula•2h ago
Consumer electric grids.
delfinom•2h ago
Yep. Consumers are screwed and $500/month electric bills are coming for the average consumer within a year or two. We do not have the electricity available for this.
leptons•1h ago
I'm pretty average, living in a small home, and my electric bill is already >$500/mo in the summer, and that's with the A/C set at 76F during the day.
davis•1h ago
Exactly this. This is essentially a new consumer tax in your electric bill: the buildout of the grid is being put on consumers as a monthly surcharge through higher electricity costs. Everyone in the country is paying for the grid infrastructure to power these data centers owned by trillion-dollar companies who aren't paying for their own needs.
nutjob2•2h ago
Also, the fact that they announce not how much computing power they are going to deploy but rather how much electricity it's going to use (as if power usage is a useful measurement of processing power) is kind of gross.

"Good news everybody, your power bills are going up and your creaking, chronically underfunded infrastructure is even closer to collapse!"

hoosieree•1h ago
Also water. You will be rationed, OpenAI will not.

https://www.newstarget.com/2025-08-02-texas-ai-data-centers-...

perihelions•37m ago
That link is AI slop, ironically.
nitwit005•59m ago
I assumed this headline was not aimed at the public, but at some utility they want to convince to expand capacity. Otherwise, bragging about future power consumption seems a bit perplexing.
lumenwrites•2h ago
Yaay, one step closer to the torment nexus.
nh23423fefe•1h ago
low effort comment, whose content is a stale reference to other low effort memes
andreicaayoha•2h ago
pls
zuInnp•2h ago
Yeah, who cares about the environment... who needs water and energy if your AI agent can give you a better pep talk.
rlv-dan•53m ago
Don't forget about better filters for influencers talking about the climate crisis!
moduspol•2h ago
Waiting patiently for the Ed Zitron article on this...
nextworddev•1h ago
He single-handedly cost people more than anyone with his bearish takes lol
topaz0•49m ago
Or he saved them more than anyone by limiting their losses when it does finally crash
nextworddev•45m ago
except he called the top in 2023
gitremote•1h ago
When executives can't measure success by output, they measure success by input, a perverse incentive that rewards inefficiency.

Execs ask their employees to return to office, because they don't know how to measure good employee output.

Now OpenAI and Nvidia measure success by gigawatt input into AI instead of successful business outcomes from AI.

catigula•2h ago
Can we get some laws to force these companies to start subsidizing the consumer grids they're pummeling?

The electric bills are getting out of hand.

2OEH8eoCRo0•2h ago
What will happen if/when the AI bubble pops and there is far more grid capacity than demand? Power plant bailouts?
davis•1h ago
Load growth for the last 15 years has been very small, but load growth going forward is expected to rise due to electrification of all things to decarbonize the economy. This means home heating, electric cars, heavy industries, obviously data centers, and the list goes on. So even if we have more grid capacity than demand (this seems unlikely), it will be used before too long.
vmg12•1h ago
They would build their own power lines / grid if they could.
bananapub•1h ago
... why? the current (heh) situation is that they do these big announcements and then local/state governments around the US get in a bidding war to try to shift costs from the datacenter operator on to their own citizens, in addition to offloading all of the capex.
catigula•4m ago
No thanks, I'll just take subsidies to my bill.
aanet•1h ago
I'm old enough to remember when vendor financing was both de rigueur and also frowned upon... (1990s: telecom sector, with all big players like Lucent, Nortel, Cisco, indulging in it, ending with the bust of 2001/2002, of course)
alephnerd•1h ago
This absolutely feels like the Telco Bubble 2.0, and I've mentioned this on HN as well a couple times [0]

[0] - https://news.ycombinator.com/item?id=44069086

boringg•1h ago
For sure a great infrastructure build-out -- let's hope the leftovers are better energy infrastructure, so that whatever comes next in 7 years after the flame-out has some great stuff to build on (similar to telco bubble 1.0) and is less damaging to planet Earth in the long arc.
alephnerd•1h ago
Yep. The Telco Bust 1.0 along with the Dotcom Bust is what enabled the cloud computing boom, the SaaS boom, and the e-commerce boom by the early-mid 2010s.

I think the eventual AI bust will lead to the same thing, as the costs for developing a domain-specific model have cratered over the past couple years.

AI/ML (and the infra around it) is overvalued at current multiples, but the value created is real, and as the market grows to understand the limitations but also the opportunities, a more realistic and permanent boom will occur.

aanet•1h ago
Yeah - no doubt on the eventual productivity gains due to AI/ML (which are real, of course, just like the real gains due to telecom infra buildup), but must an economy go through a bubble first to realize these productivity gains??

It appears that the answer is "more likely yes than not".

Counting some examples:

- self driving / autonomous vehicles (seeing real deployments now with Waymo, but 99% deployment still ahead; meanwhile, $$$ billions of value destroyed in the last 10-15 years with so many startups running out of money, getting acquihired, etc)

- Humanoid robots... (potential bubble?? I don't know of a single commercial deployment today that is worth any solid revenues, but companies keep getting funded left / right)

Deegy•56m ago
Happened with the electrical grid too.

I think you make a very interesting observation about these bubbles potentially being an inherent part of new technology expansion.

It makes sense too from a human behavior perspective. Whenever there are massive wins to be had, speculation will run rampant. Everyone wants to be the winner, but only a small fraction will actually win.

boringg•42m ago
Everyone learned from the picks-and-shovels analogy of the past and the VC model (take market share with below-cost prices, then expand and raise prices), so equity has some really big demands baked in.
lawlessone•1h ago
Nvidia if you're listening give me 10K and i'll bu...*invest 10K+ 10 euro worth of cash in your product.
EcommerceFlow•1h ago
If Solar can't compete with natural gas economically, and subsidizing solar ends up de-incentivizing natural gas production by artificially lowering energy prices, what's the solution here?
henearkr•1h ago
Your question is weird.

Solar does compete economically with methane already, and it's only going to improve even more.

EcommerceFlow•1h ago
If true, why aren't we mass scaling it all over the American West? We have railways running from West -> East, why not include power lines that can take power from energy farms in the West -> East? No major project in AZ, TX, or CA to give a city free power? etc
henearkr•1h ago
It is massively scaling everywhere, and notably in Texas btw.
ux266478•47m ago
> We have railways running from West -> East, why not include power lines that can take power from energy farms in the West -> East?

Firstly, there is no such thing as an infinitely scaling system.

Secondly, because power transmission isn't moving freight. The infrastructure to move electricity long distances is extremely complicated. Even moving past basic challenges like transmission line resistance and voltage drop, power grids have to be synchronized in both phase and frequency. Phase instability is a real problem for transmission within hundreds of miles, let alone thousands upon thousands.

Also that infrastructure is quite a bit more expensive to build than rail or even roads, and it's very maintenance hungry. An express built piece of power transmission that goes direct from a desert solar farm to one of the coasts is just fragile centralization. You have a long chain of high-maintenance infrastructure, a single point of failure makes the whole thing useless. So instead you go through the national grid, and end up with nothing, because all of that power is getting sucked up by everyone between you and the solar farm. It probably doesn't even make it out of the state it's being generated in.

BTW the vast majority of the cost of electricity is in the infrastructure, not its generation. Even a nuclear reactor is cheap compared to a large grid. New York city's collection of transmission lines, transformers, etc. (not even any energy generation infrastructure, just transmission) ballparks a couple hundred billion dollars. Maintenance is complex and extremely dangerous, which means the labor is $$$$. That's what you're paying for. That's why as we continue to move towards renewables price/watt will continue to go up, even though we're not paying for the expensive fuel anymore. The actual ~$60 million worth of fuel an average natural gas plant burns in a year pales in comparison to the billions a city spends making sure the electrons are happy.

philipkglass•32m ago
Getting approval across multiple states for lines takes a very long time. The federal government and just about any state, municipality, or private land owner along the proposed route can block or delay it. The TransWest Express transmission line project started planning in 2007 but couldn't start construction until 2023, and it only needed to cross 4 states.

If the coast-to-coast railways hadn't been built in the past, I don't think the US could build them today. There are too many parties who can now block big projects altogether or force the project to spend another 18 months proving that it should be allowed to move forward.

gpm•7m ago
60% tariffs on solar components from China, an executive that is actively hostile to renewable energy, and you still are massively scaling it to some extent.

67% of new grid capacity in the US was solar in 2024 (a further 18% was batteries, 9% wind, and 6% for everything else). In the first half of 2025 that dropped to 56% solar, 26% batteries, 10% wind, and 8% everything else (gas). Source for numbers: https://seia.org/research-resources/solar-market-insight-rep...

zitterbewegung•1h ago
To put this into perspective, this datacenter would have the land area of Monaco (740 acres), assuming 80 kW per rack.
dguest•1h ago
Monaco is 2 km^2 [1].

I'm confused because if I assume each rack takes up 1 square meter I get a much smaller footprint: around 12 hectares or 17 football fields.

And that assumes that the installation is one floor. I don't know much about data centers but I would have thought they'd stack them a bit.

Am I the only person who had to look up how big Monaco was?

[1]: https://en.wikipedia.org/wiki/Monaco

[2]: https://www.wolframalpha.com/input?i=10+GW+%2F+%2880kw+%2F+m...
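
A short Python version of that footprint estimate (the 80 kW/rack and 1 m² per rack are this thread's assumptions; real facilities need far more space for aisles, cooling, and power gear):

  TOTAL_POWER_W = 10e9
  RACK_POWER_W = 80e3          # assumed 80 kW per rack, as above
  RACK_FOOTPRINT_M2 = 1.0      # assumed 1 square meter per rack

  racks = TOTAL_POWER_W / RACK_POWER_W   # 125,000 racks
  area_m2 = racks * RACK_FOOTPRINT_M2
  print(area_m2 / 10_000)                # 12.5 hectares
  print(area_m2 / 7_140)                 # ~17.5 "football fields" (~7,140 m^2 each)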

zitterbewegung•33m ago
My ChatGPT calculation is below. ChatGPT is factoring in the actual size of the building and a campus, and it gives a range of 340 to 740 acres. https://chatgpt.com/share/68d195ff-3528-8004-8418-ec462b1433...
vessenes•1h ago
So, basically a single BYD factory
oezi•37m ago
Monaco is so tiny it fits into Berlin's Tempelhofer Feld (a circular park inside the city).
ben_w•27m ago
I mean, you're not wrong, but Tempelhofer is also a former airport, so it had to be quite big. And since Brexit, Berlin is the biggest city in the EU.
JCM9•1h ago
This is throwing more cards on the house of cards. Nvidia is “investing” in OpenAI so OpenAI can buy GPUs from NVidia. Textbook “round tripping.”

I generally like what’s been happening with AI but man this is gonna crash hard when reality sets in. We’re reaching the scary stage of a bubble where folks are forced to throw more and more cash on the fire to keep it going with no clear path to ever get that cash back. If anyone slows down, even just a bit, the whole thing goes critical and implodes.

anothermathbozo•1h ago
What is this a bubble on? What does said bubble collapsing look like?
Drunkfoowl•1h ago
High end server gpus and AI roi expectations.
reactordev•1h ago
I think everyone is underestimating the advancements in wafer tech and server compute over the last decade. Easy to miss when it’s out of sight out of mind but this isn’t going anywhere but up.

The current SOTA is going to pale in comparison to what we have 10 years from now.

zer00eyz•1h ago
> I think everyone is underestimating the advancements in wafer tech and server compute over the last decade.

What advancements?

We have done a fabulous job at lowering power consumption while exponentially increasing density of cores and to a lesser extent transistors.

Delivering power to data centers was becoming a problem 20-ish years ago. Today, power density and heat generation are off the charts. Most data center owners are lowering per-rack system density to deal with the "problem".

There are literal projects pushing not only water cooling but refrigerant in the rack systems, in an attempt to get cooling to keep up with everything else.

The dot com boom and then Web 2.0 were fueled by Moore's law, by clock doubling, and then by the initial wave of core density. We have run out of all of those tricks. The new process steps we're putting out have increased core densities but not lowered costs (because yields have been abysmal). Look at Nvidia's latest cores: they simply are not that much better in terms of real performance when compared to previous generations. If the 60 series shows the same slack gains, then hardware isn't going to come along to bail out AI --- which continues to demand MORE compute cycles (tokens on thinking, anyone?) rather than less with each generation.

shawabawa3•1h ago
AI and tech companies

Collapse might look a little like the dot com bubble (stock crashes, bankruptcies, layoffs, etc)

wongarsu•1h ago
And it's worth reiterating that a bubble does not mean the technology is worthless. The dot com bubble collapsed despite the internet being a revolutionary technology that has shaped every decade since. Similarly LLMs are a great and revolutionary technology, but expectations, perception and valuations have grown much faster than what the technology can justify

These hype cycles aren't even bad per se. There is lots of capital to test out lots of useful ideas. But only a fraction of those will turn out to be both useful and currently viable, and the readjustment will be painful

HarHarVeryFunny•43m ago
Plus unused dark fiber = unused AI data centers and power generation capacity.
bitmasher9•1h ago
Nvidia is giving OpenAI money (through investment) to buy Nvidia chips. The bubble is that Nvidia got that money from its crazy-high stock price; the extra investment raises OpenAI's valuation, and the increased sales raise Nvidia's valuation. If the valuations see a correction, then spending like this will decrease, further decreasing valuations.

Bubble collapsing looks like enshittification of OpenAI tools as they try to raise revenues. It’ll ripple all throughout tech as everyone is tied into LLMs, and capital will be harder to come by.

drexlspivey•1h ago
> The bubble is that Nvidia got that money from its crazy high stock price,

This is totally false; NVDA has not done any stock offerings. The money is coming from the ungodly amount of GPUs they are selling. In fact they are doing the opposite: they are buying back their stock because they have more money than they know what to do with.

vessenes•1h ago
NVDA outstanding shares are down ~1.2% year over year; the company has been buying back its own shares with —>> profits <<— to the tune of tens of billions.

Meanwhile NVDA stock is mildly up on this news, so the current owners of NVDA seem to like this investment. Or at least not hate it.

Agreed that we’ll see ad-enabled ChatGPT in about five minutes. What’s not clear is how easily we’ll be able to identify the ads.

mountainriver•1h ago
Valuations won’t see a correction for the core players, I have no idea why people think that. Both of these companies are already money factories.

Then consider we are about to lower interest rates and kick off the growth cycle again. The only way these valuations are going is way up for the foreseeable future

babelfish•1h ago
> Bubble collapsing looks like enshittification of OpenAI tools as they try to raise revenues

Why does monetizing OpenAI tools lead to bubble collapse? People are clearly willing to pay for LLMs

bitmasher9•15m ago
You read this backwards. If the bubble collapses we will see OpenAI raise capital by increasing revenue instead of investment.
jononor•1h ago
I do not think the leveraging is going to end there. I suspect this will be used to justify/secure power generation investments, possibly even nuclear. Likely via one or more of the OpenAI/Altman adjacent power startups.
amluto•1h ago
On the bright side, if lots of power capacity is added and most of the GPUs end up idle, then there might be cheap power available for other uses.
holoduke•1h ago
And computing in general gets cheaper.
lawlessone•1h ago
heating our homes next winter with clusters of h100s
jazzyjackson•1h ago
Power generation is not a monolithic enterprise. If more supply is built than needed, certain suppliers will go bankrupt.
lucianbr•1h ago
What are the chances suppliers will go bankrupt but the plants get sold and still produce power?
ogaj•1h ago
They may, but that doesn’t mean that the capacity disappears. It may require some assumptions about USG willingness to backstop an acquisition but it’s not a significant leap to think that the generation capacity remains in (more capable?) hands.
mcny•27m ago
Speaking of capacity, what happened to all the "dark fiber" that was supposedly built for Internet 2 or whatever? The fiber doesn't go away just because a bubble burst, right?
NewJazz•1h ago
Not if Ellison trickles it out for maximum profit.
bobmcnamara•1h ago
Altman is all in on converting the solar system into a Dyson sphere to power OpenAI.
diimdeep•56m ago
And it is hilarious [0]

[0] dyson spheres are a joke / Angela Collier https://youtu.be/fLzEX1TPBFM

yibg•39m ago
Isn't that already happening via Oklo? Up 500%+ YTD.
resters•1h ago
At least the deal is denominated in watts rather than currency which may hyperinflate soon.
gdiamos•1h ago
It forces us to confront a question.

How much investment and prioritization in scaling laws is justified?

aldousd666•35m ago
Regardless of the scaling hypothesis, they need the compute to serve the models at scale.
Jayakumark•1h ago
reminds me of this image https://www.reddit.com/media?url=https%3A%2F%2Fi.redd.it%2F6...
paxys•1h ago
It would be amusing if it also wasn't so accurate.
lotsofpulp•34m ago
I didn’t see the step where Larry has to sell any stock, and hence puts downward price pressure on Oracle share prices.

What is the source of the cash in steps 3, 4, and 7?

truelson•29m ago
Ultimately, debt will fuel this. Oracle can't pay with cashflow.
mcny•29m ago
It is us, index fund owners :clown:

Disclaimer: I also have a small amount of money in vanguard IRA

wmf•28m ago
Credit.
tobias3•44m ago
Same thing with the 1.3 billion EUR investment of ASML into Mistral. ASML -> Mistral -> NVIDIA -> TSMC -> ASML -> ...
vessenes•1h ago
Almost every model trained by the majors has paid for itself with inference fees.

I’m not saying there isn’t a bubble, but I am saying if the researchers and strategists absolutely closest to the “metal” of realtime frontier models are correct that AGI is in reach, then this isn’t a bubble, it’s a highly rational race. One that large players seem to be winning right now.

mossTechnician•1h ago
Which of these model-making companies have posted a profit? I'm not familiar with any.
vessenes•55m ago
They account internally for each model separately; Dario said on Dwarkesh some time ago that they even think of each model as a separate company.

Inference services are wildly profitable. Currently companies believe it’s economically sensible to plow that money into R&D / Investment in new models through training.

For reference, oAI’s monthly revs are reportedly between $1b and $2b right now. Monthly. I think if you do a little napkin math you’ll see that they could be cashflow positive any time they wanted to.

mossTechnician•21m ago
But neither Anthropic nor OpenAI have ever posted a profit, have they?
jenkinomics•10m ago
Again with the "this is very profitable if you don't account for the cost of creating it?"

Then my selling 2 dollars for 1 dollar is a wildly profitable business as well! Can't sell them fast enough!

Why does it seem like so many people have ceased to think critically?

mountainriver•1h ago
The idea that it’s a bubble on the frontier model side is insane. AI assisted coding alone makes it the most valuable thing we’ve ever created.
switchers•23m ago
Get your head out of the proverbial, a bullshitting machine that lets some developers do things faster if they modify how they develop isn't even close to the most valuable thing we've ever created.
jsheard•1h ago
> Almost every model trained by the majors has paid for itself with inference fees.

Even if we assume this is true, the downstream customers paying for that inference also need to profit from it on average in order for the upstream model training to be sustainable, otherwise the demand for inference will dry up when the music stops. There won't always be a parade of over-funded AI startups burning $10 worth of tokens to make $1 in revenue.

Rover222•49m ago
My employer spends $100k/month or more on OpenAI fees. Money well spent, in both product features and developer process. This is just one fairly small random startup. Thousands of companies are spending this money and making more money because of it.
Rebuff5007•27m ago
Curious what makes you think the money is well spent.

I can maybe digest the fact that it helped prototype and ship a bit more code in a shorter time frame... but does that bring in enough new customers, or a higher-value product, to justify $100k a month?!

ben_w•40m ago
Tokens that can be purchased for $10 may or may not provide the purchaser with almost any dollar-denominated result, from negative billions* to positive billions**.

Right now, I assume more the former than the latter. But if you're an optimistic investor, I can see why one might think a few hundred billion dollars more might get us an AI that's close enough to the latter to be worth it.

Me, I'm mostly hoping that the bubble pops soon in a way I can catch up with what the existing models can already provide real help with (which is well short of an entire project, but still cool and significant).

* e.g. the tokens are bad financial advice that might as well be a repeat of SBF

** how many tokens would get you the next Minecraft?

sylario•47m ago
The thing is that AI researchers who are not focused only on LLMs do not seem to think it is in reach.
sindriava•21m ago
Demis Hassabis seems to think this, and not only does he not focus only on LLMs, he got a Nobel Prize for a non-LLM system ;)
SilverElfin•1h ago
I also recall reading that OpenAI is developing its own chips. What happened to that?
aldousd666•34m ago
I don't think the NVIDIA deal is an exclusive one... They can still use TPUs and GPUs and other cloud providers if they like. They may still be planning to.
pixelready•1h ago
The real question is not whether this is a bubble, since, as you mentioned, even if AI settles into somewhat useful, semi-mainstream tech, there is no way any of the likely outcomes can justify this level of investment.

The real question is what are we gonna do with all this cheap GPU compute when the bubble pops! Will high def game streaming finally have its time to shine? Will VFX outsource all of its render to the cloud? Will it meet the VR/AR hardware improvements in time to finally push the tech mainstream? Will it all just get re-routed back to crypto? Will someone come up with a more useful application of GPU compute?

halJordan•1h ago
AI is already semi-useful mainstream tech. There's a massive misunderstanding on this site (and other neo-luddite sites) that somehow there is no "long tail" of business applications being transformed into AI applications.
lawlessone•45m ago
any examples?
sindriava•17m ago
Current systems are already tremendously useful in the medical field. And I'm not talking about your doctor asking ChatGPT random shit, I'm saying radiology results processing, patient monitoring, monitoring of medication studies... The list goes on. Not to mention many of the research advances done using automated systems already, for example for weather forecasting.
ACCount37•1h ago
"The bubble will pop any minute now, any second, just you wait" is cope.

Even if AI somehow bucks the trend and stops advancing in leaps? It's still on track to be the most impactful technology since smartphones, if not since the Internet itself. And the likes of Nvidia? They're the Cisco of AI infrastructure.

ben_w•35m ago
The importance of the Internet didn't prevent the .com bubble from bursting.
HarHarVeryFunny•28m ago
The dot com bubble popped. It doesn't mean that the internet wasn't successful, just that people got way too excited about extrapolating growth rates.

AI is here to stay, but the question is whether the players can accurately forecast the growth rate, or get too far ahead of it and get financially burnt.

big_toast•1h ago
Is this more of an accounting thing?

Is there some (tax?) efficiency where OpenAI could take money from another source, pay it to Nvidia, and receive GPUs, but instead taking investment from Nvidia acts as a discount in some way?

(In addition to Nvidia being realistically the efficient/sole supplier of an input OpenAI currently needs. So this gives

  1. Nvidia an incentive to prioritize OpenAI and induces a win/win pricing component on Nvidia's GPU profit margin so OpenAI can bet on more GPUs now

  2. OpenAI some hedge on GPU pricing's effect on their valuations as the cost/margin fluctuates with new entrants
)?
wmf•26m ago
It sounds like Nvidia has so much cash already that they would prefer to own x% of OpenAI instead.
jedberg•1h ago
It's interesting how deals like this are politically relevant. Nvidia refused to do deals like this (investing in companies buying large amounts of NVIDIA GPUs) after they got the hammer from Biden's SEC for self dealing due to their investment in Coreweave.

But now that there is a new SEC, they are doing a bunch of these deals. There is this one, which is huge. They also invested in Lambda, who is deploying Gigawatt scale datacenters of NVIDIA GPUs. And they are doing smaller deals too.

throwaway667555•58m ago
It's not round-tripping. Economically, Nvidia is investing property in OpenAI. It's not investing nothing, far from it.
glitchc•51m ago
Does this mean they pay for it through consumer GPU sales?
wmf•22m ago
No, those are a drop in the bucket.
stale2002•50m ago
Does anyone in the finance business know how legal this all is? I am hearing terms like "round tripping" being thrown around: a practice where a company sells and buys back its own product to artificially inflate revenue.

I'm asking because it's not just OpenAI that they are apparently doing this with; it's also multiple other major GPU providers, like CoreWeave.

And it's just being done all out in the open? How?

yard2010•37m ago
How I see it - the people with the money make the rules, why would they make rules against themselves?
stale2002•33m ago
Yes, but my point is that this almost feels like an Enron case. Things were fine, until they weren't. And then in retrospect the fraud is obvious.

I'm just surprised that nobody is yelling to the rooftops about practices that are just so out in the open right now.

wmf•20m ago
IANAL but you can do pretty much anything as long as it's disclosed. The only problem with round-tripping is doing it secretly.

As an investor you may decide that round-tripping is dumb but in that case your recourse is to sell the stock.

dummydummy1234•49m ago
My thought is: think of all the really cheap compute that will be available to researchers. Sure, it will crash, but at the end of the day there will be a huge glut of GPUs that datacenters will be trying to rent out near cost.

I (as an uninformed rando) think that there are a lot of research ideas that have not been fully explored because doing a small training run costs $100k. If that drops to $1,000, then there are a lot more opportunities to try new techniques.

btown•18m ago
Following this thread to its logical conclusion: it would be a tremendous dramatic irony if, between the suppression of federal grants in the U.S., immigration restrictions on high-skilled researchers, and remarkably low cost of GPU resources if/when this alleged bubble bursts... the lion's share of the economic benefits of post-bubble-burst compute-accelerated research (from drug discovery to weather modeling to simulation-based chemistry) would accrue to those very countries that the current administration has been attempting to prevent from taking "unfair" advantage of U.S. infrastructure investment.
zitterbewegung•46m ago
I do think about this: they are creating a money-printing / cash-burning cycle where OpenAI keeps on doing raises and Nvidia can get more sales...
radium3d•42m ago
Really curious how xAI is working out financially. Grok blows me away for coding.
radium3d•39m ago
It's interesting how profitable Tesla is despite the huge investments in their AI training infrastructure. They seem to be one of the best positioned companies that can maintain enough profitability to be able to afford their AI infrastructure without issue.
Rebuff5007•42m ago
Sure, but its going to be a great plot for the movie that comes out in five years.
ivape•39m ago
You know you can sell inferencing at near 100% margins, right? More, even.
radu_floricica•28m ago
But real GPUs are being built, installed and used. It's not paper money, it's just buying goods and services partly with stock. Which is a very solid and time honored tradition which happens to align incentives very well.
mgh95•20m ago
What revenues do these GPUs generate for OpenAI? OpenAI is not currently profitable, and it is unclear if its business model will ever become profitable -- let alone profitable enough to justify this investment. Currently, this only works because the markets are willing to lend and let NVIDIA issue stock to cover the costs of manufacturing the GPUs.

That's where the belief that we are in a bubble comes from.

davedx•12m ago
OpenAI generates plenty of revenue from their services. Don't conflate revenue with profit.
mgh95•8m ago
I don't believe I am. Investors (value investors, not pump and dump investors) provide capital to companies on the expectation of profit, not revenue.
humanizersequel•12m ago
They're doing about a billion per month in revenue by running proprietary models on GPUs like these. Unless they're selling inference with zero/negative margin, it seems like a business model that could be made profitable very easily.
mgh95•8m ago
Revenue != profit, and you don't need negative gross margins to be net unprofitable: expensive researchers, expensive engineers, expensive capex, etc.

Inference has extremely different unit economics from a typical SaaS like Salesforce or adtech like Google or Facebook.

elorant•26m ago
Well, I hope it crashes so we can get back to normal GPU prices.
chairmansteve•26m ago
I agree that it is a bubble.

But the "round tripping" kind of makes sense. OpenAI is not listed, but if it were, some of the AI investment money would flow to it. So now, if you are an AI believer, Nvidia is allocating some of that money for you.

truelson•26m ago
Going to leave this link here: https://www.hussmanfunds.com/comment/mc250814/

By many different measures we are at record valuations (though, it must be said, not on P/E). That tends not to end well. And housing prices are still anchored to when mortgages were at 3% and have not reset accordingly. We are in everything-bubble territory and have been for a while.

rapsey•19m ago
People are quite bearish and the stock market is making all-time highs. This is actually a very good sign, because we are far from any euphoria.

Always keep in mind the old saying: pessimists get to be right and optimists get to be rich.

thfuran•17m ago
Only if they're optimistic at the right time and not the wrong one.
mrandish•23m ago
> throwing more cards on a house of cards.

Nice metaphor! Huge bubbles usually get a historical name like the "Tulip Craze" or the "Dot-Com Crash", and when this bubble bursts, "House of Cards" is a good candidate.

jama211•15m ago
I just hope it works out like the dot-com crash did in the long run: the internet kept going and kept delivering real value; it just needed a big market reset when the bubble popped.
neilv•8m ago
Oh, I see now: house of cards (the usual meaning) + throwing more cards on (like throwing money on the fire, and also how you destabilize a house of cards) + GPU cards in this case (even though they're not necessarily cards). I like it.
andai•19m ago
I think you have just described the global economy.
kelvinjps•15m ago
They're just buying from and investing in each other?
jama211•11m ago
Perhaps I should short OpenAI… would you try it?
crowcroft•8m ago
It seems similar to how GE under Jack Welch would use its rock-solid financials to take on low-cost debt that it could lend out to customers who needed financing to purchase its products.

The biggest difference here, though, is that most of these moves seem to involve direct investment and the movement of equity, not debt. I think this is an important distinction, because if things take a downturn, debt is highly explosive (see GE during the GFC) whereas equity is not.

Not to say anyone wants to take a huge markdown on their equity, and there are real costs associated with designing, building, and powering GPUs which need to be paid for. But Nvidia is also generating real revenue that likely covers that; I don't think they're funding much through debt. Tech tends to be very high margin, so there's a lot of room to play if you're willing to just reduce your revenue (as opposed to taking on debt) in the short term.

Of course, this means asset prices in the industry are going to get really tightly coupled, so if one starts to deflate, the market is likely to wipe out a lot of value quickly, and while there isn't an obvious debt bomb that will explode, I'm sure there's a landmine lying around somewhere...

eitally•8m ago
I'm not saying you're wrong, but with Nvidia pulling back from DGX Cloud, it makes sense that they'd continue to invest in their strategic partners (whether it's software companies like OpenAI or infrastructure vendors like CoreWeave).
belter•6m ago
Same as what they are doing with CoreWeave. In a sane world the SEC would do something, but we are past that. It would be like Boeing opening an airline and selling airplanes to itself.
mrcwinn•1h ago
Very foolish of them not to leverage SoftwareFPU. And with minimal effort Performas are rackable.
paxys•1h ago
> letter of intent for a landmark strategic partnership

> intends to invest up to xxx progressively

> preferred strategic compute and networking partner

> work together to co-optimize their roadmaps

> look forward to finalizing the details of this new phase of strategic partnership

I don't think I have seen so much meaningless corporate speak and so many outs in a public statement. "Yeah we'll maybe eventually do something cool".

BHSPitMonkey•30m ago
NVDA's share price enjoyed a nice $6 bump today, so the announcement did what it was supposed to do.

In a sense, it's just an ask to public investors for added capital to do a thing, and evidently a number of investors found the pitch compelling enough.

paxys•19m ago
An increase in share price doesn't provide extra cash to a company. They'd have to issue new shares for that.
hshshshshsh•4m ago
But the company owns its own stock, right? So it can sell those shares, no?
labrador•1h ago
For scale: the 1960s-era US Navy submarine I served on had a 78 MW reactor, so 10 GW is about 128 nuclear submarines.
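For anyone who wants to sanity-check that ratio, a quick back-of-envelope in Python (a rough sketch using only the 78 MW figure above):

    # How many 78 MW submarine reactors add up to the announced 10 GW?
    deal_power_w = 10e9       # 10 GW from the announcement
    sub_reactor_w = 78e6      # reactor rating quoted above
    print(deal_power_w / sub_reactor_w)   # ~128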
dguest•1h ago
A typical reactor core is 1 GW, so it's also one rather big nuclear power plant.
Muromec•1h ago
More like two (and a half).
Muromec•1h ago
Or just ten very safe RBMK reactors rated at 1 GW each (they can't explode).
fragmede•1h ago
You almost got me. RBMKs had a problem with a large positive void coefficient that was buried by the Soviet Union, which led to Chernobyl.
fusionadvocate•44m ago
The control rods with graphite on the tips were the cherry on top...
gehsty•1h ago
Some more context: nuclear power stations can be up to 2 GW, offshore wind farms seem to be plateauing at ~1.5 GW, and individual turbines in operation now are ~15 MW. Grids are already strained, and 525 kV DC systems can transmit ~2 GW of power per cable bundle…

Adding 10 GW of offtake to any grid is going to cause significant problems and will likely require capex-intensive upgrades (try to buy 525 kV DC cable from an established player and you are waiting until 2030+), as well as new generation to supply the power!
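To make that concrete, a rough sketch in Python using only the figures above (illustrative, not an engineering estimate):

    # Rough counts implied by the numbers above for a 10 GW offtake
    offtake_gw = 10
    print(offtake_gw / 2)            # ~5 of the largest nuclear stations, or 525 kV DC cable bundles
    print(offtake_gw / 1.5)          # ~7 plateau-sized offshore wind farms
    print(offtake_gw * 1000 / 15)    # ~667 of today's 15 MW turbines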

vessenes•1h ago
Yeah, the path forward here is going to be Apple-like vertical supply-chain integration. There is absolutely no spare capacity on the electrical infrastructure side right now, at least in the US.
wongarsu•57m ago
And there is great cost-saving potential in vertical integration. Distribution and transmission are huge costs. If you can build a data center right next to a power plant and just take all of its power, you get much better prices. It's not trivial with the kinds of bursty loads that seem typical of AI data centers, but if you can engineer your way to a steady load (or at least one steady enough that traditional grid-smoothing techniques work), you can get a substantial advantage.
gehsty•53m ago
I don't think that's possible with large-scale power infrastructure, specifically because grid infrastructure is so tightly regulated. The closest thing I'm aware of was TSMC buying the output of an entire offshore wind farm for 25 years (the largest power purchase agreement ever - TSMC / Ørsted)… maybe also Microsoft restarting nuclear power plants, or Google taking offshore wind sites as they come out of contract (but nothing at the 10 GW scale).
rlv-dan•56m ago
In the long run, perhaps this will give us a better power grid, just like the dot-com bubble gave rise to broadband?
Recursing•53m ago
For a better sense of scale: it's about 2% of the average US electricity consumption, and about the same as the average electricity consumption of the Netherlands (18 million people)
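Rough arithmetic behind that, assuming ~4,000 TWh/year of US consumption and ~110 TWh/year for the Netherlands (ballpark public figures, not from this thread):

    # Convert annual consumption to average power and compare with 10 GW
    HOURS_PER_YEAR = 8760
    us_avg_gw = 4_000_000 / HOURS_PER_YEAR    # GWh per year -> average GW, ~457
    nl_avg_gw = 110_000 / HOURS_PER_YEAR      # ~12.6
    print(10 / us_avg_gw)    # ~0.022 -> roughly 2% of average US load
    print(10 / nl_avg_gw)    # ~0.8  -> in the same ballpark as the Netherlands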
tonyhart7•39m ago
Wtf, and this is from one company.

How much more would Anthropic, xAI, Google, and Microsoft add on top of that?

Having around 5% of an entire country's power infrastructure devoted to AI hardware seems excessive, no???

bcrosby95•24m ago
Around 5% in the next 5 years for AI alone sounds pretty in line with projections I've seen.
udkl•16m ago
For another sense of scale: a 500 MW AI-centric datacenter could cost $10 billion or more to build. So 10 GW is $200 billion!
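That's just linear scaling of the figure above; as a sketch (treating $10B per 500 MW as given):

    # Scale the assumed $10B-per-500 MW build cost to 10 GW
    cost_per_mw = 10e9 / 500          # ~$20M per MW
    print(cost_per_mw * 10_000)       # 10 GW -> ~$2e11, i.e. ~$200 billion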
HarHarVeryFunny•26m ago
A big power station of any type is ~1 GW. Nuclear is slow to build, so I'd have to guess natural gas.
melenaboija•18m ago
This still blows my mind.

If each human brain consumes ~20 W, then 10 GW is like 500 million people's worth, which sounds like a lot of thinking. Maybe LLMs are moving in completely the wrong direction, and at some point something else will appear that vaporizes this inefficiency, making all of this worthless.

I don't know; just looking at insects like flies, and all the information they manage to process with what I assume is a ridiculously small amount of energy, suggests to me there must be a more efficient way to 'think', lol.
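The arithmetic for that comparison, taking the ~20 W brain figure at face value:

    # 10 GW expressed in ~20 W human brains (very rough; brains and GPUs aren't really comparable)
    print(10e9 / 20)    # 5e8 -> about 500 million brain-equivalents of power draw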

sindriava•15m ago
We know for a fact that current LLMs are massively inefficient; this is not a new thing. But every optimization you make will let you run more inference on this hardware. There's no reason for efficiency gains to make the buildout meaningless, any more than more efficient cars made roads obsolete.
pera•1h ago
10 gigawatts sounds ridiculously high; how can you estimate the actual usage? I guess they are not running at capacity 24/7, right? Because that would be more than the consumption of several European countries, such as Finland and Belgium:

https://en.m.wikipedia.org/wiki/List_of_countries_by_electri...

rawgabbit•1h ago
Does this affect OpenAI’s renegotiation of their deal with Microsoft?
lvl155•1h ago
We are definitely closer to the top in this market. Do people even realize what they're predicting in terms of energy use? It's going to be wasteland territory sooner than people think.
webdevver•33m ago
it's going to go 10x from where it is now
cainxinth•1h ago
Next Year: OpenAI announces it is seeking funding for a Dyson Sphere
BHSPitMonkey•28m ago
The production rate of ~~paperclips~~ tokens isn't growing quickly enough!
pmdr•1h ago
If I had shovels to sell, I'd definitely announce a strategic partnership to have a huge quarry dug by hand.

Seriously, is there anyone in the media keeping unbiased tabs on how much we're spending on summarizing emails and making creatives starve a little more?

jlokier•57m ago
onlyrealcuzzo wrote:

> Google is pretty useful. It uses 15 TWh per year.

15 TWh per year is about 1.7 GW of average power.

Assuming the above figures, OpenAI and Nvidia's new plan will consume about 5.8 Googles' worth of power by itself.

At that scale, there's a huge opportunity for ultra-low-power AI compute chips (compared with current GPUs), and right now there are several very promising technology pathways to it.
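The conversion, for anyone checking (a sketch using only the 15 TWh/year figure quoted above):

    # 15 TWh/year expressed as continuous average power, compared with the 10 GW plan
    HOURS_PER_YEAR = 8760
    google_avg_gw = 15_000 / HOURS_PER_YEAR    # GWh/year -> GW, ~1.7
    print(google_avg_gw)
    print(10 / google_avg_gw)                  # ~5.8 "Googles" of continuous power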

lukan•44m ago
" there's a huge opportunity for ultra-low-power AI compute chips (compared with current GPUs), and right now there are several very promising technology pathways to it"

Sharing an example would be nice. How much of a power reduction are we talking about here?

hangonhn•54m ago
Anyone else find it fascinating that gigawatts, a unit of power, are the metric used for this deal?
iphone_elegance•52m ago
fancy stock buyback lol
truelson•39m ago
Ben Thompson and Doug O'Laughlin ( https://stratechery.com/2025/the-oracle-inflection-point-app... (paywall), https://www.fabricatedknowledge.com/p/capital-cycles-and-ai ) are calling it a bubble, largely because we've entered the phase of the cycle where cash flows aren't paying for it, but debt is (see Oracle: they won't be able to pay for their investment with cash flow).

I think even Byrne Hobart would agree (from his interview with Ben): -- Bubbles are this weird financial phenomenon where asset prices move in a way that does not seem justified by economic fundamentals. A lot of money pours into some industry, a lot of stuff gets built, and usually too much of it gets built and a bunch of people lose their shirts and a lot of very smart, sophisticated people are involved with the beginning, a lot of those people are selling at the peak, and a lot of people who are buying at the peak are less smart, less sophisticated, but they’ve been kind of taken in by the vibe and they’re buying at the wrong time and they lose their shirts, and that’s really bad. --

This is a classic bubble. It starts, builds, and ends the same way. The technology is valuable, but it gets overbought and overbuilt. There's still no telling when it may pop, but remember that asset values across many categories are rich right now, and this could hurt.

dboreham•35m ago
Perpetual Money Machine
softwaredoug•33m ago
How will this actually be powered? Just seems like we’re powering an ecological disaster.
seydor•26m ago
We are way past peak LLM and it shows. They are basically advertising space heating as if it's some sort of advancement, while the tech seems to have stagnated and they're just making the horses faster. The market should have punished this.
jama211•12m ago
There will be a great market correction soon. Long term, though, it'll still have some value, much like how the internet remained useful after the dot-com crash. I hope.
danielfalbo•24m ago
And the bubble keeps bubble-ing
gloxkiqcza•20m ago
I'm really curious how this affects the consumer GPU market over the next few years. Sure, there has been a GPU shortage for a few years now, but if this continues, there should eventually be an absolute surplus of obsolete-generation enterprise GPUs flooding the market, right? Any ideas what limitations and benefits these cards might have for an enthusiast?
jama211•14m ago
These systems aren't easily converted into desktop-style GPUs, so they may not trickle down the way we hope.
wmf•8m ago
I assume surplus DGX A100 systems are already out there, but they consume kilowatts, so enthusiasts can't even plug them in.
agentultra•12m ago
Water is a critical resource in dwindling supply in many water-stressed regions. These data centers have been known to suck up water supplies during active droughts. Is there anyone left at the EPA who gets a say in how we manage water for projects like this?