frontpage.

Demystifying Agentic Memory

https://alexspyropoulos.com/posts/demystifying-agentic-memory/
1•alexspyr•2m ago•0 comments

How I Vibe Coding? (Sept 2025 Edition)

https://xuanwo.io/2025/06-how-i-vibe-coding-sept-2025-edition/
1•xuanwo•3m ago•0 comments

Model literals, semantic aliases, and preference-aligned routing for LLMs

https://docs.archgw.com/guides/llm_router.html
1•honorable_coder•4m ago•1 comment

Market design can feed the poor

https://worksinprogress.co/issue/how-market-design-can-feed-the-poor/
1•zdw•5m ago•1 comment

Automate User Interviews with AI

https://theproductfeedbackcompany.com/
1•bobcoi•5m ago•1 comment

AMD Ryzen AI Max+ "Strix Halo" Performance with ROCm 7.0

https://www.phoronix.com/review/amd-rocm-7-strix-halo
2•rbanffy•6m ago•0 comments

How Samin Nosrat Learned to Love the Recipe

https://www.newyorker.com/culture/persons-of-interest/how-samin-nosrat-learned-to-love-the-recipe
1•mitchbob•6m ago•1 comment

Canon updates a PowerShot with higher price and fewer features

https://m.dpreview.com/news/9212403257/canon-powershot-360-hs-a-announcement
2•PaulHoule•7m ago•0 comments

Show HN: Technical Interview for an Open Source Team (Grove Engineering)

https://github.com/orgs/buildwithgrove/discussions/456
1•Olshansky•7m ago•0 comments

Europe's cookie law messed up the internet. Brussels wants to fix it

https://www.politico.eu/article/europe-cookie-law-messed-up-the-internet-brussels-sets-out-to-fix...
2•c420•9m ago•2 comments

Ask HN: Which was the first 32bit DOS game?

1•DrNosferatu•9m ago•0 comments

Ready or not, the digital afterlife is here

https://www.nature.com/articles/d41586-025-02940-w
1•XzetaU8•9m ago•0 comments

Open-source security analysis with Gemini CLI

https://github.com/gemini-cli-extensions/security
1•evanotero•9m ago•0 comments

Aldrin Cycler Burials

1•Xorakios•11m ago•0 comments

Legal Lullabies: Narrations of tech giants' terms of service

https://www.zzzuckerberg.com/
1•gaws•11m ago•0 comments

Effect Systems vs. Print Debugging: A Pragmatic Solution

https://blog.flix.dev/blog/effect-systems-vs-print-debugging/
2•degurechaff•13m ago•0 comments

Tri Dao on Unsupervised Learning Podcast

https://youtu.be/xlSaoP0b90A?si=_HsLmZ3Vy1M37tdX
1•mdunnoconnor•13m ago•0 comments

Saga Distributed Transactions Pattern

https://learn.microsoft.com/en-us/azure/architecture/patterns/saga
1•mooreds•13m ago•0 comments

Elizabeth Stone on what's next for Netflix – and streaming itself

https://techcrunch.com/2025/09/22/elizabeth-stone-on-whats-next-for-netflix-and-streaming-itself-...
1•mathattack•13m ago•0 comments

Qwen3-Omni – the first natively AI unifying text, image, audio and video

https://twitter.com/Alibaba_Qwen/status/1970181599133344172
1•amrrs•15m ago•0 comments

OpenAI to launch ChatGPT for teens with parental controls

https://www.cnbc.com/2025/09/16/openai-chatgpt-teens-parent.html
1•gmays•15m ago•0 comments

Optical Chip Beats Counterparts in AI Power Efficiency 100 Fold

https://www.allaboutcircuits.com/news/optical-chip-beats-counterparts-in-ai-power-efficiency-100-...
1•giuliomagnifico•16m ago•0 comments

Ask HN: How much of your code is AI writing?

1•bmau5•17m ago•1 comment

Qwen3-Omni: Native Omni AI Model for Text, Image & Video

https://github.com/QwenLM/Qwen3-Omni
2•meetpateltech•17m ago•0 comments

AI Image Animator: Animate Any Image into Video Online

https://aiimageanimator.net/
2•wukongfine•17m ago•0 comments

Crypto theft booms to a record amid kidnappings, Bybit hack

https://www.cnbc.com/2025/07/17/crypto-theft-hits-record-in-2025.html
1•paulpauper•18m ago•0 comments

U.S. Seniors Lost More Money to Scammers in 2024 Than You Think

https://www.vice.com/en/article/u-s-seniors-lost-more-money-to-scammers-in-2024-than-you-think/
3•paulpauper•18m ago•0 comments

Qwen3-Omni

https://huggingface.co/collections/Qwen/qwen3-omni-68d100a86cd0906843ceccbe
1•speedyboi•19m ago•0 comments

Qwen-Image-Edit-2509

https://huggingface.co/Qwen/Qwen-Image-Edit-2509
2•speedyboi•19m ago•0 comments

Topological fingerprints for audio identification (2023)

https://arxiv.org/abs/2309.03516
1•wslh•19m ago•0 comments

OpenAI and Nvidia announce partnership to deploy 10GW of Nvidia systems

https://openai.com/index/openai-nvidia-systems-partnership/
171•meetpateltech•1h ago

Comments

eagerpace•1h ago
Where is Apple? Even from an investment perspective.
rubyfan•1h ago
Being rationale.
fancyfredbot•1h ago
Rational.
newfocogi•1h ago
Maybe we're not sure if they're being rational or rationalizing.
brcmthrowaway•1h ago
Losing the race
richwater•1h ago
This is not something that can be won. The LLM architecture has been reaching its limits, slowly but surely. New foundational models are now being tweaked for user engagement rather than productive output.
gpm•1h ago
Right, but is the race to the pot of gold, or the stoplight (in which case by "losing" they save on gas)?
bertili•1h ago
Apple is doing fine, and often spends the same $100B in a year buying back its own stock.
threetonesun•6m ago
My MacBook Pro runs local models better than anything else in the house, and I have not yet needed to install a small nuclear reactor to run it, so I feel like they're doing fine.
me551ah•1h ago
So OpenAI is breaking up with Microsoft and Azure?
freedomben•1h ago
They've been sleeping with Oracle too recently, so I don't think they're breaking up, just dipping a toe in the poly pool
jsheard•1h ago
It more closely resembles a Habsburg family tree at this point

https://bsky.app/profile/anthonycr.bsky.social/post/3lz7qtjy...

(pencil in another loop between Nvidia and OpenAI now)

sekai•33m ago
In true Bay Area fashion?
Handy-Man•1h ago
It was more like Microsoft refused to build the capacity OpenAI was asking for, so they gave them their blessing to buy additional compute from others.

It does seem like Satya believes models will get commoditized, so there's no need to hitch themselves to OpenAI that strongly.

FinnKuhn•54m ago
I would say Microsoft cheated on OpenAI first ;)

https://www.reuters.com/business/microsoft-use-some-ai-anthr...

mmmllm•14m ago
Are Anthropic and Google breaking up with Nvidia?
ddtaylor•1h ago
For someone who doesn't know what a gigawatt worth of Nvidia systems is, how many high-end H100s (or whatever) does this get you? My estimates, along with some poor-grade GPT research, lead me to think it could be nearly 10 million? That does seem insane.
thrtythreeforty•1h ago
Safely in "millions of devices." The exact number depends on assumptions you make regarding all the supporting stuff, because typically the accelerators consume only a fraction of total power requirement. Even so, millions.
cj•54m ago
"GPUs per user" would be an interesting metric.

(Quick, inaccurate googling) says there will be "well over 1 million GPUs" by end of the year. With ~800 million users, that's 1 NVIDIA GPU per 800 people. If you estimate people are actively using ChatGPT 5% of the day (1.2 hours a day), you could say there's 1 GPU per 40 people in active use. Assuming consistent and even usage patterns.

That back of the envelope math isn't accurate, but interesting in the context of understanding just how much compute ChatGPT requires to operate.

Edit: I asked ChatGPT how many GPUs per user, and it spit out a bunch of calculations that estimate 1 GPU per ~3 concurrent users. Would love to see a more thorough/accurate breakdown.
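
A minimal sketch of that per-user estimate in Python, using the rough figures assumed above (about 1 million GPUs, ~800 million users, ~5% active time); none of these are official numbers:

    # all inputs are assumptions from the comment above
    gpus = 1_000_000          # "well over 1 million GPUs" by end of year
    users = 800_000_000       # ~800 million users
    active_fraction = 0.05    # each user active ~5% of the day

    users_per_gpu = users / gpus                            # ~800
    active_users_per_gpu = users_per_gpu * active_fraction  # ~40 concurrently active
    print(f"~{users_per_gpu:.0f} users per GPU, "
          f"~{active_users_per_gpu:.0f} concurrently active users per GPU")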

skhameneh•1h ago
Before reading your comment I did some napkin math using 600W per GPU: 10,000,000,000 / 600 = 16,666,666.66...

With varying consumption/TDP, could be significantly more, could be significantly less, but at least it gives a starting figure. This doesn't account for overhead like energy losses, burst/nominal/sustained, system overhead, and heat removal.
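
A similar sketch of the count-from-power math, sweeping a few assumed per-GPU power figures (the 600 W card-only number above, a ~1 kW B100-class TDP, and ~2 kW at the system level):

    total_power_w = 10e9      # 10 GW
    assumptions = [
        ("600 W per GPU (card only)", 600),
        ("1,000 W per GPU (B100-class TDP)", 1_000),
        ("2,000 W per GPU (system level, incl. cooling)", 2_000),
    ]
    for label, watts in assumptions:
        print(f"{label}: ~{total_power_w / watts / 1e6:.1f} million GPUs")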

kristjansson•34m ago
B200 is 1kW+ TDP ;)
iamgopal•1h ago
And how much is that as a percentage of Bitcoin network capacity?
cedws•1h ago
I'm also wondering what kind of threat this could be to PoW blockchains.
mrb•51m ago
Bitcoin mining consumes about 25 GW: https://ccaf.io/cbnsi/cbeci so this single deal amounts to about 40% of that.

To be clear, I am comparing power consumption only. In terms of mining power, all these GPUs could only mine a negligible fraction of what all specialized Bitcoin ASIC mine.

Edit: some math I did out of sheer curiosity: a modern top-of-the-line GPU would mine BTC at about 10 Ghash/s (I don't think anyone tried but I wrote GPU mining software back in the day, and that is my estimate). Nvidia is on track to sell 50 million GPUs in 2025. If they were all mining, their combined compute power would be 500 Phash/s, which is 0.05% of Bitcoin's global mining capacity.
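
The same estimate as a small Python sketch; the per-GPU hashrate and the network hashrate are rough assumptions, not measured values:

    gpu_hashrate_hs = 10e9       # ~10 GH/s per modern GPU (author's guess)
    gpus_sold_2025 = 50e6        # ~50 million GPUs sold in 2025
    network_hashrate_hs = 1e21   # Bitcoin network, on the order of 1,000 EH/s

    combined_hs = gpu_hashrate_hs * gpus_sold_2025   # 5e17 H/s = 500 PH/s
    print(f"~{combined_hs / 1e15:.0f} PH/s, "
          f"or {combined_hs / network_hashrate_hs:.2%} of the network")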

ProofHouse•1h ago
How much cable (and what kind) would it take to connect them all? That number would be 100x the number of GPUs. I would have thought they just clip onto metal racks with no cables, but then I saw the xAI data center with blue wire cables everywhere.
hbarka•33m ago
It was announced last week that Nvidia acqui-hired a company that can connect more than 100,000 GPUs together as a cluster that can effectively serve as a single integrated system.
ddtaylor•45s ago
Do you have a link or info?
kingstnap•58m ago
It's a ridiculous amount claimed, for sure. If it's 2 kW per GPU, it's around 5 million, and 1 to 2 kW is definitely the right ballpark at a system level.

The NVL72 is 72 chips and 120 kW total for the rack. If you throw in ~25 kW for cooling, it's pretty much exactly 2 kW each.
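
That rack-level math as a short sketch, using the NVL72 figures quoted above plus the assumed ~25 kW of cooling:

    rack_it_load_kw = 120     # NVL72 rack, 72 GPUs
    cooling_kw = 25           # assumed cooling overhead per rack
    gpus_per_rack = 72

    kw_per_gpu = (rack_it_load_kw + cooling_kw) / gpus_per_rack   # ~2.0 kW
    total_gpus = 10e6 / kw_per_gpu                                # 10 GW expressed in kW
    print(f"~{kw_per_gpu:.2f} kW per GPU -> ~{total_gpus / 1e6:.1f} million GPUs")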

boringg•51m ago
What's the time frame?
sandworm101•34m ago
At this scale, I would suggest that these numbers are for the entire data center rather than a sum of the processor demands. Also, the "infrastructure partnership" language suggests more than just compute. So I would add cooling into the equation, which could be as much as half the power load, or more depending on where they intend to locate these datacenters.
awertjlkjl•29m ago
You could think of it as "as much power as is used by NYC and Chicago combined". Which is fucking insanely wasteful.
onlyrealcuzzo•24m ago
I dunno.

Google is pretty useful.

It uses >15 TWh per year.

Theoretically, AI could be more useful than that.

Theoretically, in the future, it could be the same amount of useful (or much more) with substantially less power usage.

It could be a short-term crunch to pull-forward (slightly) AI advancements.

Additionally, I'm extremely skeptical they'll actually turn on this many chips using that much energy globally in a reasonable time-frame.

Saying that you're going to make that kind of investment is one thing. Actually getting the power for it is easier said than done.

Capricorn2481•18m ago
Does Google not include AI?
dns_snek•7m ago
According to Google's latest environmental report[1] that number was 30 TWh per year in 2024, but as far as I can tell that's the total consumption of their data centers, which would include everything from Google Search to Gmail, YouTube, and every Google Cloud customer. Is it broken down by product somewhere?

30 TWh per year is equivalent to an average power consumption of 3.4 GW for everything Google does. This partnership is 3x more energy intensive.

Ultimately the difference in `real value/MWh` between these two must be many orders of magnitude.

[1] https://sustainability.google/reports/google-2025-environmen...

tmiku•6m ago
For other readers: "15 TWh per year" is equivalent to 1.71 GW, 17.1% of the "10GW" number used to describe the deal.
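
The conversion both comments are doing, spelled out (annual energy in TWh divided by hours in a year gives average power in GW):

    HOURS_PER_YEAR = 24 * 365   # ~8,760
    for label, twh_per_year in [("~15 TWh/yr (figure quoted above)", 15),
                                ("~30 TWh/yr (Google data centers, 2024)", 30)]:
        avg_gw = twh_per_year * 1_000 / HOURS_PER_YEAR   # TWh -> GWh, then / hours
        print(f"{label}: ~{avg_gw:.2f} GW average, vs. 10 GW for this deal")
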
jazzyjackson•18m ago
I mean, if 10GW of GPUs gets us AGI and we cure cancer then that's cool, but I do get the feeling we're just getting uncannier chatbots and fully automated TikTok influencers
alphabetag675•25m ago
Account for around 3 MW for every 1,000 GPUs. 10 GW / 3 MW ≈ 3,333 blocks of 1,000 GPUs, so around 3.33M GPUs.
gmm1990•1h ago
Strange unit of measurement. Who would find that more useful than expected compute, or even just the number of chips?
credit_guy•1h ago
A point of reference is that the recently announced OpenAI-Oracle deal mentioned 4.5 GW. So this deal is more than twice as big.
leetharris•1h ago
Probably because you can't reliably predict how much compute this will lead to. Power generation is probably the limiting factor in intelligence explosion.
skhameneh•1h ago
I wouldn't be surprised if power consumption is a starting point due to things like permitting and initial load planning.

I imagine this as a subtractive process starting with the maximum energy window.

zozbot234•1h ago
It's a very useful reference point actually because once you hit 1.21 GW the AI model begins to learn at a geometric rate and we finally get to real AGI. Last I've heard this was rumored as a prediction for AI 2027, so we're almost there already.
jsnell•51m ago
1.21GW is an absurd level of precision for this kind of prediction.
leptons•39m ago
It's from the movie "Back to the Future"
outside2344•43m ago
Is this a crafty reference to Back to the Future? If so I applaud you.
aprdm•58m ago
At large scales a lot of it is measured on power instead of compute, as power is the limitation
isoprophlex•49m ago
If a card costs x money, and operating it every year/whatever costs y money in electricity, and y >> x, it makes sense to mostly talk about the amount of electricity you are burning.

Because if some card with more FLOPS becomes available, and the market will buy all your FLOPS regardless, you just swap it in at constant y, for no appreciable change in how much you're spending to operate.

(I have no idea if y is actually much larger than x)

xnx•1h ago
What does this mean? "To support the partnership, NVIDIA intends to invest up to $100 billion in OpenAI progressively as each gigawatt is deployed."
jstummbillig•1h ago
I am confused as to what the question is.
solarexplorer•1h ago
That they will invest $10 in OpenAI for each watt of NVIDIA chips that is deployed? EDIT: In steps of 1GW, it seems.
re-thc•1h ago
> What does this mean?

> to invest up to

i.e. 0 to something something

losteric•1h ago
So Nvidia's value is supported by the value of AI companies, which Nvidia then supports?
patapong•1h ago
Perhaps it means OpenAI will pay for the graphics cards in stock? Nvidia would become an investor in OpenAI, thereby moving up the AI value chain as well as ensuring demand for GPUs, while OpenAI would get millions of GPUs to scale their infrastructure.
vlovich123•1h ago
Nvidia is buying their own chips and counting it as a sale. In exchange they’re maybe getting OpenAI stock that will be worth more in the future. Normally this would count as illegally cooking the books I think but if the OpenAI investment pays off no one will care.
toomuchtodo•1h ago
What if it doesn't?
vlovich123•1h ago
Still unlikely they’d get prosecuted because they’re not trying to hide how they’re doing this and there’s no reasonable expectation that OpenAI is likely to fold. I doubt they’d improperly record this in their accounting ledger either.
nutjob2•1h ago
It's a good question since it's probably the 99% case.
dtech•1h ago
They're investing in kind. They're paying with chips instead of money
dsr_•1h ago
It means this is a bubble, and Nvidia is hoping that their friends in the White House will keep them from being prosecuted, or at least from substantial penalties.
mmmllm•13m ago
They will transfer the money to buy their own chips right before each chip is purchased
isodev•1h ago
> Strategic partnership enables OpenAI to build and deploy at least 10 gigawatts of AI datacenters with NVIDIA systems representing millions of GPUs

I know watts, but I really can't quantify this. How much of Nvidia is there in the number of servers that consume 10GW? Do they all use the same chip? What if there is a newer chip that consumes less, does the deal imply more servers? Did GPT write this post?

mr_toad•1h ago
You don't need AI to write vague, waffly press releases. But to put this in perspective, an H100 has a TDP of 700 watts; the newer B100s are 1000 watts, I think?

Also, the idea of a newer Nvidia card using less power is très amusant.

nick__m•59m ago
A 72-GPU NVL72 rack consumes up to 130 kW, so that's a little more than 5,500,000 GPUs
hooloovoo_zoo•1h ago
These $ figures based on compute credits or the investor's own hardware seem pretty sketchy.
fufxufxutc•1h ago
In accounting terms, this is a shady business practice known as "round tripping" where you invest in a company for the sole purpose of them buying your product. It allows you to count your revenue multiple times.
klysm•1h ago
Is it counting revenue multiple times? It's buying your own products really, but not sure how that counts as double counting revenue
fufxufxutc•1h ago
The "investment" came from their revenue, and will be immediately counted in their revenue again.
weego•1h ago
In this case, if we're being strict, it seems the investment could then also show up as fixed assets on the same balance sheet.
lumost•1h ago
It's real revenue, but you are operating a fractional-reserve revenue operation. If the company you're investing in has trouble, or you have trouble, the whole thing falls over very fast.
rsstack•1h ago
Customer A pays you $100 for goods that cost you $10. You invest $100-$10=$90 in customer B so that they'll pay you $90 for goods that cost you $9. Your reported revenue is now $100+$90=$190, but the only money that entered the system is the original $100.
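
The same toy example in code, to make the round-tripping concern concrete; the dollar figures are just the illustrative numbers from the comment above:

    cash_from_a = 100                               # Customer A pays $100
    cost_of_goods_a = 10
    invested_in_b = cash_from_a - cost_of_goods_a   # $90 "invested" in customer B
    revenue_from_b = invested_in_b                  # B spends it on your goods
    cost_of_goods_b = 9

    reported_revenue = cash_from_a + revenue_from_b  # $190 on paper
    external_cash_in = cash_from_a                   # only $100 ever entered the system
    print(reported_revenue, external_cash_in)        # 190 100
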
FinnKuhn•58m ago
And your valuation also rises as a consequence of your increased revenue.
Aurornis•47m ago
Yes, but you’ve also incurred a $90 expense in purchasing the stock of Company B and that stock is on the balance sheet.

In the actual shady version of this, Company B isn’t the hottest AI investment around, it’s a shell company created by your brother’s cousin that isn’t actually worth what you’re claiming on the balance sheet because it was only created for the round tripping shell game.

creddit•15m ago
Except that this isn't round-tripping at all. Round-tripping doesn't result in a company actually incurring expenses to create more product. Round-tripping is the term for schemes that enable you to double-count assets/revenue without any economic effects taking place.

Every time HackerNews talks about anything in the legal or finance realm, people trip over themselves to make arguments for why something a big tech company is doing is illegal. This is definitively neither illegal nor shady. If Nvidia believes, for example, that OpenAI can use their GPUs to turn a profit, then this is inherently positive-sum economically for both sides: OpenAI gets capital in the form of GPUs, uses them to generate tokens which they sell above the cost of that capital, and then returns some of the excess value to Nvidia. This is done via equity. It's a way for Nvidia to get access to some of the excess value of their product.

selectodude•1h ago
This is some Enron shit. Let's see NVDA mark to market these profits. Keep the spice flowing.
FinnKuhn•1h ago
They did a similar deal with Nscale just last week, for example.

https://www.cnbc.com/2025/09/17/ai-startup-nscale-from-uk-is...

Aurornis•54m ago
This is being done out in the open (we’re reading the press announcement) and will be factored into valuations.

Also, investing in OpenAI means they get equity in return, which is not a worthless asset. There is actual mutually beneficial trade occurring.

landl0rd•52m ago
Nvidia has consistently done this with CoreWeave and Nscale; really, most of its balance sheet investments are like this. On the one hand there's a vaguely cogent rationale that they're a strategic investor and it sort of makes sense as a hardware-for-equity swap; on the other, it's obviously goosing revenue numbers. This is a bigger issue when it's $100B than with previous investments.

It's a good time to gently remind everyone that there are a whole pile of legal things one can do to change how a security looks "by the numbers" and this isn't even close to the shadiest. Heck some sell-side research makes what companies themselves do look benign.

yannyu•28m ago
A relevant joke, paraphrased from the internet:

Two economists are walking in a forest when they come across a pile of shit.

The first economist says to the other “I’ll pay you $100 to eat that pile of shit.” The second economist takes the $100 and eats the pile of shit.

They continue walking until they come across a second pile of shit. The second economist turns to the first and says “I’ll pay you $100 to eat that pile of shit.” The first economist takes the $100 and eats a pile of shit.

Walking a little more, the first economist looks at the second and says, "You know, I gave you $100 to eat shit, then you gave me back the same $100 to eat shit. I can't help but feel like we both just ate shit for nothing."

"That's not true", responded the second economist. "We increased total revenue by $200!"

paxys•15m ago
The punchline is supposed to be GDP, but yeah, same concept.
hoosieree•15m ago
This should go without saying but unfortunately it really doesn't these days:

This kind of corporate behavior is bad and will end up hurting somebody. If we're lucky the fallout will only hurt Nvidia. More likely it will end up hurting most taxpayers.

Mistletoe•51m ago
Isn't our stock market basically propped up by this house of cards of AI credits and the like right now?
rzerowan•50m ago
It's the same loop-de-loop NVIDIA is doing with CoreWeave, as I understand it: 'investing' in CoreWeave, which then 'buys' NVIDIA merch for cloud rental, resulting in CoreWeave being among the top 4 customers of NVIDIA chips.
vessenes•4m ago
Wait, why the quotes? NVDA sends cash, and then Coreweave spends it, no? I don't think the quotes are accurate if they imply these transactions aren't real and material. At the end of the day, NVDA owns Coreweave stock, and actual, you know, physical hardware is put into data centers, and cash is wired.
rsync•41m ago
"In accounting terms, this is a shady business practice known as "round tripping" where you invest in a company for the sole purpose of them buying your product. It allows you to count your revenue multiple times."

... and we've seen this before in previous bubbles ...

mandeepj•33m ago
> this is a shady business practice known as "round tripping" where you invest in a company for the sole purpose of them buying your product.

Microsoft and Google have been doing it for decades. Probably, MS started that practice.

GuB-42•11m ago
I don't really understand how it is round tripping.

In the end, Nvidia will have OpenAI shares, which are valuable, and OpenAI will have GPUs, which are also valuable. It is not fake revenue, the GPUs will be made, sold at market price, and used, they are not intended to be bought back and sold to another customer. And hopefully, these GPUs will be put to good use by OpenAI so that they can make a profit, which will give Nvidia some return on investment.

It doesn't look so different from a car loan, where the dealer lends you the money so that you can buy their car.

bertili•1h ago
What's in it for Nvidia? At the recent $300B valuation, 25% equity?
searine•1h ago
I look forward to subsidizing this effort with my skyrocketing home power bill.
DebtDeflation•1h ago
Wouldn't Nvidia be better served investing the $100B in expanding GPU manufacturing capacity?
ecshafer•1h ago
By investing in TSMC? By buying TSMC? I don't think $100B would buy them enough current generation capacity to make a difference from scratch.
paxys•33m ago
They don't have to pick just one.
vessenes•1m ago
They’re already spending as much money as they possibly can on growth, and have no further use for cash currently - they’ve been doing share buybacks this year.
TheRealGL•1h ago
Did I miss the part where they mention the 10 large nuclear plants needed to power this new operation? Where's all the power coming from for this?
HDThoreaun•1h ago
Build this thing in the middle of the desert and you would need around 100 square miles of solar panels plus a fuckload of batteries for it to be energy independent. The solar farm would be around $10 billion, which is probably far less than the GPUs cost.
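
A very rough sizing sketch; the capacity factor and land-use density below are assumptions, and the answer swings from on the order of 100 square miles to several hundred depending on what you pick:

    avg_load_gw = 10
    capacity_factor = 0.30              # assumed: good desert site with tracking
    nameplate_mw = avg_load_gw * 1_000 / capacity_factor   # ~33,000 MW of panels

    for acres_per_mw in (3, 7):         # assumed land-use range for utility solar
        sq_miles = nameplate_mw * acres_per_mw / 640       # 640 acres per square mile
        print(f"{acres_per_mw} acres/MW -> ~{sq_miles:.0f} square miles")
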
boringg•53m ago
Won't get you the necessary four nines of uptime and energy, sadly. I'm still 100% for this -- but you'd need another model for energy delivery.
xnx•52m ago
Dissipating 10GW of heat is also a challenge in a sunny, hot, dry environment.
newyankee•24m ago
100 sq km should suffice
catigula•1h ago
Consumer electric grids.
delfinom•1h ago
Yep. Consumers are screwed and $500/month electric bills are coming for the average consumer within a year or two. We do not have the electricity available for this.
leptons•43m ago
I'm pretty average, living in a small home, and my electric bill is already >$500/mo in the summer, and that's with the A/C set at 76F during the day.
davis•57m ago
Exactly this. This is essentially a new consumer tax in your electric bill: the buildout of the electrical grid is being put on consumers as a monthly surcharge through higher electricity costs. Everyone in the country is paying for the grid infrastructure to power these data centers, which are owned by trillion-dollar companies that aren't paying for their own needs.
nutjob2•1h ago
Also, the fact that they announce not how much computing power they are going to deploy but rather how much electricity it's going to use (as if power usage were a useful measure of processing power) is kind of gross.

"Good news everybody, your power bills are going up and your creaking, chronically underfunded infrastructure is even closer to collapse!"

hoosieree•9m ago
Also water. You will be rationed, OpenAI will not.

https://www.newstarget.com/2025-08-02-texas-ai-data-centers-...

nitwit005•1m ago
I assumed this headline was not aimed at the public, but at some utility they want to convince to expand capacity. Otherwise, bragging about future power consumption seems a bit perplexing.
lumenwrites•1h ago
Yaay, one step closer to torment nexus.
nh23423fefe•57m ago
low effort comment, whose content is a stale reference to other low effort memes
andreicaayoha•1h ago
pls
zuInnp•1h ago
Yeah, who cares about the environment... who needs water and energy, if your AI agent can give you a better pep talk
moduspol•1h ago
Waiting patiently for the Ed Zitron article on this...
nextworddev•9m ago
He single-handedly cost people more than anyone with his bearish takes lol
gitremote•2m ago
When executives can't measure success by output, they measure success by input, a perverse incentive that rewards inefficiency.

Execs ask their employees to return to office, because they don't know how to measure good employee output.

Now OpenAI and Nvidia measure success by gigawatt input into AI instead of successful business outcomes from AI.

catigula•1h ago
Can we get some laws to force these companies to start subsidizing the consumer grids they're pummeling?

The electric bills are getting out of hand.

2OEH8eoCRo0•1h ago
What will happen if/when the AI bubble pops and there is far more grid capacity than demand? Power plant bailouts?
davis•1h ago
Load growth for the last 15 years has been very small, but load growth going forward is expected to rise due to electrification of everything to decarbonize the economy. This means home heating, electric cars, heavy industry, obviously data centers, and the list goes on. So even if we have more grid capacity than demand (this seems unlikely), it will be used before too long.
vmg12•31m ago
They would build their own power lines / grid if they could.
bananapub•23m ago
... why? the current (heh) situation is that they do these big announcements and then local/state governments around the US get in a bidding war to try to shift costs from the datacenter operator on to their own citizens, in addition to offloading all of the capex.
aanet•57m ago
I'm old enough to remember when vendor financing was both de rigueur and also frowned upon... (1990s: telecom sector, with all big players like Lucent, Nortel, Cisco, indulging in it, ending with the bust of 2001/2002, of course)
alephnerd•54m ago
This absolutely feels like the Telco Bubble 2.0, and I've mentioned this on HN as well a couple times [0]

[0] - https://news.ycombinator.com/item?id=44069086

boringg•52m ago
For sure a great infrastructure buildout -- let's hope the leftovers are better energy infrastructure, so that whatever comes next in 7 years after the flameout has some great stuff to build on (similar to telco bubble 1.0), and is less damaging to planet Earth in the long arc.
alephnerd•45m ago
Yep. The Telco Bust 1.0 along with the Dotcom Bust is what enabled the cloud computing boom, the SaaS boom, and the e-commerce boom by the early-mid 2010s.

I think the eventual AI bust will lead to the same thing, as the costs for developing a domain-specific model have cratered over the past couple years.

AI/ML (and the infra around it) is overvalued at its current multiples, but the value created is real, and as the market grows to understand the limitations but also the opportunities, a more realistic and permanent boom will occur.

aanet•25m ago
Yeah - no doubt on the eventual productivity gains due to AI/ML (which are real, of course, just like the real gains due to telecom infra buildup), but must an economy go through a bubble first to realize these productivity gains??

It appears that the answer is "more likely yes than not".

Counting some examples:

- self driving / autonomous vehicles (seeing real deployments now with Waymo, but 99% deployment still ahead; meanwhile, $$$ billions of value destroyed in the last 10-15 years with so many startups running out of money, getting acquihired, etc)

- Humanoid robots... (potential bubble?? I don't know of a single commercial deployment today that is worth any solid revenues, but companies keep getting funded left / right)

lawlessone•53m ago
Nvidia if you're listening give me 10K and i'll bu...*invest 10K+ 10 euro worth of cash in your product.
EcommerceFlow•39m ago
If Solar can't compete with natural gas economically, and subsidizing solar ends up de-incentivizing natural gas production by artificially lowering energy prices, what's the solution here?
henearkr•38m ago
Your question is weird.

Solar does compete economically with methane already, and it's only going to improve even more.

EcommerceFlow•29m ago
If true, why aren't we mass scaling it all over the American West? We have railways running from West -> East, why not include power lines that can take power from energy farms in the West -> East? No major project in AZ, TX, or CA to give a city free power? etc
henearkr•14m ago
It is massively scaling everywhere, and notably in Texas btw.
zitterbewegung•39m ago
To put this into perspective, this datacenter would have the land area of Monaco (740 acres), assuming 80 kW per rack.
dguest•24m ago
Am I the only person who had to look up how big Monaco was? (answer, 2 km^2 [1])

[1]: https://en.wikipedia.org/wiki/Monaco

vessenes•7m ago
So, basically a single BYD factory
JCM9•38m ago
This is throwing more cards on the house of cards. Nvidia is "investing" in OpenAI so OpenAI can buy GPUs from Nvidia. Textbook "round tripping."

I generally like what’s been happening with AI but man this is gonna crash hard when reality sets in. We’re reaching the scary stage of a bubble where folks are forced to throw more and more cash on the fire to keep it going with no clear path to ever get that cash back. If anyone slows down, even just a bit, the whole thing goes critical and implodes.

anothermathbozo•31m ago
What is this a bubble on? What does said bubble collapsing look like?
Drunkfoowl•29m ago
High end server gpus and AI roi expectations.
reactordev•26m ago
I think everyone is underestimating the advancements in wafer tech and server compute over the last decade. Easy to miss when it’s out of sight out of mind but this isn’t going anywhere but up.

The current SOTA is going to pale in comparison to what we have 10 years from now.

zer00eyz•7m ago
> I think everyone is underestimating the advancements in wafer tech and server compute over the last decade.

What advancements?

We have done a fabulous job at lowering power consumption while exponentially increasing density of cores and to a lesser extent transistors.

Delivering power to data centers was becoming a problem 20-ish years ago. Today, power density and heat generation are off the charts. Most data center owners are lowering per-rack system density to deal with the "problem".

There are literal projects pushing not only water cooling but refrigerant in the rack systems, in an attempt to get cooling to keep up with everything else.

The dot-com boom and then Web 2.0 were fueled by Moore's law, by clock doubling, and then by the initial wave of core density. We have run out of all of those tricks. The new process steps we're putting out have increased core densities but not lowered costs (because yields have been abysmal). Look at Nvidia's latest cores: they simply are not that much better in terms of real performance compared to previous generations. If the 60 series shows the same slack gains, then hardware isn't going to come along to bail out AI --- which continues to demand MORE compute cycles (tokens on thinking, anyone?) rather than fewer with each generation.

shawabawa3•28m ago
AI and tech companies

Collapse might look a little like the dot com bubble (stock crashes, bankruptcies, layoffs, etc)

wongarsu•8m ago
And it's worth reiterating that a bubble does not mean the technology is worthless. The dot com bubble collapsed despite the internet being a revolutionary technology that has shaped every decade since. Similarly LLMs are a great and revolutionary technology, but expectations, perception and valuations have grown much faster than what the technology can justify

These hype cycles aren't even bad per se. There is lots of capital to test out lots of useful ideas. But only a fraction of those will turn out to be both useful and currently viable, and the readjustment will be painful

bitmasher9•24m ago
Nvidia is giving OpenAI money (through investment) to buy Nvidia chips. The bubble is that Nvidia got that money from its crazy high stock price, and the extra investment raises OpenAI's valuation while the increased sales raise Nvidia's valuation. If the valuations see a correction, then spending like this will decrease, further decreasing valuations.

Bubble collapsing looks like enshittification of OpenAI tools as they try to raise revenues. It’ll ripple all throughout tech as everyone is tied into LLMs, and capital will be harder to come by.

drexlspivey•13m ago
> The bubble is that Nvidia got that money from its crazy high stock price,

This is totally false; NVDA has not done any stock offerings. The money is coming from the ungodly amount of GPUs they are selling. In fact they are doing the opposite: they are buying back their stock because they have more money than they know what to do with.

vessenes•10m ago
NVDA outstanding shares are down ~1.2% year over year; the company has been buying back its own shares with —>> profits <<— to the tune of tens of billions.

Meanwhile NVDA stock is mildly up on this news, so the current owners of NVDA seem to like this investment. Or at least not hate it.

Agreed that we’ll see ad-enabled ChatGPT in about five minutes. What’s not clear is how easily we’ll be able to identify the ads.

mountainriver•8m ago
Valuations won't see a correction for the core players; I have no idea why people think that. Both of these companies are already money factories.

Then consider we are about to lower interest rates and kick off the growth cycle again. The only way these valuations are going is way up for the foreseeable future

babelfish•7m ago
> Bubble collapsing looks like enshittification of OpenAI tools as they try to raise revenues

Why does monetizing OpenAI tools lead to bubble collapse? People are clearly willing to pay for LLMs

jononor•28m ago
I do not think the leveraging is going to end there. I suspect this will be used to justify/secure power generation investments, possibly even nuclear. Likely via one or more of the OpenAI/Altman adjacent power startups.
amluto•25m ago
On the bright side, if lots of power capacity is added and most of the GPUs end up idle, then there might be cheap power available for other uses.
holoduke•22m ago
And computing in general gets cheaper.
lawlessone•18m ago
heating our homes next winter with clusters of h100s
jazzyjackson•20m ago
Power generation is not a monolithic enterprise. If more supply is built than needed, certain suppliers will go bankrupt.
lucianbr•5m ago
What are the chances suppliers will go bankrupt but the plants get sold and still produce power?
ogaj•4m ago
They may, but that doesn’t mean that the capacity disappears. It may require some assumptions about USG willingness to backstop an acquisition but it’s not a significant leap to think that the generation capacity remains in (more capable?) hands.
NewJazz•6m ago
Not if Ellison trickles it out for maximum profit.
bobmcnamara•22m ago
Altman is all in on converting the solar system into a Dyson sphere to power OpenAI.
resters•24m ago
At least the deal is denominated in watts rather than currency which may hyperinflate soon.
gdiamos•24m ago
It forces us to confront a question.

How much investment and prioritization in scaling laws is justified?

Jayakumark•15m ago
reminds me of this image https://www.reddit.com/media?url=https%3A%2F%2Fi.redd.it%2F6...
paxys•14m ago
It would be amusing if it also wasn't so accurate.
vessenes•13m ago
Almost every model trained by the majors has paid for itself with inference fees.

I’m not saying there isn’t a bubble, but I am saying if the researchers and strategists absolutely closest to the “metal” of realtime frontier models are correct that AGI is in reach, then this isn’t a bubble, it’s a highly rational race. One that large players seem to be winning right now.

mossTechnician•11m ago
Which of these model-making companies have posted a profit? I'm not familiar with any.
mountainriver•10m ago
The idea that it’s a bubble on the frontier model side is insane. AI assisted coding alone makes it the most valuable thing we’ve ever created.
jsheard•3m ago
> Almost every model trained by the majors has paid for itself with inference fees.

Even if we assume this is true, the downstream customers paying for that inference also have to be making a net profit from that inference on average in order for the upstream model training to be sustainable. Considering the prevailing business model for AI startups is to sell $10 of tokens for $1 that doesn't seem like a given to me.

SilverElfin•12m ago
I also recall reading that OpenAI is developing its own chips. What happened to that?
pixelready•12m ago
The real question is not whether this is a bubble, since, as you mentioned, even if AI settles into a somewhat useful semi-mainstream tech, there is no way any of the likely outcomes can justify this level of investment.

The real question is what we're gonna do with all this cheap GPU compute when the bubble pops! Will high-def game streaming finally have its time to shine? Will VFX outsource all of its rendering to the cloud? Will it meet the VR/AR hardware improvements in time to finally push the tech mainstream? Will it all just get re-routed back to crypto? Will someone come up with a more useful application of GPU compute?

halJordan•6m ago
AI is already semi-useful mainstream tech. There's a massive misunderstanding on this site (and other neo-Luddite sites) that somehow there is no "long tail" of business applications being transformed into AI applications.
ACCount37•3m ago
"The bubble will pop any minute now, any second, just you wait" is cope.

Even if AI somehow bucks the trend and stops advancing in leaps? It's still on track to be the most impactful technology since smartphones, if not since the Internet itself. And the likes of Nvidia? They're the Cisco of AI infrastructure.

big_toast•12m ago
Is this more of an accounting thing?

Is there some (tax?) efficiency where OpenAI could take money from another source, then pay it to Nvidia and receive GPUs, but where instead taking investment from Nvidia acts as a discount in some way (in addition to Nvidia realistically being the sole efficient supplier of an input OpenAI currently needs)?

jedberg•3m ago
It's interesting how deals like this are politically relevant. Nvidia refused to do deals like this (investing in companies buying large amounts of NVIDIA GPUs) after they got the hammer from Biden's SEC for self dealing due to their investment in Coreweave.

But now that there is a new SEC, they are doing a bunch of these deals. There is this one, which is huge. They also invested in Lambda, who is deploying Gigawatt scale datacenters of NVIDIA GPUs. And they are doing smaller deals too.

throwaway667555•1m ago
It's not round tripping. Economically, Nvidia is investing property in OpenAI. It's not investing nothing, far from it.
mrcwinn•30m ago
Very foolish of them not to leverage SoftwareFPU. And with minimal effort Performas are rackable.
paxys•26m ago
> letter of intent for a landmark strategic partnership

> intends to invest up to xxx progressively

> preferred strategic compute and networking partner

> work together to co-optimize their roadmaps

> look forward to finalizing the details of this new phase of strategic partnership

I don't think I have seen so much meaningless corporate speak and so many outs in a public statement. "Yeah we'll maybe eventually do something cool".

labrador•25m ago
For scale: the 1960s-era US Navy submarine I served on had a 78 MW reactor, so 10 GW is 128 nuclear submarines
dguest•18m ago
A typical reactor core is 1 GW, so it's also one rather big nuclear power plant.
Muromec•16m ago
More like two (and a half )
Muromec•16m ago
Or just ten very safe RBMK reactors rated 1 GW each (they can't explode).
fragmede•11m ago
You almost got me. RBMKs had this problem with large positive void coefficients that was buried by the Soviet Union, which led to Chernobyl.
gehsty•14m ago
Some more context: nuclear power stations can be up to 2 GW, offshore wind farms are seemingly hitting a plateau at ~1.5 GW, and individual turbines in operation now are 15 MW. Grids are already strained; 525 kV DC systems can transmit ~2 GW of power per cable bundle…

Adding 10 GW of offtake to any grid is going to cause significant problems and will likely require CAPEX-intensive upgrades (try to buy 525 kV DC cable from an established player and you are waiting until 2030+), as well as new generation for the power!

vessenes•9m ago
Yeah the path forward here is going to be Apple-like vertical supply chain integration. There is absolutely no spare capacity in the infra side of electrical right now, at least in the US.
pera•21m ago
10 gigawatts sounds ridiculously high, how can you estimate the actual usage? I guess they are not running at capacity 24/7 right? Because that would be more than the consumption of several European countries, like Finland and Belgium:

https://en.m.wikipedia.org/wiki/List_of_countries_by_electri...

rawgabbit•18m ago
Does this affect OpenAI’s renegotiation of their deal with Microsoft?
lvl155•14m ago
We are definitely closer to the top in this market. Do people even realize what they’re predicting in terms of energy use? It’s going to be a wasteland territory sooner than people think.
cainxinth•5m ago
Next Year: OpenAI announces it is seeking funding for a Dyson Sphere
pmdr•3m ago
If I had shovels to sell, I'd definitely announce a strategic partnership to have a huge quarry dug by hand.

Seriously, is there anyone in the media keeping unbiased tabs on how much we're spending on summarizing emails and making creatives starve a little more?