
Circular Financing: Does Nvidia's $110B Bet Echo the Telecom Bubble?

https://tomtunguz.com/nvidia_nortel_vendor_financing_comparison/
107•miltava•1h ago

Comments

alephnerd•1h ago
Glad to see Tom's blog on HN - as usual a great write up. A number of us have been chatting about this for several months now, and the take is fairly sober.

Meta commentary, but I've grown weary of how commentary by actual domain experts in our industry is underrepresented and underdiscussed on HN in favor of emotionally charged takes.

dvt•47m ago
> actual domain experts

Calling a VC a "domain expert" is like calling an alcoholic a "libation engineer." VC blogs are, in the best case, mildly informative, and in the worst, borderline fraudulent (the Sequoia SBF piece being a recent example, but there are hundreds).

The incentives are, even in a true "domain expert" case (think: doctors, engineers, economists), often opaque. But when it comes to VCs, this gets ratcheted up by an order of magnitude.

alephnerd•43m ago
Tom has had a fairly solid track record at Redpoint in Cybersecurity, Enterprise SaaS, and AI/ML. And it's not like we see many posts by engineers, doctors, or economists on HN either - most posts are listicles about the "culture" of technology, a growing number of political articles ever more tenuously related to the tech industry, and a portion of actually interesting technical content.
hodgesrm•17m ago
Martin Casado is a counter-example. His writings on technology, starting with his PhD thesis, are very informative. [0] He's the real thing, as are many others.

[0] http://yuba.stanford.edu/~casado/mcthesis.pdf

redwood•1h ago
TLDR: Lucent was committing various forms of accounting fraud, had an unhealthy cash flow position, and had its primary customers on economically dangerous ground. Nvidia, meanwhile, appears to be above board, has strong cash flow, and has extremely strong, dominant customers (e.g., customers that could reduce spending but can survive a downturn). Therefore there's no clear takeaway: similarities but also differences. Risk and a lot of debt, as well as hyperscalers insulating themselves from some of that risk... but at the same time a lot more cash to burn.
dangus•58m ago
I think we are at the PS3/Xbox 360 phase of AI.

By that I mean, those were the last consoles where performance improvements delivered truly new experiences, where the hardware mattered.

Today, any game you make for a modern system is a game you could have made for the PS3/Xbox 360 or perhaps something slightly more powerful.

Certainly there have been experiences that use new capabilities that you can’t literally put on those consoles, but they aren’t really “more” in the same way that a PS2 offered “more” than the PlayStation.

I think in that sense, there will be some kind of bubble. All the companies that thought that AI would eventually get good enough to suit their use case will eventually be disappointed and quit their investment. The use cases where AI makes sense will stick around.

It’s kind of like how we used to have pipe dreams of certain kinds of gameplay experiences that never materialized. With our new hardware power we thought that maybe we could someday play games with endless universes of rich content. But now that we are there, we see games like Starfield prove that dream to be something of a farce.

ben_w•44m ago
> By that I mean, those were the last consoles where performance improvements delivered truly new experiences, where the hardware mattered.

I hope that's where we are, because that means my experience will still be valuable and vibe coding remains limited to "only" tickets that take a human about half a day, or a day if you're lucky.

Given the cost needed for improvements, it's certainly not implausible…

…but it's also not a sure thing.

I tried "Cursor" for the first time last week, and just like I've been experiencing every few months since InstructGPT was demonstrated, it blew my mind.

My game metaphor is 3D graphics in the 90s: every new release feels amazing*, such a huge improvement over the previous release, but behind the hype and awe there was enough missing for us to keep that cycle going for a dozen rounds.

* we used to call stuff like this "photorealistic": https://www.reddit.com/r/gaming/comments/ktyr1/unreal_yes_th...

jcranmer•21m ago
> By that I mean, those were the last consoles where performance improvements delivered truly new experiences, where the hardware mattered.

The PS3 is the last console to have actual specialized hardware. After the PS3, everything is just regular ol' CPU and regular ol' GPU running in a custom form factor (and a stripped-down OS on top of it); before then, with the exception of the Xbox, everything had customized coprocessors that were different from regular consumer GPUs.

davedx•52m ago
Some great insights, with some less interesting ones in there. I didn't know about the SPVs; that's sketchy, and now I wanna know how much of that is going on. The MIT study that gets pulled out for every critical discussion of AI was an eye roll for me. But very solid analysis of the quants.

How much of a threat custom silicon is to Nvidia remains an open question to me. I kinda think, by now, we can say they're similar but different enough to coexist in the competitive compute landscape?

alephnerd•49m ago
> How much of a threat is custom silicon to Nvidia remains an open question to me

Nvidia has also begun trying to enter the custom silicon sector, but it's still largely dominated by Broadcom, Marvell, and Renesas.

monkeydust•47m ago
Where can you track GPU utilization rates? I'm assuming the data is private, but curious if not.
spaceballbat•46m ago
Looking at the last chapter of the essay, there was a lot of illegal activity by Lucent in the run-up to the collapse. Today, we won't know the list of shady practices until the bubble bursts. I doubt Tom could legally speculate; he'd likely be sued into oblivion if he even hinted at malfeasance by these trillion-dollar companies.
hackthemack•45m ago
I worked at a mom and pop ISP in the 90s. Lucent did seem at the forefront of internet equipment at the time. We used Portmaster 3s to handle dial up connections. We also looked into very early wireless technology from Lucent.

Something I wanted to mention, only somewhat tangential: the Telecommunications Act of 1996 forced telecommunication companies to lease out their infrastructure. It massively reduced the prices an ISP had to pay to get a T1 because, suddenly, there was competition. I think a T1 went from $1,800 a month in 1996 to around $600 a month in 1999. It was a long time ago, so my memory is hazy.

But, wouldn't you know it, the telecom companies sued the FCC and the Telecommunications Act was gutted in 2003.

https://en.wikipedia.org/wiki/Competitive_local_exchange_car...

narmiouh•44m ago
I think the fundamental issue is the uncertainty of achieving AGI with reasoning fundamentals baked in.

Almost 90% of topline investment appears to be geared toward achieving that in the next 2-5 years.

If that doesn’t come to pass soon enough, investors will lose interest.

Interest has been maintained by continuous growth in benchmark results. Perhaps this pattern can continue for another 6-12 months before fatigue sets in; there are no new math olympiads to claim a gold medal on…

What's next is to show real results in true software development, cancer research, robotics.

I am highly doubtful the current model architecture will get there.

cl42•27m ago
Not sure why you're getting downvoted.

If you speak with AI researchers, they all seem reasonable in their expectations.

... but I work with non-technical business people across industries and their expectations are NOT reasonable. They expect ChatGPT to do their entire job for $20/month and hire, plan, budget accordingly.

12 months later, when things don't work out, their response to AI goes to the other end of the spectrum -- anger, avoidance, suspicion of new products, etc.

Enough failures and you have slowing revenue growth. I think if companies see lower revenue growth (not even drops!), investors will get very very nervous and we can see a drop in valuations, share prices, etc.

Cheer2171•7m ago
> their expectations are NOT reasonable. They expect ChatGPT to do their entire job for $20/month and hire, plan, budget accordingly.

This is entirely on the AI companies and their boosters. Sam Altman literally says GPT-5 is "like having a team of PhD-level experts in your pocket." All the commercials sell this fantasy.

Zigurd•24m ago
AGI is not near. At best, the domains where we send people to years of grad school so that they can do unnatural reasoning tasks in unmanageable knowledge bases, like law and medicine, will become solid application areas for LLMs. Coding, most of all, will become significantly more productive. Thing is, once the backlog of shite code gets re-engineered, the computing demand for new code creation will not support bubble levels of demand for hardware.
xadhominemx•11m ago
Hyperscalers are spending less than half of their operating cash flows on AI capex. Full commitment to achieving AGI within a few years would look much different.
xbmcuser•41m ago
With all the major players like Amazon, Microsoft, and Alphabet going for their own custom chips, plus restrictions on selling to China, it will be interesting to see how Nvidia does.

I personally would prefer China to get to parity on node size and get competitive with Nvidia, as that is the only way I see the world not being taken over by the tech oligarchy.

pgspaintbrush•40m ago
Are these companies developing InfiniBand-class interconnects to pair with their custom chips? Without equivalent fabric, they can’t replace NVIDIA GPUs for large-scale training.
whp_wessel•23m ago
A recent Huang podcast went into this, making the point that custom chips won't be competitive with Nvidia's, as Nvidia is now making specialised chips instead of just GPUs.

https://open.spotify.com/episode/2ieRvuJxrpTh2V626siZYQ?si=2...

pragmatic•37m ago
At a "telecom of telecom" we (they) were still lighting up dark fiber 15 years later (2015) when mobile data for cell carriers finally created enough demand. Hard to fathom the amount of overbuild.

The only difference is fiber optic lines remained useful the whole time. Will these cards have the same longevity?

(I have no idea just sharing anecdata)

hyghjiyhu•32m ago
I think the chips themselves won't have longevity, but the R&D that went into them is useful. The question is whether the value of that can be captured.
adventured•7m ago
Depends on which companies we're talking about. Nvidia's annualized operating income is so high right now that it'll be capturing more value (op income) in the next four quarters (~$120 billion) than its R&D expenditures have cost over its 32-year history combined. For Nvidia, the return has long since been achieved.

As the AI spending bubble gives out, Nvidia's profit growth will slow dramatically (single digits) before slamming into a wall (as Cisco's did during the telecom bubble; leading up to the telecom crash, Cisco was producing rather insane quarter-over-quarter growth rates).

pragmatic•29m ago
In 2005, telecom was a cash cow because of long-distance charges, and if your mechanical phone switch was paid off you were printing money (regulation guaranteed revenue).

This didn't last much longer, and many places were trying to diversify into managed services (think Datadog for companies' on-prem network and server equipment, etc.), which they call "unregulated" revenue.

As with anything in business, irrational exuberance can kill you.

Zigurd•28m ago
New fiber isn't significantly more power efficient. The other side of the coin is that backhoes haven't become more efficient since the fiber was buried.
mjcl•26m ago
I think the high-density data centers that are being built to support the hyperscalers are more analogous to the dark fiber overbuild. When you lit that fiber in 2015, you (presumably) were not using a line card bought back in 1998.
mg•36m ago

    Fiber networks were using less
    than 0.002% of available capacity,
    with potential for 60,000x speed
    increases. It was just too early.
I doubt we will see unused GPU capacity. As soon as we can prompt "Think about the codebase overnight. Try different ways to refactor it. Tomorrow, show me your best solution." we will want as much GPU time at the current rate as possible.

If a minute of GPU usage is currently $0.10, a night of GPU usage is 8 * 60 * 0.1 = $48. Which might very well be worth it for an improved codebase. Or a better design of a car. Or a better book cover. Or a better business plan.
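
A minimal sketch of that back-of-envelope math (the $0.10/minute rate and the eight-hour night are assumptions for illustration, not actual pricing):

    # hypothetical overnight GPU cost under the assumptions above
    price_per_minute = 0.10       # assumed $/minute of GPU time
    minutes_per_night = 8 * 60    # assumed 8-hour overnight run
    cost = price_per_minute * minutes_per_night
    print(f"${cost:.2f}")         # -> $48.00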

cantor_S_drug•35m ago
With improvements on the algorithm side and new techniques, even older hardware will become useful.
Zigurd•30m ago
I get what you're saying and the reasoning behind it, but older hardware has never been useful where power consumption is part of determining usefulness.
chatmasta•21m ago
This is the biggest threat to the GPU economy – software breakthroughs that enable inference on commodity CPU hardware or specialized ASIC boards that hyperscalers can fabricate themselves. Google has a stockpile of TPUs that seem fairly effective, although it’s hard to tell for certain because they don’t make it easy to rent them.
xadhominemx•15m ago
More efficient inference = more reasoning tokens. Hyperscaler ASICs are closing the gap at the hardware/system level, yes.
Zigurd•7m ago
I don't think we will need to wait for anything as unpredictable as a breakthrough. Optimizing inference for the most clearly defined tasks, which are also the tasks where value is most readily quantified, like coding, is underway now.
Mo3•22m ago
> I doubt we will see unused GPU capacity

I'd argue we very certainly will. Companies are gobbling up GPUs like there's no tomorrow, assuming demand will remain stable and continue growing indefinitely. Meanwhile LLM fatigue has started to set in, models are getting smaller and smaller and consumer hardware is getting better and better. There's no way this won't end up with a lot of idle GPUs.

xadhominemx•16m ago
Test-time compute has made consumption highly elastic. More compute = better results. The marginal cost of running these GPUs when they would otherwise be idle is relatively low. The capacity will be utilized.
delusional•16m ago
> There's no way this won't end up with a lot of idle GPUs.

Nvidia is betting the farm on reinventing GPU compute every 2 years. The GPUs won't end up idle, because they will end up in landfills.

Do I believe that's likely? No, but it is what I believe Nvidia is aiming for.

jdlshore•11m ago
> As soon as we can prompt…

This is the fundamental error I see people making. LLMs can’t operate independently today, not on substantive problems. A lot of people are assuming that they will some day be able to, but the fact is that, today, they cannot.

The AI bubble has been driven by people seeing the beginning of an S-curve and combining it with their science-fiction fantasies about what AI is capable of. Maybe they’re right, but I’m skeptical, and I think the capabilities we see today are close to as good as LLMs are going to get. And today, it’s not good enough.

skrebbel•5m ago
> improved codebase

I've seen lots of claims about AI coding skill, but that one might be able to improve (and not merely passably extend) a codebase is a new one. I'd want to see it before I believe it.

ivape•36m ago
One of the things about the market before AI was that capital had limited growth opportunities. Tech, which was basically a universe of scaled-out CRUD apps, was where capital kept going back.

AI is a lot more useful than hyper scaled up crud apps. Comparing this to the past is really overfitting imho.

The only argument against accumulating GPUs is that they get old and stop working. Not that it sucks, not that it’s not worth it. As in, the argument against it is actually in the spirit of “I wish we could keep the thing longer”. Does that sound like there’s no demand for this thing?

The AI thesis requires getting on board with what Jensen has been saying:

1) We have a new way to do things

2) The old ways have been utterly outclassed

3) If a device has any semblance of compute power, it will need to be enhanced, updated, or wholesale replaced with an AI variant.

There is no middle ground to this thesis. There is no “and we’ll use AI here and here, but not here, therefore we predictably know what is to come”.

Get used to the unreal. Your web apps could truly one day be generated frame by frame by a video model. Really. The amount of compute we’ll need will be staggering.

stephc_int13•25m ago
Knowing the history of past bubbles is only mildly informative. The dotcom bubble was different from the railroad bubble, etc.

The only thing to keep in mind is that all of this is about business and ROI.

Given the colossal investments, even if the companies' finances are healthy and not fraudulent, the economic returns have to be unprecedented or there will be a crash.

They are all chasing a golden goose.

delusional•18m ago
> only mildly informative

I agree, but would like to maybe build out that theory. When we start talking about the mechanisms of the past, we end up over-constraining the possibility space. There were a ton of different ways the dotcom bubble COULD have played out, and only one way it did. If we view the way it did play out as the only way it possibly could have, we'll almost certainly miss the way the next bubble will play out.

cl42•23m ago
Great points. I am bullish on AI but also wary of accounting practices. Tom says Nvidia's financials are different from Lucent's but that doesn't mean we shouldn't be wary.

The Economist has a great discussion on depreciation assumptions having a huge impact on how the finances of the cloud vendors are perceived[1].
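
As a rough, purely hypothetical illustration of how much the depreciation assumption alone moves reported costs (straight-line depreciation over a made-up $10B of GPU capex):

    # hypothetical: $10B of accelerators, straight-line depreciation
    capex = 10e9
    for life_years in (3, 6):
        annual_expense = capex / life_years
        print(f"{life_years}-year life: ${annual_expense / 1e9:.1f}B/year of depreciation expense")
    # 3-year life: $3.3B/year vs. 6-year life: $1.7B/year -- same hardware, roughly a 2x swing in reported cost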

Revenue recognition and expectations around Oracle could also be what bursts the bubble. CoreWeave or Oracle could be the weak point, even if Nvidia is not.

[1] https://www.economist.com/business/2025/09/18/the-4trn-accou...

Theodores•19m ago
This reminds me of SGI at the peak of the dot-com bubble.

SGI (Silicon Graphics) made the 3D hardware that many companies relied on for their own businesses, in the days before Windows NT and Nvidia came of age.

Alias|Wavefront and Discreet were two companies whose product cycles were tightly tied to the SGI product cycles, with SGI having some ownership, whether wholly owned or spun out (as SGI collapsed). I can't find the reporting from the time, but it seemed to me that the SGI share price was propped up by product launches from the likes of Alias|Wavefront or Discreet. Equally, the 3D software houses seemed to have share prices propped up by SGI product launches.

There was also the small matter of insider trading. If you knew the latest SGI boxes were lemons, then you could place your bets on the 3D software houses accordingly.

Eventually Autodesk, Computer Associates, and others owned all the software, or at least the user bases. Once upon a time these companies were on the stock market and worth billions, but then they became just another bullet point in the Autodesk footer.

My prediction is that a lot of AI is like that, a classic bubble, and, when the show moves on, all of these AI products will get shoehorned into the three companies that will survive, with competition law meaning that it will be three rather than two eventual winners.

Equally, much like what happened with SGI, Nvidia will eventually come a cropper when the valuations built on today's hype and hubris fail to deliver.

digitcatphd•19m ago
The biggest issue with Nvidia is that their revenue is not recurring, but the market is treating their stock (and, correlated with it, all semi stocks) as if it were, when it's really a one-time massive capex investment lasting 1-2 years.

It's as simple as that - which is why it's just not possible for this to continue.

cyanydeez•16m ago
TSLA is the same. The market is basically the new rich person's bank, abstracted by loans and lines of credit.

Obviously it's a bubble, but that's meaningless for anyone but the richest to manage.

The rest of us are just ants.

xadhominemx•12m ago
NVDA stock does not trade at a huge multiple. Only 25x EPS despite very rapid top-line growth and a dominant position on the eve of possibly the most important technology transition in the history of humankind. The market is (and has been) pricing in a slowdown.
JCM9•17m ago
The smartest finance folks I know say that this “irrational exuberance” works until it doesn’t. Meaning nobody really thinks it’s sustainable, but companies and VCs chasing the AI hype bubble have backed themselves into a corner where the only way to stop the bubble from bursting is to keep inflating the bubble.

The fate of the bubble will be decided by Wall Street, not tech folks in the valley. Wall Street is already positioning itself for the burst, and there are lots of finance types ready to call the party over and trigger the chaos that lets them make bank on the bubble's implosion.

These finance types (family offices, small secret investment funds) eat clueless VCs throwing cash on the fire for lunch… and they’re salivating at what’s ahead. It’s a “Big Short” once in 20-30 years type opportunity.

delusional•12m ago
> have backed themselves into a corner where the only way to stop the bubble from bursting is to keep inflating the bubble.

They are not in any corner. They rightly believe that they won't be allowed to fail. There's zero cost to inflating the bubble. If they tank a loss, it's not their money and they'll go on to somewhere else. If they get lucky (maybe skillful?) they get out of the bubble before anyone else, but get to ride it all the way to the top.

The only way they lose is if they sit by and do nothing. The upside is huge, and the downside is non-existent.

rossdavidh•5m ago
"This time it's different"
