
Biomni: A General-Purpose Biomedical AI Agent

https://github.com/snap-stanford/Biomni
50•GavCo•1h ago•11 comments

Tree Borrows

https://plf.inf.ethz.ch/research/pldi25-tree-borrows.html
338•zdw•6h ago•48 comments

Show HN: FlopperZiro – A DIY open-source Flipper Zero clone

https://github.com/lraton/FlopperZiro
109•iraton•3h ago•26 comments

Jank Programming Language

https://jank-lang.org/
121•akkad33•3d ago•18 comments

A fast 3D collision detection algorithm

https://cairno.substack.com/p/improvements-to-the-separating-axis
138•OlympicMarmoto•6h ago•21 comments

Configuring Split Horizon DNS with Pi-Hole and Tailscale

https://www.bentasker.co.uk/posts/blog/general/configuring-pihole-to-serve-different-records-to-different-clients.html
36•gm678•3h ago•7 comments

Desktop Publishing Tools That Didn't Make It (2022)

https://tedium.co/2022/10/12/forgotten-desktop-publishing-tools-history/
30•rbanffy•3h ago•16 comments

Evolution Mail Users Easily Trackable

https://www.grepular.com/Evolution_Mail_Users_Easily_Trackable
76•mike-cardwell•3h ago•43 comments

Nuclear Waste Reprocessing Gains Momentum in the U.S.

https://spectrum.ieee.org/nuclear-waste-reprocessing-transmutation
54•rbanffy•5h ago•23 comments

Archaeologists unveil 3,500-year-old city in Peru

https://www.bbc.co.uk/news/articles/c07dmx38kyeo
84•neversaydie•2d ago•22 comments

Ruby 3.4 frozen string literals: What Rails developers need to know

https://www.prateekcodes.dev/ruby-34-frozen-string-literals-rails-upgrade-guide/
182•thomas_witt•3d ago•89 comments

Why LLMs Can't Write Q/Kdb+: Writing Code Right-to-Left

https://medium.com/@gabiteodoru/why-llms-cant-write-q-kdb-writing-code-right-to-left-ea6df68af443
163•gabiteodoru•1d ago•106 comments

US Court nullifies FTC requirement for click-to-cancel

https://arstechnica.com/tech-policy/2025/07/us-court-cancels-ftc-rule-that-would-have-made-canceling-subscriptions-easier/
509•gausswho•22h ago•478 comments

Google fails to dismiss wiretapping claims on SJ, settles with app users

17•1vuio0pswjnm7•1h ago•1 comment

I Ported SAP to a 1976 CPU. It Wasn't That Slow

https://github.com/oisee/zvdb-z80/blob/master/ZVDB-Z80-ABAP.md
112•weinzierl•2d ago•48 comments

Most RESTful APIs aren't really RESTful

https://florian-kraemer.net//software-architecture/2025/07/07/Most-RESTful-APIs-are-not-really-RESTful.html
255•BerislavLopac•14h ago•403 comments

Let Kids Be Loud

https://www.afterbabel.com/p/let-kids-be-loud
73•trevin•1h ago•73 comments

Bootstrapping a side project into a profitable seven-figure business

https://projectionlab.com/blog/we-reached-1m-arr-with-zero-funding
757•jonkuipers•1d ago•200 comments

The most otherworldly, mysterious forms of lightning on Earth

https://www.nationalgeographic.com/science/article/lightning-sprites-transient-luminous-events-thunderstorms
23•Anon84•3d ago•6 comments

Phrase origin: Why do we "call" functions?

https://quuxplusone.github.io/blog/2025/04/04/etymology-of-call/
216•todsacerdoti•17h ago•154 comments

7-Zip for Windows can now use more than 64 CPU threads for compression

https://www.7-zip.org/history.txt
234•doener•2d ago•153 comments

TaIrTe₄ photodetectors show promise for sensitive room-temperature THz sensing

https://phys.org/news/2025-07-tairte-photodetectors-highly-sensitive-room.html
10•wglb•3d ago•6 comments

Helm local code execution via a malicious chart

https://github.com/helm/helm/security/advisories/GHSA-557j-xg8c-q2mm
152•irke882•15h ago•73 comments

The Architecture Behind Lovable and Bolt

https://www.beam.cloud/blog/agentic-apps
46•Mernit•4h ago•19 comments

RapidRAW: A non-destructive and GPU-accelerated RAW image editor

https://github.com/CyberTimon/RapidRAW
240•l8rlump•18h ago•102 comments

An Emoji Reverse Polish Notation Calculator Written in COBOL

https://github.com/ghuntley/cobol-emoji-rpn-calculator
28•ghuntley•3d ago•5 comments

QRS: Epsilon Wrangling

https://www.tbray.org/ongoing/When/202x/2025/07/07/Epsilon-Wrangling
4•zdw•22m ago•0 comments

Is the doc bot docs, or not?

https://www.robinsloan.com/lab/what-are-we-even-doing-here/
182•tobr•13h ago•106 comments

X Chief Says She Is Leaving the Social Media Platform

https://www.nytimes.com/2025/07/09/technology/linda-yaccarino-x-steps-down.html
292•donohoe•6h ago•421 comments

ESIM Security

https://security-explorations.com/esim-security.html
115•todsacerdoti•11h ago•44 comments

Nvidia Becomes First Company to Reach $4T Market Cap

https://www.cnbc.com/2025/07/09/nvidia-4-trillion.html
97•mfiguiere•7h ago

Comments

chubot•5h ago
One meta-lesson is probably that sustained effort by the same people or same culture matters

i.e. maybe you need “two hits” to become this big, separated by ~2 decades

For nvidia, it was graphics and then programmable GPUs (CUDA)

For Apple, it was GUI desktops and music players/phones

Google is up there, but I’d argue it’s closer to “one hit”, and limited by the founders stepping back and turning the company into an investment conglomerate, rather than being mission-based

When the founders leave, efficiency and creativity seem to be slowed by competing factions of upper management, often working at cross purposes

I’d say that in the best cases, institutional knowledge can build over 2 decades, but it’s also very possible to lose it

floxy•5h ago
Amazon ($2.3T)? Microsoft ($3.7T)? Meta ($1.8T)?
xrendan•4h ago
Amazon - Ecommerce (1994), AWS (2006)

Microsoft - Programming Language (1975), Operating Systems (1981), Office Suite (1983)

Meta - Facebook (2004), Instagram (2010)

I would argue Microsoft is unique because of how badly IBM screwed up.

0xcafefood•4h ago
Microsoft also has Azure and its gaming division.
jahsome•4h ago
Books and AWS

EEE and owning the soul of every customer

Facebook and cool zuck

nativeit•4h ago
Crazy what a near total lack of anticompetitive regulatory enforcement can do for a monopoly’s value, amirite?
bayarearefugee•5h ago
Another lesson is to try to be incredibly lucky.

I'm not suggesting this is all luck, Nvidia has executed very well and their early investments in programmable GPUs really paid off as a result, but a lot of their insane valuation now is due to crypto and then LLMs which are basically two back to back once in lifetime goldrushes where Nvidia happened to be the best positioned shovel seller.

You can run a company well to prepare to ride such a wave should it appear, but you've also got to be born with horseshoes up your ass for this to work out as well as it has for Nvidia

hn_throwaway_99•4h ago
The other thing I find interesting is that Jensen Huang has said that he wouldn't do it all over again. That is, knowing how hard it is now to build a startup, he wouldn't do it again.

I find this pretty crazy given that (at least for now) NVidia is the most successful startup of all time. Imagine the millions of other entrepreneurs, many of whom worked just as hard, who completely failed in the process.

SeanAnderson•3h ago
One of my favorite quotes is, "Luck is when preparation meets opportunity" :)
Imustaskforhelp•3h ago
Yes, but opportunities are random.

I recently read https://news.ycombinator.com/item?id=44495428 (it's a genuinely good article), and the first comment there, which I wanted to paste here, is:

It's so, so, so hard to walk the line between persistence (which leads to glory) and stubbornness (which leads to more time following already wasted time).

-Bruce_511 (HN)

So sure, preparation meets opportunity, but don't be stubborn just because you feel like you're prepared.

Work hard, but honestly, if you walk the line like Bruce said, persistence will lead you to do the hard work anyway; that's my philosophy right now.

So everybody just needs to figure out this line between stubbornness and persistence, especially in the startup world. I think we need a book written on this topic; any suggestions, anybody?

FirmwareBurner•2h ago
>Yes but opportunities are random.

Not random, but always in the places where the big, easy-to-get money is. You're not gonna get much VC funding in Botswana or Bangladesh, no matter how good your idea is or how smart you are.

That's why so many sacrifice everything to go live in the Bay Area, even if it's an overpriced shithole full of homeless people, drug addicts, and feces.

Mars008•2h ago
AMD, Intel, IBM... were just as lucky, and what?
Nifty3929•1h ago
Satoshi =? Jensen
dachworker•5h ago
NVIDIA put a lot of effort into making their hardware and accompanying software useful and usable. CUDA by itself might never have gotten any attention if not for the effort NVIDIA puts into helping their customers use their technology effectively. And they are by no means perfect at it. Most of their products are a horrible mess, and they often have 7 different ways to do the same thing. A lot of their libraries are closed source and you're forced to use an API that links to a black box. There's lots to complain about.
LarsDu88•4h ago
I think Nvidia's market development efforts have meant more than its culture. Ever since Nvidia started moving towards general purpose compute with the 8800 GPU back in the mid 2000s, it's been actively growing markets - first in research (leading to AI advances), and now in autonomous driving, world simulation, biotechnology, and robotics.

The market for compute is endless, and Nvidia makes huge efforts to commoditize the software side of things so people can buy hardware.

basch•1h ago
Isn't the modern LLM frenzy Google's second?
kraemate•5h ago
Wow, who knew making proprietary accelerators could be so profitable.
Yeul•4h ago
Every time Jensen speaks the AI word on a stage the company value goes up by a billion.
anonymars•4h ago
It stuns me that a trillion (much less 4) is so incomprehensibly large that this is entirely mathematically plausible
cratermoon•4h ago
Related? https://www.cnbc.com/2025/06/29/nvidia-insiders-1-billion-st...
LarsDu88•4h ago
To think the entire company could have failed multiple times in the 90s if not for Sega's CEO bailing Nvidia out after Sega fucked up the Dreamcast contract.

And even if Nvidia had won that contract, the Dreamcast ultimately failed. Nvidia was close to destruction multiple times in its early years.

lvl155•4h ago
I thought Apple would drive it into the ground after their fallout, considering how Jobs handled things. Nvidia rode that crypto wave, and AI was there at the very end of that rainbow. Having said that, they kept at CUDA all these years, even before the pandemic. They earned it 100%!
nativeit•4h ago
Time for some grossly oversimplified back-of-the-proverbial-envelope value crunching! I’ll assume the average GPU price, for the sake of argument, is $1000. Let’s also assume their per-unit profit margin is roughly 30% (I found conflicting numbers for this on a casual search, esp. between figures that measure quarterly and annual income, I suppose it isn’t a surprise that their accountants frequently pull rabbits from hats).

Nvidia would need to move on the order of 4,000,000,000 units to hit $4T in revenue, and more than triple that to realize $4T in profits. Even if the average per-unit costs are 2-3x my estimated $1k, as near as I've been able to tell they "only" move a few million units each year for a given SKU.

I am struggling to work out how these markets get so inflated, such that it pins a company’s worth to some astronomical figure (some 50x total equity, in this case) that seems wholly untethered to any material potential?

My intuition is that, absent the rapid, generationally transformative advances in tech and industry largely seen in the latter half of the 20th century (quickly followed by smartphones and social networking), stock market investors seem content to force similar patterns onto any marginally plausible narrative that can provide the same aesthetics of growth, even if the most basic arithmetic thoroughly perforates it.

That said, I nearly went bankrupt buying a used car recently, so this is a whole lot of unqualified conjecture on my part (but not for nothing, my admittedly limited personal wealth isn’t heavily dependent on such bets).
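The back-of-the-envelope above is easy to sanity-check in a few lines; note that the $1,000 average price and 30% margin are the commenter's assumptions, not Nvidia's actual figures:

```python
# Units Nvidia would need to ship to hit a $4T figure, using the
# commenter's assumed average price and per-unit margin.
MARKET_CAP = 4e12   # $4 trillion
AVG_PRICE = 1_000   # assumed average GPU price, in dollars
MARGIN = 0.30       # assumed per-unit profit margin

units_for_revenue = MARKET_CAP / AVG_PRICE
units_for_profit = MARKET_CAP / (AVG_PRICE * MARGIN)

print(f"{units_for_revenue:.1e} units for $4T of revenue")  # 4.0e+09
print(f"{units_for_profit:.1e} units for $4T of profit")    # 1.3e+10
```

This matches the comment's ~4 billion units for revenue, and "more than triple that" for profits.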

wredcoll•4h ago
It seems fairly obvious, to me, that the issue is that most people make money from the stock price changing rather than from any kind of intrinisic value of the underlying company.

In other words, why should it matter to me what the company's profit margin or asset base or what not is actually worth when I make money if the stock number goes up?

sokoloff•3h ago
In the short run, markets are a voting machine; in the long run, they’re a weighing machine. — Ben Graham

If you own a slice of nVidia’s shares at a current P/E of 37, after a year, they’ve earned 2.7% of the value and you still have the same stake as you did before. That’s pricing in further growth and upside in earnings from here (otherwise, you could buy US treasuries at a better price), but doesn’t seem outrageous to me.

Disclaimer: I don’t directly own any $NVDA; I do own mutual funds that own some.

Ologn•4h ago
Nvidia's trailing P/E ratio is 53 (stock hitting a new high today). Its forward P/E ratio is 38.

A year ago both its trailing and forward P/E were higher. So the stock is relatively a bargain compared to what it was a year ago.

The price implies that revenues and profits are expected to continue to grow.

> My intuition is that the absence of the rapid, generationally transformative, advances in tech and industry that were largely seen in the latter half of the 20th-century (quickly followed with smartphones and social networking), stock market investors seem content to force similar patterns onto any marginally plausible narrative that can provide the same aesthetics of growth

I wouldn't disagree with this.

nativeit•3h ago
Thanks for the layman’s explanation for the logic involved, that was precisely what I was confused about.
scottiebarnes•4h ago
NVDA's current forward P/E ratio (price to earnings) is about 37.

That means that if earnings hold constant and you bought the whole company at its current valuation ($4T), it would take you 37 years to break even.

Is this reasonable? Depends on sector and growth potential. To me, this is a "fair" valuation and not overly inflated based solely on existing earnings.
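To make the break-even reading concrete: a forward P/E of 37 on a $4T valuation implicitly prices in annual profits of roughly $108B (both inputs are the commenter's figures, and the calculation assumes earnings stay flat):

```python
MARKET_CAP = 4e12   # current valuation
FORWARD_PE = 37     # forward price-to-earnings ratio

# Annual profit implied by the valuation and the P/E multiple.
implied_annual_profit = MARKET_CAP / FORWARD_PE
print(f"${implied_annual_profit / 1e9:.0f}B per year")    # $108B per year

# Years of flat earnings needed to recoup the purchase price.
print(f"{MARKET_CAP / implied_annual_profit:.0f} years")  # 37 years
```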

nativeit•3h ago
I can understand that, at least in theory. I feel like this is one of the only contexts where markets accommodate long-term thinking, which frustrates my own sensibilities. Thanks for the add’l perspective.
arcanemachiner•3h ago
Compared to Palantir's P/E ratio of ~750, that seems very reasonable.
ElevenLathe•3h ago
Seems pretty unlikely to me that they can sustain their current earnings until 2062, but I'm no Wall St analyst.
scottiebarnes•3h ago
Yes, that is a limit of the model (the P/E ratio) we're using. It requires holding all variables constant, which is not realistic.

We use it as a snapshot in time to check our sanity and to allow us to compare apples to oranges.

That said, you could have made the same statement about AAPL or MSFT 20-30 years ago, and you would have been dead wrong.

ElevenLathe•3h ago
Fair point, but without an engraved prophecy from a licensed and bonded deity, I probably wouldn't have bought AAPL or MSFT in 1988 either, certainly not with the intent of holding it until 2025. I would have been wrong in some sense, but one has to take on the risks one is comfortable with. I'd rather hold a broad index and focus on other things!
scottiebarnes•1h ago
Fair enough.

As someone who bought NVDA in 2016/2017 and held till now, I'm very happy with how I applied my software knowledge; I profited to the point where I won't have to work again.

Risk taking is best done in domains where you have an edge!

ElevenLathe•1h ago
I'm curious: did you hedge your 2017 NVDA bet, or was it literally just on the performance of that one stock?
scottiebarnes•1h ago
I bought shares, a sum that I could afford to lose but also a non-trivial amount.

As it went up, I occasionally sold some and took profit, but not much, till now.

I saw how important and productive machine learning was, and it seemed like NVDA had an excellent CUDA moat, so that was my thesis for that bet. In the last couple years I sold some and bought a house.

This is obviously a story with survivorship bias. But anyway, my superpower isn't technical knowledge, it's the lack of emotion and a bias towards non-action when markets tumble. As you probably know, time in the market beats timing the market.

rdsubhas•27m ago
Or 23 years if the currency depreciates 2% each year.

Or 16 years at 5% inflation.
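The exact year counts depend on how the currency depreciation is modeled. A minimal sketch of one simple model, where nominal earnings grow at the inflation rate while the purchase price stays fixed, gives somewhat different figures from the ones above, which mostly shows how sensitive the payback estimate is to the assumptions:

```python
def payback_years(pe: float, growth: float) -> int:
    """Years until cumulative nominal earnings cover the purchase price,
    assuming earnings grow at `growth` per year (e.g. with inflation)."""
    earnings, cumulative, years = 1.0, 0.0, 0
    while cumulative < pe:
        cumulative += earnings
        earnings *= 1 + growth
        years += 1
    return years

print(payback_years(37, 0.00))  # 37 (flat earnings: just the P/E)
print(payback_years(37, 0.02))  # 28
print(payback_years(37, 0.05))  # 22
```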

Spinnaker_•4h ago
You are way off. A single B200 costs $70k. They sell them in racks for over $3mm each. And they have 55% net profit margin.
ergsef•4h ago
55% net profit doesn't include NRE right? The thing about selling fewer, bigger-ticket items is that the non-recurring engineering costs are amortized over fewer sales. Not to say they aren't printing money, but the unit cost to produce the second GPU pales in comparison to the effort to produce the first one.
nativeit•4h ago
How many racks are they selling? Is that 50% of their revenue? 10%? How sustainable will that be? I understand AI will probably continue to grow, but can they continue cornering such a market with 55% margins?

Fortunately, I opted to pivot towards the ratio of total equity; the per-unit exercise was a very rough attempt at moving away from abstractions, and that is obviously one of its many flaws.

I already noted the profit margins are incredibly unstable, so I don't trust the reported figures where they quadrupled inside of a decade. I'm not suggesting it isn't real, only that it isn't possible to pin that 55% down as sustainable for any significant period of time, certainly not the 30-50 years it would take to realize $4T of value at their current pace.

swalsh•3h ago
The answer to "how many racks are they selling" is currently as much as they can manufacture, extended out a year.
Spinnaker_•3h ago
It's close to 90% of their revenue. They will sell about $115B this year, $180B in 2026 and $230B in 2027, with margins staying fairly constant. Their only real competitor is Broadcom, who has slightly worse margin on AI chips.
nativeit•3h ago
So my broader argument is that their current performance isn’t a reliable indicator for the future, as so much of their current position is circumstantial. Any investments that rely on their sustaining this sort of performance are inherently flawed as a result.
nativeit•3h ago
For the record, most recent graphics revenues were $14.3B, with $116B in compute/networking. [1]

So quite lopsided. That curtails my expectations even further, as this represents a lot of initial infrastructure investments whose long-term expenditures (and viability) remain to be seen.

1. https://www.marketscreener.com/quote/stock/NVIDIA-CORPORATIO...

_zoltan_•2h ago
B200, that old thing? :)

B300s started being installed a week or so ago. Good luck to AMD catching up...

moralestapia•4h ago
>I’ll assume the average GPU price, for the sake of argument, is $1000.

They make big bucks on the premium end of their chips. Those contracts are typically in the 8-figure range; I would think they easily have thousands of them around the world.

Even Jensen has implied[1] that the consumer GPU market (i.e. gaming) holds a minor share of revenue these days.

1: Citation needed, I know. I mean comments like "we are not going to abandon gamers, etc...".

Spinnaker_•3h ago
The contracts are now 10 figures for multiple hyperscalers. And they aren't abandoning gaming, but it's 8% of revenue and falling fast.
petsfed•4h ago
Thing is, NVIDIA ships waaaay more than GPUs. Or, perhaps more accurately, NVIDIA ships chips. Other manufacturers install those chips. Sitting in my office right now, I have 5 computers, and between them I'd estimate I have 15 NVIDIA chips, minimum. Maybe more, I haven't carefully examined my NVIDIA-based, ASUS-manufactured graphics card to see how many name-brand chips it has.

That's to say nothing of all the other products and services they build. I just visited their website, clicked on "solutions" at the top, and there's waaaay more there than just desktop GPUs. And it's worth noting that NVIDIA doesn't manufacture or sell any of the down-market NVIDIA-based boards.

Given NVIDIA's role in data centers, I think the 4T market cap is, while probably still somewhat inflated by speculation, not so inflated as to be a bubble ready to pop.

tsvetkov•3h ago
The market price is supposed to account for future growth, not just for current revenue. Predicting future is speculative by definition, but it's not completely detached from reality to bet that Nvidia has the potential to grow significantly for some time (at some point either the market cap or the multiple will correct of course).

I also see where the reasoning here contradicts reality. If we assume Nvidia only sells $1,000 GPUs and moves a few million a year, then how did it receive $137B in FY2025? In reality they don't just sell GPUs; they sell systems for AI training and inference at insane margins (I've seen 90% estimates) and also some GPUs at decent margins (30-40%). These margins may be enough to stimulate competition at some point, but so far those risks have not materialized.

Stevvo•2h ago
Nvidia's revenue was $44 billion last quarter, and it's been growing at $5 billion a quarter, with a 50% profit margin. It is the most profitable company in the world; just take a look at the financials. Whether that justifies the valuation, your guess is as good as mine.
gautamcgoel•2h ago
Yeah, but their AI/data center GPUs go for closer to $100K, and I've heard that they obtain >50% margins on those. I agree with your overall point that the $4T valuation is not justified by current profits.
TriangleEdge•4h ago
The next question is: will I be alive to see the first quadrillion-dollar company? Or maybe a $100 banana?
thomassmith65•4h ago
Congratulations to Nvidia.

That said, I would be wary about buying shares of any company tied to AI right now.

Very few people scrambling to throw money into 'AI stocks' have any idea about technology. When the music stops it's going to be ugly.

ra7•4h ago
I understand Nvidia is in a very dominant position. But $4T market cap still seems absolutely insane to me. I've only read about the Cisco boom and bust during the Internet era, and this feels eerily similar (people who actually experienced it might feel differently though).

What could actually drag Nvidia down and make them spend decades in the dark like Cisco still does? So far the two things I've come up with are: (a) general disillusionment in AI and companies not being able to monetize enough to justify spending on GPUs. (b) Big companies designing their own chips in-house lowering demand for Nvidia GPUs.

I don't think Nvidia can counter (a), but can they overcome (b) by also offering custom chip design services instead of insisting on selling a proprietary AI stack?

fckgw•4h ago
Also, LLMs simply getting more efficient could bring the GPU buying frenzy to an end.
bigyabai•3h ago
I don't really buy it. If you can get GPT-3 performance out of a 4B parameter model, then people are going to use the GPUs for even higher-quality remote inference.
blitzo•3h ago
If China finally figures out how to make the machines ASML does, that's the sell signal for NVDA.
oersted•3h ago
I do broadly agree, but want to note that the ASML machines are not necessarily the bottleneck, or at least not the final bottleneck, and their valuation reflects that.

There’s a reason why only TSMC and to a lesser extent now Samsung and Intel are the only serious players in top-end semiconductors. You can’t just buy the machine and print chips, the amount of iterative tuning and know-how required to get good yields is immense. Weirdly, the actual bottleneck seems to be the availability of what can almost be described as “master craftsmanship”. But it’s not enough either to hire a couple of masters, it’s the collective institutional knowledge built up over >50 years.

And, of course, TSMC is not worth nearly as much as NVidia either even if they manufacture all their hardware.

vel0city•3h ago
We have a class of shamans who know how to speak the right incantations over the magic crystals during their formation, which causes our machines to think and create value in the world.
FirmwareBurner•2h ago
>You can’t just buy the machine and print chips

Exactly this; otherwise there would be nothing stopping ASML from opening its own competing fabs next door with its cutting-edge machines 12 months before selling them to customers, for maximum profit, the same way Bitcoin ASIC mining companies did with their chips.

Or at least some Dutch/EU company in the area would be doing it, but nobody else can do what TSMC does at the cutting edge. For context, the EU's most cutting-edge fab will be TSMC's in Dresden, at 12nm.

moralestapia•2h ago
Also, the world is not going to "hyperscale" forever.

But also also, it most likely will for the next 10 years.

Ologn•3h ago
Cisco stock (which I thought about buying in 1992 and didn't, unfortunately) doubled in 1990, tripled in 1991, doubled in 1992, and kept going up every year - in 1995 it doubled, in 1998 it doubled, in 1999 it doubled. So it had a long run (and is also still worth over $250 billion).

The monetary push is very LLM based. One thing being pushed that I am familiar with is LLM assisted programming. LLMs are being pushed to do other things as well. If LLMs don't improve more, or if companies don't see the monetary benefits of using them in the short/medium term, that would drag Nvidia down.

Nvidia has a lot of network effects. Probably only Google has some immunity to that (with its TPUs). I doubt Nvidia will have competition in training LLMs for a while. It is possible a competitor could start taking market share on the low end for inference, but even that would take a while. People have been talking about AMD competition for over two years, and I haven't seen anything that even seems like it might have potential yet, especially on the high end.

ra7•3h ago
There's a lot of push for inference hardware now (e.g. Ironwood TPUs). How does Nvidia maintain an edge there?

Also, I think the market has to expand beyond LLMs to areas like robotics and self driving cars (and they need to have real success) for Nvidia to maintain this valuation. I don't think only LLMs are enough because I don't see code assist/image generation/chatbots as a massive market.

Night_Thastus•3h ago
(b) isn't likely because of CUDA. Nvidia spent a LOT of time developing CUDA into the de facto standard for working with GPUs. No one has anything close to comparable. They poured a ton of time and effort into it and pushed it like crazy in the education space as well.

(a) can happen, but Nvidia has some buffer. All these companies promised their shareholders huge massive gains if only they embrace the AI revolution. They will be extremely resistant to admit they can't make any money out of it and will keep the charade going for as long as they can. It won't be like a cliff, so Nvidia has time to adjust and handle it.

The market cap is nonsense though, it's just hype. I never put any real weight on them.

svnt•3h ago
In what way would the successful implementation of chip design services overcome (b)?

They would need to make more offering design services, giving away their design secrets in the process, than selling a product protected by a massive moat. This would be slaughtering the goose.

If you can present an example of someone executing this strategy as a pivot from an extremely high margin proprietary product and increasing their market cap as a result I would be very interested to read about it.

mcv•3h ago
Not so long ago, the most valuable company in the world was $200B. It feels like only a few years ago that the $1T barrier was broken. Where will this stop?

Are these companies already more valuable than the VOC at its height, when it owned entire countries? Is that where we're headed?

lucaspauker•3h ago
Apple was the first company to reach $1T; it happened on August 2nd, 2018.
strbean•3h ago
Is it possible that a higher concentration of wealth means inflation is reflected more strongly in asset markets than in consumer goods?

Billionaires aren't exactly buying more eggs when they have more cash; their cash is competing for ownership of assets.

Nifty3929•1h ago
I agree with you, but I think you have cause/effect reversed. It's not that a high concentration of wealth results in high assets prices. It's that the money has nowhere else to go except into tech/AI, so those are the only assets that appreciate/inflate, and that leads to a concentration of wealth.
pfannkuchen•28m ago
Their cash isn’t actually tied up in the asset though, per se. Whomever they buy the asset from gets the cash. A good chunk of it likely does eventually filter out into consumer goods, since people selling assets are sometimes selling to fund living expenses and not just to shift into another asset.
chung8123•3h ago
I think this is a mixture of inflation and fewer places to put money.
RadiozRadioz•7m ago
Yes, they are: https://www.worldsfirststockexchange.com/2020/11/17/was-the-... (which I agree is strange)
bluecalm•3h ago
One thing that can hurt them is a shift from current neural networks to something that is more sequential in nature and thus better suited for a CPU.

One scenario is that maybe a smaller NN is enough for most tasks but you have to train it in a smart way (search, creating feedback, reasoning).

It's a long shot; maybe GPUs won't be the best hardware for the job. It's pure speculation, though.

yibg•3h ago
(c) a technological breakthrough that makes what Nvidia makes obsolete.
falcor84•1h ago
My favorite sci-fi fever dream is of biological computation. With the rapid advances in computational biology, I think we might have full biological ALUs in my lifetime, which would offer incredible power efficiency.
kjellsbells•20m ago
I've been thinking about this a lot lately too. There are biological systems that exist in vast quantity, like, say, ants or blades of grass (or, simply cells). I think there is a way to make them computing machines (there was a paper from Adleman, of RSA crypto fame, years ago about it, for example). The challenge is getting the problem broken down across billions of individual compute elements, and of getting the data in and out quickly.
Mars008•2h ago
In other words we hate them and want them to fail. Bill Gates, Elon Musk, the same story repeats again.

(c) NVidia isn't all about LLMs; it has robotics, embedded, vision... This is going to be huge as general-purpose robotics hits.

amelius•4h ago
Time to start building their own fab, I would say.
Night_Thastus•3h ago
...no? Leading-edge semiconductor manufacturing is incredibly difficult. Keeping pace with it is nearly impossible. Many companies and entire countries have tried - and failed, over and over and over. You can't just throw billions of dollars at the problem and hope you solve it.

It is far, far smarter to design the chips and leave the manufacturing to others who will have the expertise and take on the risk.

amelius•3h ago
There isn't that much competition, and don't forget about recent silicon fabbing shortages.
Night_Thastus•3h ago
TSMC isn't 'that much' competition? Good luck beating them at it. They're essentially the only game in town for top-end silicon.

AMD tried, realized it was completely pointless, and spun off GlobalFoundries. The giant Intel struggled incredibly hard with the 10nm node and has now lost significant ground because of it.

Leading edge silicon is ruthless. You can burn hundreds of billions of dollars a year trying to catch up and if your product is even slightly worse it may as well be worthless - all the cost with none of the demand.

Going into older nodes is far more forgiving, but you can't make GPUs with that.

lossolo•36m ago
It's only a matter of time: SMIC will get all the funding it needs. They've hired former TSMC staff, and they have Huawei as a research hub, with its 30,000-person research campus. Huawei is also working on its own EUV machines and researching the next generation of chip-making technologies.

China has some of the best technical universities in the world. Many of the names you see in the U.S. AI/ML space are Chinese — and a significant number have returned to China, seeking new opportunities. The world has changed.

By 2035, China could be on par with TSMC, or even surpass it. Once, there were Nokias and Kodaks — and like them, TSMC will eventually face real competition. Nothing is eternal. TSMC doesn't possess secret knowledge that others can't obtain or special brains that others cannot match.

China’s future security and competitive edge depend on becoming self-sufficient in advanced node production. They now have all the necessary ingredients, more than anyone before, to challenge and possibly overtake TSMC. It's only a matter of time.

amelius•18m ago
> They're essentially the only game in town for top-end silicon

If so, then why aren't we paying them 30% of our revenue, like in the App Store?

2OEH8eoCRo0•3h ago
Lunacy for a company that doesn't physically make anything but emails a design to TSMC in geopolitically risky Taiwan.
beng-nl•2h ago
It's a bit short-sighted to say that Nvidia doesn't physically make anything, because they sure as hell physically sell something: one of the most wanted products on earth, at huge margins.
2OEH8eoCRo0•45m ago
Where is the Nvidia factory?
mg•3h ago
Over time, will humans turn all of planet Earth into an active system?

If so, the total addressable market of Nvidia might be pretty big.

Let's take the human body as a point of reference. The brain makes up about 2% of human body mass.

Earth weighs in at about 6 x 10^24 kg. 2% of that is roughly 10^23 kg.

All computer hardware on planet Earth weighs what? Maybe a billion computers times 10 kg? That would be 10^10 kg.

So we would still have to scale that up by a factor of about 10^13.

Still 99.99999999999% of the way to go.
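The back-of-envelope math above, redone in code (all inputs are rough estimates, including the billion-computers-at-10-kg guess):

```python
# Rough estimates only.
earth_mass = 6e24        # kg (actual ~5.97e24)
brain_fraction = 0.02    # brain is ~2% of human body mass

# "Compute mass" if Earth matched the body's brain-to-mass ratio.
target = earth_mass * brain_fraction

computers = 1e9          # guess: a billion computers...
mass_each = 10           # ...at ~10 kg each
current = computers * mass_each

factor = target / current
print(f"target ~{target:.1e} kg, current ~{current:.1e} kg, "
      f"factor of ~{factor:.1e} still to go")
```

With these inputs the gap is about thirteen orders of magnitude, which is the whole joke: the "total addressable market" is almost entirely unaddressed.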

Night_Thastus•3h ago
Smells like Universal Paperclips
timerol•3h ago
$4T isn't cool. You know what's cool? $2^32
lowsong•3h ago
AI definitely isn't a bubble though, don't worry about it.
firefax•3h ago
Maybe I suck at searching, but I'm surprised there isn't a Wikipedia list of companies by market cap.
nemothekid•3h ago
https://en.wikipedia.org/wiki/List_of_public_corporations_by...
orlp•3h ago
Literally pasting "Wikipedia list of companies by market cap" into Google gives me https://en.wikipedia.org/wiki/List_of_public_corporations_by....
EcommerceFlow•3h ago
What percent of the population currently uses Ai? 5%?

This is a technology that will reach 90% usage for almost everything people do, so there is still so much more growth to go.

micromacrofoot•3h ago
I sincerely hope not
babypuncher•3h ago
Everybody already uses AI, it's being shoved down our throats. You literally cannot use Google or Amazon without AI nonsense popping up. People don't seem too happy about it though. I can't say that these features have improved my Amazon or Google experiences. In fact, in the case of the latter, the experience has only gotten worse as they've injected more AI.
vivzkestrel•3h ago
I'll give you the opposite viewpoint: 90% of people are "using" AI because it is being forcibly shoved into Google, Amazon, Microsoft, etc. Most AI features offer little value at the moment, and once the novelty wears off, even a real AI breakthrough will meet aversion among the masses. That will cause a huge crash.
strbean•2h ago
I think eventual 90% (more like 100%) usage is already priced into Nvidia's valuation.
pinkmuffinere•3h ago
My personal pet peeve is comparisons like this that aren't inflation-adjusted. Is this a higher real valuation than any other company in history? From the information provided, it is unclear.

Edit: scrolling through the entries on the Wikipedia page (https://en.wikipedia.org/wiki/List_of_public_corporations_by...), it seems likely that this is the highest valuation in real terms (i.e., adjusted for inflation).
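The adjustment itself is trivial; the work is in the inputs. A sketch with ballpark CPI multipliers and rough, widely-cited nominal peaks (all figures approximate, for illustration only, not exact data):

```python
# Approximate CPI multipliers to 2025 dollars (illustrative, not exact).
cpi_to_2025 = {1999: 1.9, 2007: 1.5, 2022: 1.1, 2025: 1.0}

# (name, year, rough nominal peak market cap in $T) - ballpark figures.
peaks = [("Microsoft", 1999, 0.6),
         ("PetroChina", 2007, 1.0),
         ("Apple", 2022, 3.0),
         ("Nvidia", 2025, 4.0)]

for name, year, cap in peaks:
    real = cap * cpi_to_2025[year]
    print(f"{name} ({year}): ${cap:.1f}T nominal -> ~${real:.1f}T in 2025 dollars")
```

Even with generous inflation multipliers, the earlier peaks stay well under $4T in 2025 dollars, which matches the conclusion from eyeballing the Wikipedia list.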

bentt•1h ago
I'm afraid this primarily means that dollars are worth a lot less than they used to be.