Dev Culture Is Dying the Curious Developer Is Gone

https://dayvster.com/blog/dev-culture-is-dying-the-curious-developer-is-gone/
76•ibobev•32m ago•34 comments

I regret building this $3000 Pi AI cluster

https://www.jeffgeerling.com/blog/2025/i-regret-building-3000-pi-ai-cluster
183•speckx•2h ago•160 comments

Ants Seem to Defy Biology: They Lay Eggs That Hatch into Another Species

https://www.smithsonianmag.com/smart-news/these-ant-queens-seem-to-defy-biology-they-lay-eggs-tha...
73•sampo•4h ago•16 comments

Internet Archive's big battle with music publishers ends in settlement

https://arstechnica.com/tech-policy/2025/09/internet-archives-big-battle-with-music-publishers-en...
103•coloneltcb•3d ago•48 comments

Ruby Central's Attack on RubyGems [pdf]

https://pup-e.com/goodbye-rubygems.pdf
363•jolux•8h ago•99 comments

Want to piss off your IT department? Are the links not malicious looking enough?

https://phishyurl.com/
911•jordigh•17h ago•268 comments

Show the Physics

https://interactivetextbooks.tudelft.nl/showthephysics/Introduction/About.html
46•pillars•2d ago•3 comments

Statistical Physics with R: Ising Model with Monte Carlo

https://github.com/msuzen/isingLenzMC
78•northlondoner•7h ago•48 comments

Help Us Raise $200k to Free JavaScript from Oracle

https://deno.com/blog/javascript-tm-gofundme
390•kaladin-jasnah•14h ago•179 comments

Shipping 100 hardware units in under eight weeks

https://farhanhossain.substack.com/p/how-we-shipped-100-hardware-units
27•M_farhan_h•20h ago•19 comments

Intel Arc Celestial dGPU seems to be first casualty of Nvidia partnership

https://www.notebookcheck.net/Intel-Arc-Celestial-dGPU-seems-to-be-first-casualty-of-Nvidia-partn...
71•LorenDB•2h ago•56 comments

Rules for creating good-looking user interfaces, from a developer

https://weberdominik.com/blog/rules-user-interfaces/
279•domysee•3d ago•149 comments

Trevor Milton's Nikola Case Dropped by SEC Following Trump Pardon

https://eletric-vehicles.com/nikola/trevor-miltons-nikola-case-dropped-by-sec-following-trump-par...
71•xnx•1h ago•30 comments

Dynamo AI (YC W22) Is Hiring a Senior Kubernetes Engineer

https://www.ycombinator.com/companies/dynamo-ai/jobs/fU1oC9q-senior-kubernetes-engineer
1•DynamoFL•4h ago

Leatherman (vagabond)

https://en.wikipedia.org/wiki/Leatherman_(vagabond)
212•redbell•4d ago•99 comments

The Ruliology of Lambdas

https://writings.stephenwolfram.com/2025/09/the-ruliology-of-lambdas/
81•marvinborner•3d ago•24 comments

Linux for Nintendo 64 (1997)

https://web.archive.org/web/19990220141243/http://www.heise.de/ix/artikel/E/1997/04/036/
27•flykespice•3d ago•10 comments

The Sagrada Família takes its final shape

https://www.newyorker.com/magazine/2025/09/22/is-the-sagrada-familia-a-masterpiece-or-kitsch
332•pseudolus•3d ago•176 comments

U.S. already has the critical minerals it needs, according to new analysis

https://www.minesnewsroom.com/news/us-already-has-critical-minerals-it-needs-theyre-being-thrown-...
227•giuliomagnifico•20h ago•298 comments

Apple: SSH and FileVault

https://keith.github.io/xcode-man-pages/apple_ssh_and_filevault.7.html
465•ingve•20h ago•158 comments

David Lynch LA House

https://www.wallpaper.com/design-interiors/david-lynch-house-los-angeles-for-sale
226•ewf•16h ago•99 comments

Gemini in Chrome

https://gemini.google/overview/gemini-in-chrome/
251•angst•14h ago•208 comments

This map is not upside down

https://www.maps.com/this-map-is-not-upside-down/
327•aagha•22h ago•464 comments

The sordid reality of retirement villages: Residents are being milked for profit

https://unherd.com/2025/09/the-sordid-truth-about-retriement-villages/
86•johngabbar•3h ago•81 comments

Court lets NSF keep swinging axe at $1B in research grants

https://www.theregister.com/2025/09/19/court_lets_nsf_keep_swinging/
39•rntn•2h ago•25 comments

Grief gets an expiration date, just like us

https://bessstillman.substack.com/p/oh-fuck-youre-still-sad
427•LaurenSerino•1d ago•199 comments

AI tools are making the world look weird

https://strat7.com/blogs/weird-in-weird-out/
184•gaaz•18h ago•165 comments

Tracking trust with Rust in the kernel

https://lwn.net/Articles/1034603/
141•pykello•4d ago•43 comments

JIT-ing a stack machine (with SLJIT)

https://bullno1.com/blog/jiting-a-stack-machine
28•bullno1•3d ago•5 comments

Slow Liquid

https://www.robinsloan.com/lab/slow-liquid/
61•thomasjb•1h ago•59 comments

Intel Arc Celestial dGPU seems to be first casualty of Nvidia partnership

https://www.notebookcheck.net/Intel-Arc-Celestial-dGPU-seems-to-be-first-casualty-of-Nvidia-partnership-while-Intel-Arc-B770-is-allegedly-still-alive.1118962.0.html
71•LorenDB•2h ago

Comments

TiredOfLife•1h ago
Source is the "Moore's Law Is Dead" YouTuber. A coin toss is more reliable than him.
gregbot•1h ago
Really? I've been following him for years and he has always been 100% accurate. What has he been wrong about?
dralley•1h ago
I agree that he's not that bad, but he's definitely not 100% accurate, in particular with respect to Intel.

Notably this is about the 3rd time in 2 years that he's reported that the Intel dGPU efforts are being killed off.

Even on the latest developments the reporting is contradictory, so someone is wrong and I suspect it's him. https://www.techpowerup.com/341149/intel-arc-gpus-remain-in-...

gregbot•28m ago
So far everything he said in that video has happened, and he did not say that Intel would never release another dGPU, just that it would be a token release, which is exactly what has happened.
nodja•59m ago
They've had videos saying Intel was gonna cancel the dGPU division and focus on datacenter pretty much since the Intel cards came out, amongst many other things they've said. I used to follow them too, but they speak with too much confidence about things they know nothing about.

They're a channel focused on leaks, but most of their leaks are just industry insider gossip masked as factual to farm clicks. Their leaks are useless for any sort of predictions, but may be interesting if you'd like to know what insiders are thinking.

A quick Google search also yielded this[1] two-year-old Reddit thread that shows videos they deleted because their predictions were incorrect. There are probably many more. (That subreddit seems to be dedicated to trashing MLID.)

[1] https://www.reddit.com/r/BustedSilicon/comments/yo9l2i/colle...

gregbot•26m ago
> gossip masked as factual to farm clicks

Instead of invective, could you just say which specific leak of his was inaccurate? Everything he said about the Intel dGPU has happened exactly as he said it would. Have you watched his video about it yourself?

carlhjerpe•36m ago
Yeah, all the videos I saw where he was right had 100% accuracy, which you'll be reminded of in the next video; the times he was wrong won't be advertised the same way.
gregbot•25m ago
Why don't you just say what he's been wrong about?
carlhjerpe•22m ago
Because I won't invest my time or money into rewatching every video, I don't get paid to be here.
gregbot•14m ago
You replied to a comment that asked:

> What has he been wrong about

…

carlhjerpe•12m ago
And I said he's been right about everything he's been right about, because that's the stuff you'll remember.
chao-•1h ago
It is very hard to put any belief in the rumor mill surrounding Intel's discrete desktop GPUs. Already this year, there have been at least three "leaks" saying "It's canceled!", and every time, a counter-rumor comes about saying "It isn't canceled!"

In all accounts I have seen, their single SKU from this second generation consumer lineup has been well-received. Yet the article says "what can only be categorized as a shaky and often rudderless business", without any justification.

Yes, it is worth pondering what the Nvidia investment means for Intel Arc Graphics, but "rudderless"? Really?

belval•32m ago
Honestly, the rumor mill surrounding Intel is very similar to the one around AMD in 2015-2016, pre-Zen (not saying they will see the same outcome). I swear I saw the same "the x86 license is not transferable, [other company] might sue them" and "Product Y will be discontinued" stories nine years ago.

When it comes to GPUs, a $4T company probably couldn't care less what their $150B partner does in their spare time, as long as they prioritize the partnership. Especially when the GPUs in question are low-end units in a segment where Nvidia has no competition and doesn't even ship that many parts. If they actually asked them to kill it, it would be 100% out of pettiness.

Sometimes I wonder if these articles are written for clicks and these "leakers" are actually just the authors making stuff up and getting it right from time to time.

chao-•2m ago
From a corporate strategy perspective, cancel Arc or keep Arc, I can see it both ways.

Intel has so many other GPU-adjacent products, and they will doubtless continue most of them even if they don't pursue Arc further: Jaguar Shores, Flex GPUs for VDI, and of course their Xe integrated graphics. I could possibly see Intel not shipping a successor to Flex? Maybe? I cannot see a world where they abandon Xe (first-party laptop graphics) or Jaguar Shores ("rack-scale" datacenter "GPUs").

With all of that effort going into GPU-ish designs, is there enough overlap that the output/artifacts from those products support and benefit Arc? Or if Arc only continues to be a mid-tier success, is it thus a waste of fab allocation, a loss of potential profit, and an unnecessary expense in terms of engineers maintaining drivers, and so forth? That is the part I do not know, and why I could see it going either way.

So I can see it both ways, but in no world do I trust these supposed leaks.

cubefox•32m ago
Yeah, that is bizarre. They have been very focused and even managed to get ahead of AMD by several years in the ML-acceleration department (XeSS).
bobajeff•1h ago
Suddenly, what Intel's CEO meant by saying new products must deliver 50% gross profit [1], and that it's too late to catch up on AI [2], is starting to become clearer.

[1]: https://www.tomshardware.com/tech-industry/semiconductors/in...

[2]: https://www.tomshardware.com/tech-industry/intel-ceo-says-it...

baq•1h ago
A CEO of a silicon company saying his business is "too late for AI" is a CEO without either vision or guts: an accountant, the safe option. If it's anywhere close to true, Intel is looking to sell itself for parts.
freedomben•1h ago
Agreed. Either their business situation is far more critical than we know, this is a gross indictment of their R&D, or this is malpractice on the part of the leadership.
checker659•46m ago
> of a silicon company

With their own fabs, at that

moralestapia•45m ago
Well, but if it's true and there's a better strategy, why wouldn't he do it?

Seems like you'd prefer yet another +1 selling AI oil and promises ...

h2zizzle•37m ago
Or, a sly way of calling the AI bubble.
tester756•36m ago
The real quote is:

>"On training, I think it is too late for us,"

Not too late for AI, but too late for training; meanwhile there's an inference opportunity, or something like that.

aDyslecticCrow•28m ago
Too late for the AI boom if they have to spend another two years, plus manufacturing investment, to get a product out for that segment. We are over-inflated on AI hype. Its relevance will remain, but betting a company on it isn't a wise idea.
BeetleB•24m ago
There's a whole backstory to this.

When he joined only a few months ago, he set the vision of making Intel a worthy participant in the AI space.

Then just a few months later, he announced "we cannot compete".

What happened in the middle? Recent articles came out about the conflict between him and Frank Yeary, the head of the Intel board. He wanted to acquire a hot AI startup, and Frank opposed it. Two factions formed on the board, and they lost a lot of time battling it out. While this was going on, a FAANG came in and bought the startup.

I think his announcement that Intel cannot compete was his way of saying "I cannot do it with the current Intel board."

Scramblejams•7m ago
What startup was it?
phkahler•55m ago
Cutting products that don't have 50 percent margins seems like a bad choice when their goal should be filling their advanced fabs and keeping that investment at or near capacity. They said they'd have to cancel one node if the foundry business couldn't get enough customers, and yet they're willing to cut their own product line? Sure, they need to make a profit, but IMHO they should be after volume at this point.
KronisLV•16m ago
Even the Arc B580 GPUs could have been a bigger win if they were ever actually in stock at MSRP; I say that as someone who owns one in my daily-driver PC. Yet it seemed oddly close to a paper launch, or at least nowhere near demand, to the point where prices were so far above MSRP that the value was really bad.

Same as how they messed up the Core Ultra desktop launch of their own volition, by setting prices so high that they can't even compete with their own 13th and 14th gen chips, never mind Ryzen CPUs that are mostly better both in absolute terms and in price/perf. A sidegrade isn't the end of the world, but a badly overpriced sidegrade is dead on arrival.

Idk what Intel is doing.

2OEH8eoCRo0•1h ago
I think we overestimate desktop GPU relevance. Are gaming GPUs really that lucrative?
hhh•1h ago
No. It used to be more even between datacenter and gaming for NVIDIA, but that hasn't been the case for a few years. Gaming has brought in less money than networking (Mellanox) since Q4 '24.

https://morethanmoore.substack.com/p/nvidia-2026-q2-financia...

vlovich123•1h ago
But the same thing that makes GPUs powerful at rendering is what AI needs: modern gaming GPUs are basically supercomputers that provide the HW and SW to do programmable, embarrassingly parallel work. That covers modern game rendering, but also AI and crypto (and various science and engineering workloads), which is the second revolution Intel completely missed (the first one being mobile).
jlarocco•48m ago
I don't think anybody is using gaming GPUs to do serious AI at this point, though.
vlovich123•32m ago
But you can use a gaming card to do AI and you can use an H100 to game. The architectures are quite similar. And I expect upcoming edge AI applications to shake out toward using GPUs rather than dedicated AI accelerator HW, because A) you need something to drive the display anyway, and B) the fixed-function DSPs that have been called "AI accelerators" are worse than useless for running LLMs.
patagurbon•41m ago
AI (apparently) needs much lower precision in training, and certainly in inference, than gaming requires, though. A very large part of the die on modern datacenter GPUs is effectively useless for gaming.
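
A rough back-of-the-envelope sketch of why precision matters so much for inference (the 70B parameter count and the formats below are illustrative assumptions, not figures for any particular model or GPU):

    # Approximate weight-memory footprint of a hypothetical 70B-parameter model
    # at different numeric precisions (weights only; activations/KV cache ignored).
    params = 70e9  # illustrative parameter count, not any specific model

    bytes_per_param = {"FP32": 4, "FP16/BF16": 2, "FP8": 1, "INT4": 0.5}

    for fmt, nbytes in bytes_per_param.items():
        print(f"{fmt}: ~{params * nbytes / 1e9:.0f} GB of weights")

Halving the precision halves the weight footprint (and roughly doubles arithmetic throughput on hardware that supports it), which is why inference leans so heavily on low-precision formats.
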
vlovich123•26m ago
I disagree that HW blocks for lower precision take up that much die space. Data center GPUs are useless for gaming because they're tuned that way. The H100 still has 24 raster operation units (the 4050 has 32) and 456 texture mapping units (the 4090 has 512). That's because there's only so much they can tune the HW architecture to one use case or the other without breaking some fundamental architecture assumptions. And consumer cards still come with tensor units and support for lower precision. This is because the HW costs and unit economics strongly favor a unified architecture that scales to different workloads over discrete implementations specific to a given market segment.

They've also not bothered investing in SW to add the H100 to their consumer drivers so it works well for games. That doesn't mean it's impossible, and none of that takes away from the fact that the H100 and consumer GPUs are quite similar and could theoretically be made to run the same workloads at comparable performance.

pjmlp•1h ago
Depends on whether one cares about a PlayStation/Xbox-like experience or a Switch-like one.
gpderetta•1h ago
They kept Nvidia in business for a long time, until the datacenter breakthrough.
anonym29•55m ago
The value proposition of Intel's graphics division wasn't in the current generation of gaming GPUs; it was growing talent internally that could target higher- and higher-end chips at a much lower price than Nvidia, until they were knocking on the door of A100/H200-class chips: the chips that Nvidia produces for $2k and then sells for $40k.

Not to mention, vertical integration gave Intel flexibility, customization, and some cost-saving advantages that Nvidia doesn't have as much of, Nvidia being a fabless designer that is itself a customer of another for-profit fab (TSMC).

If TFA is true, this was an anticompetitive move by Nvidia to preemptively decapitate its biggest competitor in the datacenter GPU market of the 2030s.

nodja•54m ago
They're a market entry point. CUDA became popular not because it was good, but because it was accessible. If you need to spend $10k minimum on hardware just to test the waters of what you're trying to do, that's a lot to think about, and possibly tons of paperwork if it's not your money. But if you can test it on $300 hardware that you probably already own anyway...
mrweasel•54m ago
If they weren't, why would Nvidia keep making them? They do seem like an increasingly niche product, but apparently not so niche that Nvidia is willing to just exit the market and focus on datacenters.

They aren't just for gaming; there are also high-end workstations, but that's probably even more niche.

tempest_•22m ago
Have you seen the latest generation of Nvidia gaming cards? They increasingly look like an afterthought.
MostlyStable•4m ago
I'm honestly curious why they keep making them. As far as I can tell, NVIDIA can sell literally as many datacenter AI chips as they can produce, and that would probably continue to be true even if they significantly increased prices. And even without increasing prices, the datacenter products are considerably higher margin than the consumer GPUs. Every consumer GPU they sell is lost revenue in comparison to using that fab capacity for a datacenter product.

The only reason I can imagine for them leaving the money on the table is that they think that the AI boom won't last that much longer and they don't want to kill their reputation in the consumer market. But even in that case, I'm not sure it really makes that much sense.

Maybe if consumer GPUs were literally just datacenter silicon that didn't make the grade or something, it would make sense but I don't think that's the case.

justincormack•43m ago
Gaming GPUs make up 7% of Nvidia's business; 93% is datacentre. So, no.
eYrKEC2•43m ago
Intel has always pursued agglomeration into the main CPU. They sucked up the math co-processor. They sucked up the frontside bus logic. They sucked up the DDR controllers more and more. They have sucked in integrated graphics.

Everything on-die, and with chiplets in-package, is the Intel way.

Default, average integrated graphics will continue to "satisfice" for a greater and greater portion of the market as integrated graphics keeps growing in power.

carlhjerpe•28m ago
Intel made fun of AMD for "taping chips together". Intel did everything on a monolithic die for way too long.

The smaller the node, the lower the yield; chiplets are a necessity now (or architectural changes like Cerebras's).

eYrKEC2•9m ago
Running tests and then fusing off broken cores or shared caches helps recover a lot of yield for bigger chips. Certain parts of the silicon are not redundant, but Intel's designs have redundancy for the core pieces and the chunks that are very large and hence probabilistically more prone to a manufacturing error.
carlhjerpe•4m ago
Yep, Cerebras takes that to the next level with their "wafer-scale" chips. A common technique is killing defective cores entirely (that's how all the cheaper CPUs are made).

But reducing die size will still increase yield, since you can pick and choose.
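
A rough illustration of the size-vs-yield point (a minimal sketch using the classic Poisson defect-yield model; the defect density and die areas are made-up illustrative numbers, not figures for any real process):

    import math

    def poisson_yield(die_area_mm2: float, defects_per_mm2: float) -> float:
        """Fraction of dies with zero defects under a simple Poisson defect model."""
        return math.exp(-die_area_mm2 * defects_per_mm2)

    D0 = 0.002  # assumed defect density, defects per mm^2 (illustrative only)

    for area in (100, 300, 600):  # hypothetical die sizes in mm^2
        print(f"{area} mm^2 die: ~{poisson_yield(area, D0):.0%} defect-free")

    # Binning recovers more: if a defect only kills one of, say, 8 cores,
    # that die can still ship as a cut-down SKU instead of being scrapped.

Smaller dies (or chiplets) keep more of the wafer sellable even before any binning, which is the "pick and choose" advantage.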

jeffbee•1h ago
There has never been any information conveyed by the "Moore's Law Is Dead" account. If you want to know whether Intel has cancelled their next dGPU, you might as well ask a coin.
flufluflufluffy•1h ago
Can someone explain what the heck Battlemage means in this context?
_ihaque•57m ago
https://chipsandcheese.com/p/intels-battlemage-architecture
ripbozo•57m ago
Codename of the Intel Arc B-series GPU lineup.
daemonologist•56m ago
Battlemage, aka Xe2, is Intel's current, second-generation GPU architecture (like RDNA 4 for AMD or Blackwell for Nvidia).
nodja•41m ago
Intel Arc - Intel's dedicated GPUs. Each GPU generation has a codename in alphabetical order, taken from nerd culture.

Alchemist - First-gen GPUs: the A310 is the low end, the A770 the high end. Powerful hardware for cheap, very spotty software at release. Got fixed up later.

Battlemage - Second gen (current gen); only the B570 and B580 GPUs came out. They said they weren't gonna release more Battlemage GPUs after these because they wanted to focus on Celestial, but probably went back on that after seeing how well the B580 was reviewed; the B770 is due to be released by the end of the year.

Celestial - Next-gen GPUs, expected for release in early 2026. This article claims they were cancelled, but personally I think a GPU this far along in production is too late to cancel, especially when they basically skipped a generation to get it out faster.

risho•14m ago
I will note that their source appears to be Moore's Law Is Dead, which is a speculative YouTube channel with a long history of being wrong about the death of Arc. The dude has been predicting the imminent death of Arc since the first one released years ago. It wouldn't surprise me if this did lead to the death of Arc, but it certainly isn't because this moron predicted it.