By all accounts I have seen, their single SKU from this second-generation consumer lineup has been well received. Yet the article says "what can only be categorized as a shaky and often rudderless business", without any justification.
Yes, it is worth pondering what the Nvidia investment means for Intel Arc Graphics, but "rudderless"? Really?
When it comes to GPUs, a $4T company probably couldn't care less what their $150B partner does in their spare time, as long as they prioritize the partnership. Especially when the GPUs in question are low-end units in a segment where they pose no real competition to Nvidia, and aren't even shipping in large volumes. If Nvidia actually asked Intel to kill it, it would be 100% out of pettiness.
Sometimes I wonder if these articles are written for clicks and these "leakers" are actually just the authors making stuff up and getting it right from time to time.
Intel has so many other GPU-adjacent products, and they will doubtless continue most of them even if they don't pursue Arc further: Jaguar Shores, Flex GPUs for VDI, and of course their Xe integrated graphics. I could possibly see Intel not shipping a successor to Flex? Maybe? I cannot see a world where they abandon Xe (first-party laptop graphics) or Jaguar Shores ("rack scale" datacenter "GPUs").
With all of that effort going into GPU-ish designs, is there enough overlap that the output/artifacts from those products support and benefit Arc? Or if Arc only continues to be a mid-tier success, is it thus a waste of fab allocation, a loss of potential profit, and an unnecessary expense in terms of engineers maintaining drivers, and so forth? That is the part I do not know, and why I could see it going either way.
So I can see it both ways, but in no world do I trust these supposed leaks.
[1]: https://www.tomshardware.com/tech-industry/semiconductors/in...
[2]: https://www.tomshardware.com/tech-industry/intel-ceo-says-it...
With their own fabs, at that
Seems like you'd prefer yet another +1 selling AI snake oil and promises ...
>"On training, I think it is too late for us,"
Not too late for AI, but too late for training; meanwhile there's an inference opportunity, or something like that.
When he joined only a few months ago, he set the vision of making Intel a worthy participant in the AI space.
Then just a few months later, he announced "we cannot compete".
What happened in the middle? Recent articles came out about a conflict between him and Frank Yeary, the chairman of the Intel board. He wanted to acquire a hot AI startup, and Yeary opposed it. Two factions formed on the board, and they lost a lot of time battling it out. While this was going on, a FAANG came in and bought the startup.
I think his announcement that Intel cannot compete was his way of saying "I cannot do it with the current Intel board."
Same as how they messed up the Core Ultra desktop launch, of their own volition - by setting the prices so high that they couldn't even compete with their own 13th and 14th gen chips, to say nothing of Ryzen CPUs that are mostly better both in absolute terms and in price/perf. A sidegrade isn't the end of the world, but a badly overpriced sidegrade is dead on arrival.
Idk what Intel is doing.
>"On training, I think it is too late for us,"
Not too late for AI, but too late for training meanwhile there's inference opportunity or something like that
https://morethanmoore.substack.com/p/nvidia-2026-q2-financia...
They also haven't bothered investing in the software to make the H100 work well on games through their consumer drivers. That doesn't mean it's impossible, and none of that takes away from the fact that the H100 and consumer GPUs are much more similar than they might seem and could theoretically be made to run the same workloads at comparable performance.
Not to mention, vertical integration gave Intel flexibility, customization, and some cost-saving advantages that Nvidia didn't have to the same degree, Nvidia being a fabless designer that is itself a customer of another for-profit fab (TSMC).
If TFA is true, this was an anticompetitive move by Nvidia to preemptively decapitate their biggest competitor in the 2030s datacenter GPU market.
They aren't just for gaming; there are also high-end workstations, but that's probably an even more niche market.
The only reason I can imagine for them leaving the money on the table is that they think that the AI boom won't last that much longer and they don't want to kill their reputation in the consumer market. But even in that case, I'm not sure it really makes that much sense.
Maybe if consumer GPUs were literally just datacenter silicon that didn't make the grade or something, it would make sense but I don't think that's the case.
Everything on-die, and with chiplets in-package, is the Intel way.
Default, average integrated graphics will continue to "satisfice" for a greater and greater portion of the market, with integrated graphics continuing to grow in power.
The smaller the node, the lower the yield; chiplets are a necessity now (or architectural changes like Cerebras's).
But reducing die size will still increase yield, since you can pick and choose the good dies.
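To make the yield intuition concrete, here is a rough sketch using the classic Poisson die-yield model; the defect density and die areas are made-up illustrative numbers, not real process data:

    import math

    def die_yield(area_mm2, defect_density):
        # Poisson yield model: probability that a die has zero defects
        return math.exp(-area_mm2 * defect_density)

    D = 0.001  # assumed defect density in defects/mm^2 (illustrative only)

    # One monolithic 600 mm^2 GPU die vs. a 150 mm^2 chiplet
    mono_yield = die_yield(600, D)      # ~55% of the big dies are good
    chiplet_yield = die_yield(150, D)   # ~86% of the small chiplets are good

    print(f"monolithic 600 mm^2 die: {mono_yield:.1%} good")
    print(f"each 150 mm^2 chiplet:   {chiplet_yield:.1%} good")

    # With chiplets you bin and combine only the known-good pieces
    # ("pick and choose"), so a defect scraps 150 mm^2 of silicon
    # instead of a whole 600 mm^2 die.

The same math is why a smaller die on the same process yields better, even before you get the binning benefit of mixing and matching chiplets.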
Alchemist - First-gen GPUs. The A310 is the low end, the A770 the high end. Powerful hardware for cheap, very spotty software at release. Got fixed up later.
Battlemage - Second gen (current gen); only the B570 and B580 GPUs came out. They said they weren't gonna release more Battlemage GPUs after these because they wanted to focus on Celestial, but they probably went back on that after seeing how well the B580 was reviewed, and the B770 is due to be released by the end of the year.
Celestial - Next-gen GPUs, expected for release in early 2026. This article claims they were cancelled, but personally I think it's too late to cancel a GPU this far into production. Especially when they basically skipped a generation to get it out faster.
dralley•1h ago
Notably this is about the 3rd time in 2 years that he's reported that the Intel dGPU efforts are being killed off.
Even on the latest developments the reporting is contradictory, so someone is wrong and I suspect it's him. https://www.techpowerup.com/341149/intel-arc-gpus-remain-in-...
nodja•59m ago
They're a channel focused on leaks, but most of their leaks are just industry insider gossip masked as factual to farm clicks. Their leaks are useless for any sort of predictions, but may be interesting if you'd like to know what insiders are thinking.
A quick Google search also turned up this[1] two-year-old Reddit thread that collects videos they deleted because their predictions were incorrect. There are probably many more. (That subreddit seems to be dedicated to trashing MLID.)
[1] https://www.reddit.com/r/BustedSilicon/comments/yo9l2i/colle...
gregbot•26m ago
Instead of invective, could you just say which specific leak of his was inaccurate? Everything he said about Intel dGPUs has happened exactly as he said it would. Have you watched his video about that yourself?
gregbot•14m ago
> What has he been wrong about
…