
Beyond the Appearances: The joys of an Ideal world

https://aurelien2022.substack.com/p/beyond-the-appearances
1•OgsyedIE•38s ago•0 comments

LLM Data Exfiltration via URL Previews (With OpenClaw Example and Test)

https://www.promptarmor.com/resources/llm-data-exfiltration-via-url-previews-(with-openclaw-examp...
1•takira•1m ago•0 comments

MLflow's Missing Validators: An Authorization Bypass Across API Surfaces

https://tachyon.so/blog/mlfow-bypass
1•logicx24•2m ago•0 comments

Agentic Coding in Xcode [video]

https://developer.apple.com/videos/play/tech-talks/111428/
1•Austin_Conlon•3m ago•0 comments

AI Came to Take My Job. Here’s How I Responded [video]

https://www.youtube.com/watch?v=HjDD8xwU6s8
1•emsign•3m ago•1 comment

Many people have no mental imagery. What's going on in their brains?

https://www.nature.com/articles/d41586-026-00311-7
2•johngossman•4m ago•0 comments

Write about the future you want

https://daverupert.com/2026/02/futurescapes/
1•headalgorithm•4m ago•0 comments

Do people underestimate GPS metadata in shared photos?

https://exif-cleaner.com/
1•cope123•4m ago•1 comment

RCC: A Boundary Theory Explaining Why LLMs Still Hallucinate

http://www.effacermonexistence.com/rcc-hn-1
2•noncentral•5m ago•2 comments

Slate Auto – The Customizable EV That Works for You

https://www.slate.auto/en
1•lisper•6m ago•0 comments

France's Raid on X Escalates Trans-Atlantic Showdown over Social Media

https://www.nytimes.com/2026/02/04/world/europe/social-media-free-speech.html
2•JumpCrisscross•7m ago•0 comments

Professors Are Being Watched: 'We've Never Seen This Much Surveillance'

https://www.nytimes.com/2026/02/04/us/professors-classroom-surveillance-politics.html
2•JumpCrisscross•7m ago•0 comments

The World's First Viral AI Assistant Has Arrived, and Things Are Getting Weird

https://www.wsj.com/tech/ai/openclaw-ai-agents-moltbook-social-network-5b79ad65
1•fortran77•7m ago•1 comment

Dr. Zhivago: Dixiecrat of the Steppes (1958)

https://marxists.architexturez.net/archive/fraser/1958/zhivago.html
1•jruohonen•9m ago•0 comments

Distance Marching for Generative Modeling

https://arxiv.org/abs/2602.02928
1•E-Reverance•11m ago•0 comments

Zendesk Email Spam

3•Philpax•12m ago•1 comment

Show HN: Template for real-time agentic web apps using Convex

https://www.youtube.com/watch?v=MlH0za1vcUY
1•ohstep23•13m ago•0 comments

Size influences assessment of attractiveness and fighting ability

https://journals.plos.org/plosbiology/article?id=10.1371/journal.pbio.3003595
1•PaulHoule•14m ago•0 comments

Show HN: EpsteIn – Search the Epstein files for your LinkedIn connections

https://github.com/cfinke/EpsteIn
2•cfinke•14m ago•0 comments

Boilerplate Tax: Ranking popular programming languages by density

https://boyter.org/posts/boilerplate-tax-ranking-popular-languages-by-density/
1•ingve•17m ago•0 comments

F# Code I Love (2019) [video]

https://www.youtube.com/watch?v=1AZA1zoP-II
1•tosh•17m ago•0 comments

Beginner's Guide: Your First Web Scraper with Python and ScrapingDuck

https://scrapingduck.com/python-web-scraping-with-scrapingduck/
2•gsoftwarelab•17m ago•0 comments

Pandoc in the Browser

https://pandoc.org/app/
2•Tomte•18m ago•0 comments

Vim 9 Plugins for Lean

https://vim.castedo.com/
1•dougb5•18m ago•0 comments

Women rejecting the hijab have doomed Iran's brutal regime

https://www.telegraph.co.uk/news/2026/02/03/iranian-regime-doomed-when-it-lost-control-of-women/
3•binning•18m ago•3 comments

Commodore, IBM, OS/2, ARexx: Deal or No Deal?

https://datagubbe.se/os2/
2•rbanffy•19m ago•0 comments

Recreating Epstein PDFs from raw encoded attachments

https://neosmart.net/blog/recreating-epstein-pdfs-from-raw-encoded-attachments/
3•ComputerGuru•19m ago•0 comments

LLMs Can't Jump [pdf]

https://philsci-archive.pitt.edu/28024/1/Scientific_Invention_Position_Paper%20%2817%29.pdf
2•amichail•19m ago•0 comments

Checks for indicators of compromise related to the Notepad++ supply chain attack

https://github.com/roady001/Check-NotepadPlusPlusIOC
1•speckx•21m ago•0 comments

From 'nerdy' Gemini to 'edgy' Grok: how developers are shaping AI behaviours

https://www.theguardian.com/technology/2026/feb/03/gemini-grok-chatgpt-claude-qwen-ai-chatbots-id...
1•binning•22m ago•0 comments

Intel will start making GPUs

https://techcrunch.com/2026/02/03/intel-will-start-making-gpus-a-market-dominated-by-nvidia/
42•SunshineTheCat•1h ago

Comments

bhouston•1h ago
I am a little confused; I thought Intel had a big push over the last couple of years to create its own GPUs:

https://www.intel.com/content/www/us/en/products/docs/discre...

BadBadJellyBean•1h ago
I think this is about them creating the silicon. AFAIK their GPU chips were produced by TSMC.
alt227•1h ago
Even the GPUs inside their CPUs?
mastax•1h ago
Intel has been using a fair bit of TSMC in their CPU manufacturing recently, yes. Most recently they’ve been assembling “tiles” of silicon from many process nodes into a single CPU package and IIRC they have been using TSMC for the GPU tiles.
wtallis•1h ago
This year's laptop chips use TSMC for the 12-core GPU parts but Intel 3 for the 4-core GPU parts.
alt227•15m ago
I wasn't aware of that; thanks for the correction.
mrpippy•1h ago
The original article is here, and it unfortunately is just as confusing: https://www.reuters.com/business/intel-ceo-says-company-will...

Of course Intel has been designing and selling GPUs for years, I guess Lip-Bu means they're going to start manufacturing them as well? Or they're going to be data-center focused now?

wtallis•1h ago
Since he was touting that they recently hired a well-known GPU architect, it seems unlikely that this is merely about them using their own fabs for discrete GPUs instead of having integrated GPUs being the only ones they fab themselves. Some kind of shift in product strategy or reboot of their GPU architecture development seems more likely, if there's anything of substance underlying the news at all.

But this news is somehow even less comprehensible and believable than usual for Intel, whose announcements about their future plans have a tenuous connection to reality on a good day.

searls•1h ago
Agreed, this was extremely poorly reported across numerous outlets.
alt227•1h ago
One can only assume the original Intel press release was just as confusing.
rbanffy•32m ago
Their newsroom website has nothing more recent than Feb 2... I wonder where this came from.
ginko•1h ago
Intel has been making GPUs since the late 90s:

https://en.wikipedia.org/wiki/List_of_Intel_graphics_process...

re-thc•1h ago
They did, but then the effort stopped and was scrapped (the chief involved also left). So it's more of a restart than a start.
jamesgeck0•23m ago
Any official source? YouTubers have been saying that Intel is shuttering Arc at every minor setback for years.
elephanlemon•37m ago
Intel Arc seems to be well liked; this seems to just be bad writing by Reuters. It's unclear what exactly the news is here, as Demmers was hired a month ago…
chrsw•1h ago
The silicon is just one piece of the puzzle. CUDA and the rest of the software stack are a huge advantage for NVIDIA.
bhouston•1h ago
Yup, which is why AMD struggles so much even though its hardware is usually within 30% of the performance (give or take) of NVIDIA's.

bigyabai•1h ago
It's also complicated by the fact that raster performance doesn't directly translate to tensor performance. Apple and AMD both make excellent raster GPUs, but still lose in efficiency to Nvidia's CUDA architecture in rendering and compute.

I'd really like AMD and Apple to start from scratch with a compute-oriented GPU architecture, ideally standardized with Khronos. The NPU/tensor coprocessor architecture has already proven itself to be a bad idea.

Asmod4n•1h ago
AMD is doing that for their next gen; no idea if Khronos is involved.
roysting•1h ago
That may be true, but assuming you meant "within 30% of the performance" ... can we just acknowledge that a 30% deficit is a rather significant handicap, even ignoring CUDA?
epolanski•1h ago
The customers are players that can throw money at the software stack; hell, they're even throwing lots of money at the hardware side too, with proprietary tensor units and such.

And the big players don't necessarily care about the full software stack; they're likely to optimize the hardware for a single use case (e.g. inference, or specific steps of training).

Qision•1h ago
Why doesn't AMD make a framework similar to CUDA? Is it that big a task? If it increased their market share, it should be financially viable, no?
4fterd4rk•58m ago
They do. It's called ROCm. It works, it's open source, but CUDA is so entrenched it's like a Windows vs. Linux kind of thing.
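To illustrate how thin the API gap actually is: ROCm builds of PyTorch deliberately reuse the torch.cuda namespace, so typical GPU code runs unmodified on AMD hardware. A minimal sketch, assuming a PyTorch build with either CUDA or ROCm support (the matrix size is arbitrary):

    import torch

    # On a ROCm build, torch.cuda.is_available() returns True on supported
    # AMD GPUs, and torch.version.hip is set instead of torch.version.cuda.
    if torch.cuda.is_available():
        backend = "ROCm/HIP" if torch.version.hip else "CUDA"
        print(f"{backend}: {torch.cuda.get_device_name(0)}")

        # The identical code path targets either vendor's GPU: the matmul
        # dispatches to rocBLAS on AMD and cuBLAS on Nvidia.
        x = torch.randn(4096, 4096, device="cuda")
        print((x @ x).norm().item())
    else:
        print("No GPU backend available")

The entrenchment is everything above this level: fused kernels, vendor libraries, and years of tuning that happened on CUDA first, so "it works" and "it's the default" remain different things.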
yolostar1•1h ago
Exactly. CUDA is a huge moat, and all competitors should be adopting a software-first approach similar to what tinycorp is trying to do. Find one single thing that makes CUDA bad to use and TRIPLE DOWN on that.
BadBadJellyBean•1h ago
Good to hear. More than two players in the GPU market is a really good thing, and their recent dedicated consumer GPUs are really good value in their segment. It will take a few generations until they might catch up to Nvidia, but I am hopeful.
roysting•1h ago
So they are just moving GPU production in-house? Weren't they already designing GPUs and contracting TSMC to fab them?
u1hcw9nx•1h ago
Déjà vu all over again.

2016: Nervana. Intel would lead in AI training; the "Nervana NNP" was the future.

2019: Habana. Intel announced the Gaudi and Goya chips as their new official AI strategy, effectively killing the Nervana project.

2021: Xe. A general HPC/AI GPU (Ponte Vecchio); Intel said they would be shifting to the "AI chip" market.

2023: The "AI PC". Every consumer CPU would now be an "AI chip" with an NPU (Neural Processing Unit).

2024: Intel is now an "AI Systems Foundry", focused on making AI chips for other people (like Microsoft and Amazon).

2026: Intel will start making GPUs.

netule•1h ago
If you discount recent AI-specific developments, you can trace this back further to Larrabee, Intel MIC, Xeon Phi, etc., etc., etc.
usrusr•1h ago
"Intel keeps starting to make GPUs"
RobotToaster•55m ago
Wasn't Itanium marketed as being good for some kind of AI?
eqvinox•12m ago
Don't beat a dead horse, that's rude ;). They did kinda try to market it as good for everything though, didn't they?
midnitewarrior•1h ago
3 years too late to get serious about GPUs.
perbu•1h ago
Not at all. There is no indication that the world won't need more GPUs going forward.
alt227•1h ago
Intel started making and selling their own GPUs many years ago; this news is just that they are going to fab the chips themselves, instead of outsourcing to TSMC.
Zigurd•1h ago
It's a confusing article. It strongly implies that Intel will make GPUs for data centers, and it says Intel will produce GPUs without saying whether they are manufacturing them in-house or not.
alexbaden•1h ago
Intel has been designing GPUs manufactured on TSMC nodes across client and datacenter for at least the past 5 years. The client chips are price competitive but not performance competitive with AMD/NVIDIA/Apple. The data center roadmap has historically been a huge mess with cancelled products left and right. But, to say "Intel will start making GPUs" seems misleading. Perhaps "Intel to try to inject sanity into its GPU roadmap" would be a better headline, though I am skeptical one hire will do anything to fix 10+ years of mismanagement.
bpye•1h ago
I have a B580 in my desktop. Unfortunately, AMD still has broken PCIe-level reset, so their GPUs don't work well for assignment to a VM; Intel and Nvidia cards both work fine. (A quick host-side check for this is sketched below.)

The perf is fine - it was a $350 CAD GPU after all.

I am certainly interested to see where Intel ends up going with their lineup. Having a third player in the GPU space is definitely a good thing.
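For anyone hitting the same passthrough problem: on a reasonably recent Linux host (kernel 5.15+ exposes a reset_method attribute in sysfs) you can check which reset mechanisms a GPU advertises before handing it to VFIO. A minimal sketch; the PCI address is hypothetical, substitute your own from lspci:

    from pathlib import Path

    # Hypothetical PCI address of the GPU to pass through (see lspci -D).
    BDF = "0000:03:00.0"
    dev = Path("/sys/bus/pci/devices") / BDF

    # reset_method lists the mechanisms the kernel can use, e.g. "flr pm bus".
    # A device without a usable method is the kind that misbehaves when it is
    # reassigned to a VM and later needs to be reset.
    method_file = dev / "reset_method"
    if method_file.exists():
        print(f"{BDF} reset methods: {method_file.read_text().strip()}")
    elif (dev / "reset").exists():
        print(f"{BDF} is resettable, but methods are not enumerated")
    else:
        print(f"{BDF} advertises no reset support; expect passthrough trouble")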

alexbaden•30m ago
I have a B580 too. The cool thing about it is architecturally speaking it is basically a mini version of the Ponte Vecchio (PVC) datacenter GPU. You can run most of the datacenter GPU workloads, albeit scaled down to fit the compute/memory constraints of the B580. It's a great vehicle for software development. But you can't buy PVC anymore so it's unclear what you are developing for...
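To make the "software development vehicle" point concrete: recent PyTorch releases expose Intel GPUs through an "xpu" device type backed by the oneAPI stack, so the same script exercises a consumer B580 or a datacenter part, just with different compute and memory budgets. A minimal sketch, assuming a PyTorch build with XPU support; tensor shapes are arbitrary:

    import torch

    # Recent PyTorch builds expose Intel GPUs (client Arc or datacenter
    # parts) through the "xpu" device type.
    if torch.xpu.is_available():
        print(f"Intel GPU: {torch.xpu.get_device_name(0)}")

        # Identical code runs on a consumer B580 or a datacenter GPU; only
        # the compute/memory budget changes.
        x = torch.randn(2048, 2048, device="xpu")
        y = torch.nn.functional.relu(x @ x)
        print(y.mean().item())
    else:
        print("No XPU backend in this PyTorch build")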
monster_truck•1h ago
It is my understanding that this isn't happening in any meaningful capacity; they're simply using the kit no longer relevant to R&D.

I'm still not entirely convinced they actually did Arc themselves. It has all the hallmarks of a project that was bought or taken. Every meaningful iteration keeps getting pushed back further out towards the horizon, and the only thing they've been able to offer in the meantime is "uhhhh what if we used two".

re-thc•1h ago
> I'm still not entirely convinced they actually did Arc themselves

Raja (ex AMD/Radeon) ran the project?

5G_activated•1h ago
Intel has been making graphics silicon since the 90s; the current discrete graphics effort has been going for at least a decade, and in areas like low-power video decode and encode it could be argued Intel is class-leading. The concept of the "GPU" is a quarter of a century old. This is an especially poor article, especially for a publication that has been running as long as TechCrunch.
andrewstuart•1h ago
The CEO of Intel already specifically said Intel has given up competing with Nvidia.

That’s the spirit!

gilbertjolly•1h ago
The most rapid path Intel has to selling competitive GPUs would be to license designs from Groq and apply all effort to getting them working on 14A.

Hyperscalers would bite their hand off, and it would be a viable alternative to TSMC.

Nvidia has left the door open with the non-exclusive license in the acquisition.

yolostar1•1h ago
WTF is Intel doing? They can't even communicate well!
seg_lol•1h ago
I'd be more excited about Intel making HBM.