
Show HN: Codex HUD – Claude-HUD Style Status Line for Codex CLI

https://github.com/anhannin/codex-hud
1•anhm720•3m ago•0 comments

Ask HN: Best way to physically "type" letters written with a computer?

1•simonebrunozzi•3m ago•0 comments

I Sold Out for $20 a Month and All I Got Was This Perfectly Generated Terraform

https://matduggan.com/i-sold-out-for-200-a-month-and-all-i-got-was-this-perfectly-generated-terra...
1•speckx•4m ago•0 comments

Instruction decoding in the Intel 8087 floating-point chip

https://www.righto.com/2026/02/8087-instruction-decoding.html
1•rbanffy•4m ago•0 comments

Show HN: I made a bot that has to beg and make $1 or it dies

https://begbot.ai
1•kilroy123•8m ago•0 comments

Why AI still behaves like a rationed resource?

https://ilicigor.substack.com/p/the-scarcity-trap-why-ai-still-feels
2•iggori•10m ago•1 comments

StreamFast ESSD and the Open Flash Platform

https://www.blocksandfiles.com/ai-ml/2026/02/10/streamfast-essd-and-the-open-flash-platform/4090309
1•rbanffy•10m ago•0 comments

3D Printing Pneumatic Channels with Dual Materials for Soft Robots

https://hackaday.com/2026/02/13/3d-printing-pneumatic-channels-with-dual-materials-for-soft-robots/
2•HardwareLust•11m ago•0 comments

Moore and Mealy Model in System Verilog (2025)

https://medium.com/@jawadahmed2k3/moore-and-mealy-model-in-system-verilog-aba19be15b42
1•andsoitis•13m ago•0 comments

Show HN: Kai – A Telegram bot that turns Claude Code into a personal dev asst

https://github.com/dcellison/kai
1•dcellison•14m ago•0 comments

Atom – Hydrogen Quantum Orbital Visualizer

https://www.kavang.com/atom
1•samixg•17m ago•0 comments

Ask HN: What happens after the AI bubble bursts?

3•101008•17m ago•3 comments

The NotebookLM Tutorial

https://www.augmentedswe.com/p/notebooklm-tutorial
2•wordsaboutcode•18m ago•0 comments

The tiny corp – Nvidia is a Software Company

https://consensus-hongkong.coindesk.com/agenda/event/-open-source-ai-in-your-pocket-a-case-study-77
1•randomgermanguy•19m ago•0 comments

Twitter(X) Is Down

21•bakigul•19m ago•11 comments

Why "Skip the Code, Ship the Binary" Is a Category Error

https://engrlog.substack.com/p/why-skip-the-code-ship-the-binary
1•birdculture•20m ago•0 comments

Running My Own XMPP Server

https://blog.dmcc.io/journal/xmpp-turn-stun-coturn-prosody/
2•speckx•22m ago•0 comments

Show HN: Out Plane – Deploy any app in 60s with per-second pricing

https://outplane.com
1•receperdogan•23m ago•0 comments

Stages of Denial

https://nsl.com/papers/denial.html
2•tosh•25m ago•0 comments

The Break Is Over. Companies Are Jacking Up Prices Again

https://www.wsj.com/business/price-increases-consumers-businesses-b70e4542
3•belter•26m ago•1 comments

They're Made Out of Meat (1991)

https://www.mit.edu/people/dpolicar/writing/prose/text/thinkingMeat.html
2•erhuve•26m ago•0 comments

Show HN: 2d platformer game built with Codex (zero code)

3•armcat•27m ago•2 comments

The Promptware Kill Chain

https://www.schneier.com/blog/archives/2026/02/the-promptware-kill-chain.html
1•leephillips•28m ago•0 comments

Deterministic Core, Agentic Shell

https://blog.davemo.com/posts/2026-02-14-deterministic-core-agentic-shell.html
2•ingve•28m ago•0 comments

A DataFrame Library Which Runs on GPUs, Accelerators and More

https://github.com/ronfriedhaber/autark/blob/main/extra/notebooks/data_gov_ev_analysis_1.ipynb
1•ronfriedhaber•29m ago•0 comments

Ministry of Justice orders deletion of the UK's largest court reporting database

https://www.legalcheek.com/2026/02/ministry-of-justice-orders-deletion-of-the-uks-largest-court-r...
5•harel•31m ago•0 comments

2026 Barkley Marathon Results: No Finishers, Sébastien Raichon Completes Fun Run

https://www.irunfar.com/2026-barkley-marathon-results
2•lode•33m ago•0 comments

Developers speak out about bigotry on Steam

https://www.theguardian.com/games/2026/feb/16/bigotry-steam-pc-moderation-developers-speak-out
2•Archelaos•33m ago•0 comments

Show HN: A word game a friend and I built. How did we do?

https://www.subletters.fun/
3•fercircularbuf•35m ago•0 comments

Sourdine – open-source macOS app for meeting transcription with 100% local AI

https://angelo-lima.fr/en/sourdine-transcription-reunions-ia-locale-en/
2•llingelo•36m ago•1 comments

Thanks a lot, AI: Hard drives are sold out for the year, says WD

https://mashable.com/article/ai-hard-drive-hdd-shortages-western-digital-sold-out
105•dClauzel•1h ago

Comments

dClauzel•1h ago
Good luck to everyone. Hope you made some reserves.

Yes, AI is nice, but I also like to be able to buy some RAM and drives…

Sharlin•36m ago
The future is thin clients for everyone, requiring a minimal amount of RAM and storage because all they are is a glorified ChatGPT interface.
XorNot•26m ago
It won't last. If the demand is sustained then new factories will open up and drive the price down.

More likely a couple of big financing wobbles lead to a fire sale.

It isn't practical for HDD supply to stay wedged, because in 5 years the disks start failing and need replacing.

huijzer•22m ago
I'm running multiple services such as Forgejo, Audiobookshelf, and Castopod, and they each need no more than roughly 100 MB of RAM.

There is one exception though: Open WebUI, with a whopping 960 MB. It's literally a ChatGPT interface; I'm only using external API providers, no local models running.

Meanwhile my website, which runs via my own WordPress-like software written in Rust [1], requires only a few MB of RAM, so it's possible.

[1]: https://github.com/rikhuijzer/fx
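For anyone curious how footprint numbers like these are measured: on Linux, per-process resident memory is exposed in /proc. A minimal sketch (the parsing helper is my own illustration, not from the comment):

```python
def vmrss_mb(status_text):
    """Extract resident set size in MB from the text of /proc/<pid>/status."""
    for line in status_text.splitlines():
        if line.startswith("VmRSS:"):
            return int(line.split()[1]) / 1024  # kernel reports the value in kB
    return None  # kernel threads and some states have no VmRSS line

# Usage on a live Linux system ("self" refers to the current process):
# with open("/proc/self/status") as f:
#     print(f"{vmrss_mb(f.read()):.0f} MB")
```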

voidUpdate•1h ago
This is the consequence of "I don't want to write this function myself, I'll get the plagiarism machine to do it for me"
szszrk•57m ago
I honestly think it's not that simple.

The ones who spend billions on integrating public cloud LLM services are not the ones writing that function. They are managers who, based on data pulled out of thin air, say "your goal for this year is to increase productivity by X% with AI, while staffing goes slightly down".

I have to watch AI-generated avatars on the most boring topics imaginable, because the only "documentation" and link to an actual answer is in the form of a fake person talking. And this is encouraged!

Then the only measure of success is either AI services adoption (team count), or sales data.

That is the real tragedy and the real scale - big companies pushing (external!) AI services without even proof that it justifies the cost alone. Smooth talking around any other metric (or the lack of it).

GuB-42•19m ago
And what's wrong with not wanting to write functions yourself? It is a perfectly reasonable thing, and in some cases (e.g. crypto) rolling your own is strongly discouraged. That's the reason libraries exist; you don't want to implement your own associative array every time your work needs one, do you?

As for plagiarism, it is not something to even consider when writing code, unless your code is an art project. If someone else's code does the job better than yours, that's the code you should use; you are not trying to be original, you are trying to make a working product. There is the problem of intellectual property laws, but it is narrower than plagiarism. For instance, writing an open-source drop-in replacement for some proprietary software is common practice; it is legal and often celebrated as long as it doesn't contain the original software's code. In art, it would be plagiarism.

Copyright laundering is a problem though, and AI is very resource intensive for a result of dubious quality sometimes. But that just shows that it is not a good enough "plagiarism machine", not that using a "plagiarism machine" is wrong.

voidUpdate•5m ago
If I use a package for crypto stuff, it will generally be listed as part of the project, in an include or similar, so you can see who actually wrote the code. If you get an LLM to create it, it will write some "new original code" for you, with no ability to tell you the names of the people whose code went into it, and who did not give their consent for it to be mangled into the algorithm.

If I copy work from someone else, whether that be a paragraph of writing, a code block or art, and do not credit them, passing it off as my own creation, that's plagiarism. If the plagiarism machine could give proper attribution and context, it wouldn't be a plagiarism machine anymore, but given the incredibly lossy nature of LLMs, I don't foresee that happening. A search engine is different, as it provides attribution for the content it's giving you (ignoring the "AI summary" that is often included now). If you go to my website and copy code from me, you know where the code came from, because you got it from my website.

BLKNSLVR•1h ago
Are these the picks and shovels?

Is the profitability of these electronics manufacturers more likely than that of the companies buying up all their future inventory?

smashface•1h ago
If AI continues on this trajectory, sure, these are likely the picks and shovels.

If the AI bubble bursts, you could see a lot of used hardware flood the market, and then companies like WD could have a hard time selling against their own previous inventory.

mrweasel•34m ago
The problem is more likely that companies like WD don't know whether this is a bubble. Currently they can milk the market by raising prices and relying on their current production facilities, maybe expanding a little. If there's going to be a crash, it's better to have raised prices, even if just temporarily, than to be left standing with excess production capacity.

If it's long term, it would be better to be the front-runner on additional capacity, but that assumes continuous growth. If it all comes down, or even just plateaus, it's better to simply raise prices.

trueismywork•10m ago
Given how hard AI is on I/O, even if other hardware goes second-hand when things unwind, I don't see hard drives going second-hand. Most hardware that we'd get might be worn beyond redeeming even at a price of free.
m4rtink•1h ago
Do they really think they will get some money from the AI ponzi scheme?

Well, at least they might still have a product to sell once the AI bubble pops, unlike NVIDIA, which seems to have kinda forgotten to design new consumer GPUs after getting high on AI money.

Sharlin•40m ago
They haven't forgotten, they've expressly decided to soft-pivot away from consumer GPUs. RTX 60x0 series is apparently coming in 2018… maybe. If the bubble has burst by then.
topspin•22m ago
> RTX 60x0 series is apparently coming in 2018

That's either a typo, or NVidia has achieved some previously unheard of levels of innovation.

jodrellblank•18m ago
> "apparently coming in 2018… maybe. If the bubble has burst by then."

Spoiler from the future: it hasn't. Get your investments in while you have time.

glimshe•1h ago
There's clearly easy/irrational money distorting the markets here. Normally this wouldn't be a problem: prices would go up, supply would eventually increase and everybody would be okay. But with AI being massively subsidized by nation-states and investors, there's no price that is too high for these supplies.

Eventually the music will stop when the easy money runs out and we see how much people are truly willing to pay for AI.

appreciatorBus•58m ago
Regardless of where demand comes from, it takes time to spin up a hard drive factory, and prices would have to rise enough that, as a producer, you would feel confident that a new hard drive factory will actually pay off. Conversely, if you feel the boom is irrational and temporary, as a producer you'd be quite wary of investing money in a new factory if there was a risk it would be producing into a glut in a few years.
anonymars•41m ago
If I remember right, during a previous GPU shortage (crypto?), Nvidia (and/or TSMC?) basically knew the music would stop and didn't want to be caught with its pants down after making the significant investments necessary to increase production.

Not to mention that without enough competition, you can just raise prices, which, uh (gestures at Nvidia GPU price trends...)

XorNot•31m ago
Somewhat ironically the AI boom means Nvidia would've easily made their money back on that investment though and probably even more thoroughly owned the GPGPU space.

But as it is it's not like they made any bad decisions either.

Xunjin•52m ago
Loved the reference. Probably from Margin Call[0]

0. https://youtu.be/fij_ixfjiZE

mcny•30m ago
I like to imagine the reference in the movie Margin Call is that of a merry-go-round or a game of musical chairs. Like we are all on a ride, none of us are the operator, and all we can do is guess when the music will stop (and the ride ends).

The problem with this AI stuff is we don't know how much we will be willing to pay for it, as individuals, as businesses, as nations. I guess we just don't know how far this stuff will be useful. The reason for the high valuations is, in my guess, that there is more value here than what we have tapped so far, right?

The revenues that Nvidia has reported are based on what we hope we will achieve in the future, so I guess the whole thing is speculation?

mrweasel•44m ago
AI is going to be what fiber was to the dotcom bubble. Someone spent a lot of money on a lot of infrastructure, some of which is going to be incredibly useful, but sold for much less than it cost to build. Hardware just depreciates much, much faster than fiber networks.
baq•35m ago
Current shortages are exactly the result of fabs not wanting to commit extra capex due to overbuild risk, and inference demand seems to be growing 10x YoY; you've famously got 8-year-old TPUs at Google at 100% load.
hmmmmmmmmmmmmmm•23m ago
This goes beyond profits. It will be important for national security.
ido•13m ago

    Hardware just depreciates much much faster than fiber
The manufacturing capacity expanded to meet the demand for new hardware doesn't (as much)
aurareturn•8m ago
I'm not saying that data center buildouts can't overshoot demand but AI and compute is different than fiber buildout. The more compute you have, the smarter the AI. You can use the compute to let the AI think longer (maybe hours/days/weeks) on a solution. You can run multiple AI agents simultaneously and have them work together or check each other's work. You can train and inference better models with more compute.

So there is always use for more compute to solve problems.

Fiber installations can overshoot relatively easily. No matter how much fiber you have installed, that 4k movie isn't going to change. The 3 hours of watch time for consumers isn't going to change.

zozbot234•23m ago
It's hard to increase long-run production capacity for what seems to be clearly a short-term spike in datacenter buildout. Even if AI itself is not much of a bubble, at some point spending on new AI facilities has to subside.
ghywertelling•20m ago
Earlier, gamers got punished by crypto, and now they are being punished by AI.
high_na_euv•18m ago
So what?

Why gamers must be the most important group?

infecto•7m ago
No, it’s not an easy fix. Manufacturers don’t have a good pulse on long-term demand, and the capex to spin up a new manufacturing plant is significant, especially with the recency of Covid, when some folks did get caught with their pants down and over-invested during the huge demand boom.

I don’t quite follow narratives like yours about nation-states and investors. There is certainly an industrial bubble going on and lots of startups are getting massive amounts of capital, but there is a strong signal that a good part of this demand is here to stay.

This will be one of those scenarios where some companies will look brilliant and others foolish.

Jyaif•1h ago
does that only include SSDs, or does it include HDDs as well?
jacquesm•59m ago
It includes all forms of storage except USB devices, plus GPUs and high-end CPUs. The latter you can still get, but you're going to have some severe sticker shock.
antonyh•50m ago
Maybe shucking USB HDDs is the short-term answer.
M95D•28m ago
Is that still possible? Aren't they native USB with no adapter?
ErneX•24m ago
Those drives are SATA inside the case.
jacquesm•21m ago
That depends on the brand. The lower priced brands, yes, those can be SATA, the more vertically integrated companies also make custom PCBs that just have USB-C without any SATA interface exposed internally.
antonyh•53m ago
I read it as both, but UK suppliers have stock of various SATA HDDs in large and small sizes. It's hard to say whether prices will rocket or availability will decline, or both. I don't normally advocate panic-buying, but if it's needed, now is the time. I have one NAS spare on hand; I don't want or need a drawer full of them, but it'll be a royal pain if I need more and can't get parts.
ddtaylor•56m ago
Is this for NVME only or spinning drives too? I use both, but I actually have use cases for HDDs and hope those are less affected.
Forgeties79•53m ago
All I know is I saw most of my go-to refurbished enterprise HDDs go for 2-3x during Black Friday a few months ago compared to a year prior.
ErneX•26m ago
This particular news is about spinning drives; for the other types we already had news about upcoming shortages earlier on.
layer8•24m ago
It’s affecting both. HDD maybe slightly less/slower, but you’re paying significantly more than six months ago in any case.
Forgeties79•55m ago
I bought 6x refurbished Ultrastars for ~$100/ea Black Friday 2024. They were over $200/ea in 2025. Samsung T7 (and Shield) SSDs have gone 2x-3x. Can’t get 1TB for less than like $180 right now. It’s ridiculous.
cmiles8•53m ago
This is all basically a textbook example of irrational market decisions. There’s clearly a bubble and not enough money coming in to pay for the AI bonanza.

It’s like building materials being in short supply when there are obviously more houses than buyers. That’s just masked at the moment because of all the capital being pumped in to cover for the lack of actual revenue to pay for everything. The structural mismatch at the moment is gigantic, and the markets are getting increasingly impatient waiting for the revenue to materialize.

Mark this post… in a few years folks will be coming up with creative uses for the cheap storage and GPUs flooding the market after they pick up the pieces of imploded AI companies.

(For the record, I’m a huge fan of AI, but that doesn’t mean I don’t also think a giant business and financial bubble is about to implode).

xienze•46m ago
I built a new server this time last year. My board does six-channel RAM, so I bought 6x32GB ECC DDR5 at $160 a stick at the time. Just for grins I looked up the same product number at the same supplier I originally bought from: $1300 apiece. One of the VMs running on that server is TrueNAS, with four 20TB WD Red Pros. God help me if I have to replace a drive.
kotaKat•33m ago
Best Buy is actively selling 2x8GB sticks of DDR4-3200 for $80 a stick. I was floored. Ten bucks a gig, $160 for the pack.

We're fucking doomed.
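The per-gigabyte math in the comment above checks out:

```python
# DDR4 pack from the comment: 2 sticks x 8 GB at $80 per stick.
sticks, gb_per_stick, usd_per_stick = 2, 8, 80
pack_usd = sticks * usd_per_stick                 # $160 for the pack
usd_per_gb = pack_usd / (sticks * gb_per_stick)   # $10 per GB
print(pack_usd, usd_per_gb)
```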

wuschel•19m ago
Perhaps there is an incentive to go back to OS that can operate with 640KB RAM ... /s
zozbot234•18m ago
Ten bucks a gig is lower than what some DDR5 memory is selling at.
mnw21cam•44m ago
I was recently involved in a large server purchase for work, where we wanted 72 hard drives of 24TB each. They were available last year, but last month the largest we could get were 20TB drives.
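For scale, the raw capacity gap that substitution implies (before any RAID or filesystem overhead):

```python
drives = 72
wanted_tb, available_tb = 24, 20
shortfall_tb = drives * (wanted_tb - available_tb)  # raw TB lost to the swap
print(drives * wanted_tb, drives * available_tb, shortfall_tb)
```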
emsign•39m ago
Yeah, this is slowing down growth and profits. The AI hype is sucking everything dry, from HVAC services to hardware.
huijzer•29m ago
Not only storage: the cheapest 32 GB of RAM that I can find is around 200 euros.
zozbot234•26m ago
That's actually a bargain, average market price (though highly volatile) is more than double that.
xiphias2•26m ago
It's interesting to see the consensus here that spending is irrational, but actually, even if AI improvements slow down, it's more rational for companies to overspend and underutilize the machines than to underspend and get disrupted.

On the other hand, lots of people here are even more uncomfortable with the other option, which is quite possible: AI software algorithms may scale better than the capacity of the companies that make the hardware. Personally I think hardware is the harder of the two to scale, and this is just the beginning.

55555•22m ago
What are companies needing all of these hard drives for? I understand their need for memory and boot drives. But storing text training data and text conversations isn't that space-intensive. There are a few companies doing video models, so I can see how that takes a tremendous amount of space. Is it just that?
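A rough back-of-envelope supports the intuition that text is cheap to store and video is not (all figures below are illustrative assumptions, not from the thread):

```python
# Assumed figures: a 15-trillion-token text corpus at ~4 bytes/token,
# versus one million hours of video stored at 5 Mbit/s.
text_bytes = 15e12 * 4                        # ~60 TB of raw text
video_bytes = 1_000_000 * 3600 * 5e6 / 8      # ~2,250 TB of video
print(text_bytes / 1e12, video_bytes / 1e12)  # both in TB
```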
Ekaros•13m ago
Hearing about their scraping practices, it might be that they are storing the same data over and over and over again. And then yes, audio and video is likely something they are planning for or already gathering.

And if they produce a lot of video, they might keep copies around.

red75prime•11m ago
All the latest general-purpose models are multimodal (except DeepSeek, I think). Transfer learning allows them to improve results even after they have exhausted all the text on the internet.
jmclnx•7m ago
I am surprised by that too. I thought everyone had moved to SSDs or NVMe?

I was toying with getting a 2TB HDD for a BSD system I have; I guess not now :)

cs02rm0•15m ago
Presumably they're also looking to increase production capacity as fast as possible - within the year?

I'd have thought HDDs aren't at the top of the list for AI requirements; are other component manufacturers struggling even more to meet demand?

aceelric•9m ago
It started with RAM; now it's hard drives and SSDs. This is not looking good. But at least you can buy used ones for a pretty good price, for now.
arbuge•5m ago
I console myself with knowledge of the economics maxim that every supply shortage is usually, eventually, followed by a supply glut.

One can only hope that that's the principle at work here, anyway. It could also be a critically damped system for all I know. Unfortunately I studied control systems too...

markus_zhang•5m ago
Better stock up on used laptops. I'm going to buy another one this year; they usually don't last very long.

What if in the near future it is simply too expensive to own "personal" computers? What if you can no longer buy used computers from official channels, but have to find local shops or sharpen up your soldering skills and scavenge parts from dumps? Big tech will conveniently "rent out" cloud computers for us to use, in exchange for all of our data.

"Don't you all have cellphones?"