
Monzo wrongly denied refunds to fraud and scam victims

https://www.theguardian.com/money/2026/feb/07/monzo-natwest-hsbc-refunds-fraud-scam-fos-ombudsman
1•tablets•57s ago•0 comments

They were drawn to Korea with dreams of K-pop stardom – but then let down

https://www.bbc.com/news/articles/cvgnq9rwyqno
1•breve•3m ago•0 comments

Show HN: AI-Powered Merchant Intelligence

https://nodee.co
1•jjkirsch•5m ago•0 comments

Bash parallel tasks and error handling

https://github.com/themattrix/bash-concurrent
1•pastage•5m ago•0 comments

Let's compile Quake like it's 1997

https://fabiensanglard.net/compile_like_1997/index.html
1•billiob•6m ago•0 comments

Reverse Engineering Medium.com's Editor: How Copy, Paste, and Images Work

https://app.writtte.com/read/gP0H6W5
1•birdculture•11m ago•0 comments

Go 1.22, SQLite, and Next.js: The "Boring" Back End

https://mohammedeabdelaziz.github.io/articles/go-next-pt-2
1•mohammede•17m ago•0 comments

Laibach the Whistleblowers [video]

https://www.youtube.com/watch?v=c6Mx2mxpaCY
1•KnuthIsGod•18m ago•1 comment

Slop News - HN front page right now hallucinated as 100% AI SLOP

https://slop-news.pages.dev/slop-news
1•keepamovin•23m ago•1 comment

Economists vs. Technologists on AI

https://ideasindevelopment.substack.com/p/economists-vs-technologists-on-ai
1•econlmics•25m ago•0 comments

Life at the Edge

https://asadk.com/p/edge
2•tosh•31m ago•0 comments

RISC-V Vector Primer

https://github.com/simplex-micro/riscv-vector-primer/blob/main/index.md
3•oxxoxoxooo•35m ago•1 comment

Show HN: Invoxo – Invoicing with automatic EU VAT for cross-border services

2•InvoxoEU•35m ago•0 comments

A Tale of Two Standards, POSIX and Win32 (2005)

https://www.samba.org/samba/news/articles/low_point/tale_two_stds_os2.html
2•goranmoomin•39m ago•0 comments

Ask HN: Is the Downfall of SaaS Started?

3•throwaw12•40m ago•0 comments

Flirt: The Native Backend

https://blog.buenzli.dev/flirt-native-backend/
2•senekor•42m ago•0 comments

OpenAI's Latest Platform Targets Enterprise Customers

https://aibusiness.com/agentic-ai/openai-s-latest-platform-targets-enterprise-customers
1•myk-e•44m ago•0 comments

Goldman Sachs taps Anthropic's Claude to automate accounting, compliance roles

https://www.cnbc.com/2026/02/06/anthropic-goldman-sachs-ai-model-accounting.html
3•myk-e•47m ago•5 comments

Ai.com bought by Crypto.com founder for $70M in biggest-ever website name deal

https://www.ft.com/content/83488628-8dfd-4060-a7b0-71b1bb012785
1•1vuio0pswjnm7•48m ago•1 comment

Big Tech's AI Push Is Costing More Than the Moon Landing

https://www.wsj.com/tech/ai/ai-spending-tech-companies-compared-02b90046
4•1vuio0pswjnm7•50m ago•0 comments

The AI boom is causing shortages everywhere else

https://www.washingtonpost.com/technology/2026/02/07/ai-spending-economy-shortages/
2•1vuio0pswjnm7•51m ago•0 comments

Suno, AI Music, and the Bad Future [video]

https://www.youtube.com/watch?v=U8dcFhF0Dlk
1•askl•53m ago•2 comments

Ask HN: How are researchers using AlphaFold in 2026?

1•jocho12•56m ago•0 comments

Running the "Reflections on Trusting Trust" Compiler

https://spawn-queue.acm.org/doi/10.1145/3786614
1•devooops•1h ago•0 comments

Watermark API – $0.01/image, 10x cheaper than Cloudinary

https://api-production-caa8.up.railway.app/docs
1•lembergs•1h ago•1 comment

Now send your marketing campaigns directly from ChatGPT

https://www.mail-o-mail.com/
1•avallark•1h ago•1 comment

Queueing Theory v2: DORA metrics, queue-of-queues, chi-alpha-beta-sigma notation

https://github.com/joelparkerhenderson/queueing-theory
1•jph•1h ago•0 comments

Show HN: Hibana – choreography-first protocol safety for Rust

https://hibanaworks.dev/
5•o8vm•1h ago•1 comment

Haniri: A live autonomous world where AI agents survive or collapse

https://www.haniri.com
1•donangrey•1h ago•1 comment

GPT-5.3-Codex System Card [pdf]

https://cdn.openai.com/pdf/23eca107-a9b1-4d2c-b156-7deb4fbc697c/GPT-5-3-Codex-System-Card-02.pdf
1•tosh•1h ago•0 comments

That 16B password story (a.k.a. "data troll")

https://www.troyhunt.com/that-16-billion-password-story-aka-data-troll/
112•el_duderino•5mo ago

Comments

charcircuit•5mo ago
If there was an open database of password breaches, it would be easier for people to research whether a leak was new or just a password taken from a previous leak. Of course you can get closer to the actual number by filtering out duplicates, but you can't figure out what's new if you can't know what's old.
mananaysiempre•5mo ago
Pwned Passwords[1] is just such a database (with passwords hashed using either SHA-1 or NTLM as an obfuscation measure, and without any emails). Hunt used to distribute versioned snapshots, but these days he directs you to an API scraper[2] in C# instead, so you can still get a list but it probably won’t exactly match anyone else’s.

[1] https://haveibeenpwned.com/passwords

[2] https://github.com/HaveIBeenPwned/PwnedPasswordsDownloader
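The range endpoint behind [1] can also be queried per password without ever sending a full hash: you submit the first five hex characters of the uppercase SHA-1 and scan the returned suffixes locally. A minimal Python sketch (the endpoint URL and the SUFFIX:COUNT response format are as publicly documented; the helper names are mine):

```python
import hashlib

def sha1_prefix_suffix(password: str) -> tuple[str, str]:
    """Split the uppercase SHA-1 of a password into the 5-char range prefix
    sent to the API and the 35-char suffix that stays on your machine."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

def count_in_range(response_text: str, suffix: str) -> int:
    """Parse the 'SUFFIX:COUNT' lines of a range response; return the breach
    count for our suffix, or 0 if it never appeared in a known dump."""
    for line in response_text.splitlines():
        candidate, _, count = line.strip().partition(":")
        if candidate == suffix:
            return int(count)
    return 0

# Live usage (network required), against the documented endpoint:
#   from urllib.request import urlopen
#   prefix, suffix = sha1_prefix_suffix("password")
#   body = urlopen("https://api.pwnedpasswords.com/range/" + prefix).read().decode()
#   print(count_in_range(body, suffix))
```

Because only the 5-character prefix leaves your machine, the server never learns which password you were checking.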

charcircuit•5mo ago
This isn't sufficient for all cases. For example, a breach could contain hashed passwords. If you only have the obfuscated passwords from previous breaches, you can't hash the new data yourself to know that the new breach is just a rehash of an existing one.

Data breaches can also contain things other than passwords, like phone numbers and addresses, that would also be useful for checking.
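The check described here, hashing known plaintexts to see whether a "new" hashed dump is really old material, only works if you hold the old passwords in plaintext. A hypothetical Python sketch (the function name and overlap metric are my own illustration):

```python
import hashlib

def rehash_overlap(old_plaintexts, new_hashes, algo="sha1"):
    """Fraction of a 'new' hashed dump that is just a rehash of passwords
    already known in plaintext from older dumps. A hash-only archive cannot
    run this check for an arbitrary algorithm; a plaintext corpus can."""
    known = {hashlib.new(algo, pw.encode("utf-8")).hexdigest()
             for pw in old_plaintexts}
    new_hashes = set(new_hashes)
    if not new_hashes:
        return 0.0
    return len(new_hashes & known) / len(new_hashes)
```

A high overlap suggests the "new" breach is mostly recycled data; the same idea extends to any unsalted algorithm `hashlib` supports.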

anon7000•5mo ago
Publishing someone’s leaked credentials in plaintext for anyone to look at also isn’t ideal. I mean, yes, it’s been leaked, but we also don’t need to make it easier for someone to get hacked.
charcircuit•5mo ago
Pretending it's private is also problematic. People get a false impression of what is public and what isn't.
nojs•5mo ago
In other words, 2.7B -> 109M is a 96% reduction from headline to people. Could we apply the same maths to the 16B headline?

I mean there’s not 16B people in the world, so a row per person can be ruled out pretty easily
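Taken at face value, the arithmetic looks like this (pure extrapolation from the one dump that was actually analyzed, not a claim about the real 16B dataset):

```python
# cybernews' own reduction: 2.7B rows boiled down to 109M distinct people
rows, people = 2_700_000_000, 109_000_000
reduction = 1 - people / rows
print(f"{reduction:.1%}")                                  # 96.0%

# blindly applying that ratio to the 16B headline
headline = 16_000_000_000
print(f"~{headline * (1 - reduction) / 1e6:.0f}M people")  # ~646M people
```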

NitpickLawyer•5mo ago
> I mean there’s not 16B people in the world, so a row per person can be ruled out pretty easily

In a hypothetical "master dump", a mix of all the dumps ever leaked, you'd expect dozens if not more entries for every "real person" out there. Think about how many people had a yahoo account, then how many had several yahoo accounts, and then multiply that by the hundreds of leaks out there. I can see the number getting into the billions easily, just because of how many accounts people have on the many platforms that got hacked in the past ~20 years.

Sure, 99% of those won't be active accounts anymore, but the passwords used serve as a signal, at least for "what kinds of passwords do people use". There's lots to be learned about wordwordnumber wordnumbernumber, and so on.
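That kind of structural signal can be extracted with a few lines of Python; the `shape` helper below is my own illustration of collapsing a password into its word/number/symbol pattern:

```python
import re
from collections import Counter

# each alternative is a named group so the match tells us its own class
_TOKEN = re.compile(r"(?P<word>[A-Za-z]+)|(?P<number>[0-9]+)|(?P<sym>[^A-Za-z0-9]+)")

def shape(password: str) -> str:
    """Collapse a password into a coarse structural pattern: runs of ASCII
    letters become 'word', digit runs 'number', anything else 'sym'."""
    return "".join(m.lastgroup for m in _TOKEN.finditer(password))

def pattern_histogram(passwords):
    """Tally how common each structural pattern is in a dump."""
    return Counter(shape(p) for p in passwords)
```

Running `pattern_histogram` over even an old dump shows how dominant the wordnumber-style shapes are, which is exactly the signal attackers mine.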

genewitch•5mo ago
> There's lots to be learned about wordwordnumber wordnumbernumber, and so on.

I had a plan to do statistical studies of some password dumps to try to make a "compressed password list" that could generate password guesses on the fly. I forget why I didn't do it, but I'm sure it's because the "model", the statistical dataset from which the program would generate output, wouldn't really be that much smaller; at least not with my poor maths skills.

I'm assuming that someone who really knew what they were doing could get close to 15%-20% of the full password list. I doubt I could do better than just compressing the dataset and extracting it on the fly.

NitpickLawyer•5mo ago
> I doubt I could do better than just compressing the dataset and extracting it on the fly.

The meta in that field is to extract "rules" (e.g. for hashcat) from datasets. Then you run the rules over the hashed dumps. Rules can be word^number^number, word^number^word^number, or letter^upper^number^lower... etc. Then you build a dictionary, and dict + rules = passwords.

Pretty sure you can extract some nice signals nowadays with embeddings and what not.
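The dict + rules = passwords idea can be sketched without hashcat itself; the toy rules below are my own simplified stand-ins for real hashcat rule syntax (which uses a compact notation like `:`, `c`, `$1`):

```python
from itertools import product

# toy mangling rules in the spirit of hashcat rule files
RULES = [
    lambda w: w,                       # pass-through
    lambda w: w.capitalize(),          # capitalize first letter
    lambda w: w + "1",                 # append a digit
    lambda w: w.capitalize() + "123",  # the classic Word123 shape
]

def candidates(dictionary):
    """dict + rules = password guesses, deduplicated in generation order."""
    seen, out = set(), []
    for word, rule in product(dictionary, RULES):
        guess = rule(word)
        if guess not in seen:
            seen.add(guess)
            out.append(guess)
    return out
```

Each guess would then be hashed with the dump's algorithm and compared against the hashed entries; the rules themselves are what gets learned from plaintext dumps.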

miki123211•5mo ago
I always find it funny how the media characterizes a data breach in terms of number of records stolen, or, even worse, its size on disk.

There are ~335 million Americans. Assume for simplicity that each of them owns one phone, and hence one SIM card. Generously assume that each SIM card has 1 KB of authentication material. A data breach of all US consumer SIM keys would hence be ~335 million records and ~335 GB.

Such a breach would be far, far more catastrophic than anything we have ever seen (and probably anything we will ever see) in computer security, despite being half the size of this one, and containing less than 10% as many records.
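The back-of-the-envelope numbers check out, using decimal units and the comment's own assumptions:

```python
records = 335_000_000    # roughly one SIM per American (the comment's assumption)
kb_per_record = 1        # "1 KB of authentication material", generously
total_gb = records * kb_per_record * 1_000 / 1e9
print(total_gb)          # 335.0 -> ~335 GB for a catastrophic, tiny-by-bytes dump
```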

SG-•5mo ago
I'm glad someone actually looked at the data and made a real news story about this.
graynk•5mo ago
I am very confused.

> Everything (and I mean it) from that news report went through yours truly.

> Bob is a quality researcher

> The headlines implying this was a massive breach are misleading

But the headlines implying it are literally in the cybernews article, which is the source of it all. Why does the article talk about "the mass media" throughout its length, if it's the original source that was misleading?