frontpage.
Show HN: Deterministic NDJSON audit logs – v1.2 update (structural gaps)

https://github.com/yupme-bot/kernel-ndjson-proofs
1•Slaine•3m ago•0 comments

The Greater Copenhagen Region could be your friend's next career move

https://www.greatercphregion.com/friend-recruiter-program
1•mooreds•4m ago•0 comments

Do Not Confirm – Fiction by OpenClaw

https://thedailymolt.substack.com/p/do-not-confirm
1•jamesjyu•4m ago•0 comments

The Analytical Profile of Peas

https://www.fossanalytics.com/en/news-articles/more-industries/the-analytical-profile-of-peas
1•mooreds•4m ago•0 comments

Hallucinations in GPT5 – Can models say "I don't know" (June 2025)

https://jobswithgpt.com/blog/llm-eval-hallucinations-t20-cricket/
1•sp1982•5m ago•0 comments

What AI is good for, according to developers

https://github.blog/ai-and-ml/generative-ai/what-ai-is-actually-good-for-according-to-developers/
1•mooreds•5m ago•0 comments

OpenAI might pivot to the "most addictive digital friend" or face extinction

https://twitter.com/lebed2045/status/2020184853271167186
1•lebed2045•6m ago•2 comments

Show HN: Know how your SaaS is doing in 30 seconds

https://anypanel.io
1•dasfelix•6m ago•0 comments

ClawdBot Ordered Me Lunch

https://nickalexander.org/drafts/auto-sandwich.html
1•nick007•7m ago•0 comments

What the News media thinks about your Indian stock investments

https://stocktrends.numerical.works/
1•mindaslab•8m ago•0 comments

Running Lua on a tiny console from 2001

https://ivie.codes/page/pokemon-mini-lua
1•Charmunk•9m ago•0 comments

Google and Microsoft Paying Creators $500K+ to Promote AI Tools

https://www.cnbc.com/2026/02/06/google-microsoft-pay-creators-500000-and-more-to-promote-ai.html
2•belter•11m ago•0 comments

New filtration technology could be game-changer in removal of PFAS

https://www.theguardian.com/environment/2026/jan/23/pfas-forever-chemicals-filtration
1•PaulHoule•12m ago•0 comments

Show HN: I saw this cool navigation reveal, so I made a simple HTML+CSS version

https://github.com/Momciloo/fun-with-clip-path
2•momciloo•13m ago•0 comments

Kinda Surprised by Seadance2's Moderation

https://seedanceai.me/
1•ri-vai•13m ago•2 comments

I Write Games in C (yes, C)

https://jonathanwhiting.com/writing/blog/games_in_c/
2•valyala•13m ago•0 comments

Django scales. Stop blaming the framework (part 1 of 3)

https://medium.com/@tk512/django-scales-stop-blaming-the-framework-part-1-of-3-a2b5b0ff811f
1•sgt•13m ago•0 comments

Malwarebytes Is Now in ChatGPT

https://www.malwarebytes.com/blog/product/2026/02/scam-checking-just-got-easier-malwarebytes-is-n...
1•m-hodges•13m ago•0 comments

Thoughts on the job market in the age of LLMs

https://www.interconnects.ai/p/thoughts-on-the-hiring-market-in
1•gmays•14m ago•0 comments

Show HN: Stacky – certain block game clone

https://www.susmel.com/stacky/
2•Keyframe•17m ago•0 comments

AIII: A public benchmark for AI narrative and political independence

https://github.com/GRMPZQUIDOS/AIII
1•GRMPZ23•17m ago•0 comments

SectorC: A C Compiler in 512 bytes

https://xorvoid.com/sectorc.html
2•valyala•18m ago•0 comments

The API Is a Dead End; Machines Need a Labor Economy

1•bot_uid_life•19m ago•0 comments

Digital Iris [video]

https://www.youtube.com/watch?v=Kg_2MAgS_pE
1•Jyaif•20m ago•0 comments

New wave of GLP-1 drugs is coming–and they're stronger than Wegovy and Zepbound

https://www.scientificamerican.com/article/new-glp-1-weight-loss-drugs-are-coming-and-theyre-stro...
5•randycupertino•22m ago•0 comments

Convert tempo (BPM) to millisecond durations for musical note subdivisions

https://brylie.music/apps/bpm-calculator/
1•brylie•24m ago•0 comments

Show HN: Tasty A.F. - Use AI to Create Printable Recipe Cards

https://tastyaf.recipes/about
2•adammfrank•25m ago•0 comments

The Contagious Taste of Cancer

https://www.historytoday.com/archive/history-matters/contagious-taste-cancer
2•Thevet•27m ago•0 comments

U.S. Jobs Disappear at Fastest January Pace Since Great Recession

https://www.forbes.com/sites/mikestunson/2026/02/05/us-jobs-disappear-at-fastest-january-pace-sin...
1•alephnerd•27m ago•1 comment

Bithumb mistakenly hands out $195M in Bitcoin to users in 'Random Box' giveaway

https://koreajoongangdaily.joins.com/news/2026-02-07/business/finance/Crypto-exchange-Bithumb-mis...
1•giuliomagnifico•27m ago•0 comments

Streaming compression beats framed compression

https://bou.ke/blog/compressed/
36•bouk•1mo ago

Comments

lambdaloop•1mo ago
Does streaming compression work if some packets are lost or arrive in a different order? It seems like the compression context could end up different on the encoding and decoding sides... or is that handled somehow?
duskwuff•1mo ago
It sounds as though the data is being transferred over HTTP, so packet loss/reordering is all handled by TCP.
dgoldstein0•1mo ago
Yes, or by HTTP/3's in-order guarantees on individual streams (since HTTP/3 runs over UDP).
dgoldstein0•1mo ago
I think the underlying protocol would have to guarantee in-order delivery - either via TCP (for HTTP/1.x, HTTP/2, or SPDY), or within a single stream in HTTP/3.
gkbrk•1mo ago
WebSockets [1] run over TCP, and the messages are ordered.

There is RFC 9220 [2], which runs WebSockets over QUIC (which is UDP-based). But that is still expected to expose a stream of bytes to the WebSocket layer, so the ordering guarantee is kept.

[1]: https://datatracker.ietf.org/doc/html/rfc6455

[2]: https://datatracker.ietf.org/doc/rfc9220/

duskwuff•1mo ago
Before you get too excited, keep two things in mind:

1) Using a single compression context for the whole stream means you have to keep that context active on the client and server while the connection is active. This may have a nontrivial memory cost, especially at high compression levels. (Don't set the compression window any larger than it needs to be!)

2) Using a single context also means that you can't decompress one frame without having read the whole stream leading up to it. This rules out some useful optimizations if you're "fanning out" messages to many recipients - when you compress each message individually, you can compress it once and send the same compressed bytes to every recipient.
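Both points can be seen at once with a quick sketch in Python's zlib (the principle applies to any DEFLATE/zstd-style codec; the telemetry payloads are invented for illustration):

```python
import zlib

# 50 similar small messages (invented payloads for illustration)
messages = [("sensor-7 temp=21.%02d status=ok" % i).encode() for i in range(50)]

# Framed: each message compressed independently. You pay per-message
# header/checksum overhead, but any frame decodes on its own -- so for
# fan-out you compress once and send identical bytes to every recipient.
framed = [zlib.compress(m) for m in messages]
framed_size = sum(map(len, framed))

# Streamed: one shared context. Later messages are encoded largely as
# references to earlier ones, so frame N is only decodable after frames
# 1..N-1 -- and each connection needs its own live compressor state.
co = zlib.compressobj()
streamed = [co.compress(m) + co.flush(zlib.Z_SYNC_FLUSH) for m in messages]
streamed_size = sum(map(len, streamed))

print(framed_size, streamed_size)  # the shared context wins on size
```

On this toy data the streamed total comes out far smaller, which is exactly the trade being discussed: ratio versus per-frame independence.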

adzm•1mo ago
The analogy to H.264 in the original post is very relevant. You can fix some of the downsides by using the equivalent of keyframes, basically. That's still a longer context than a single message, but it can be broken up for recovery and the like.
yellow_lead•1mo ago
> This may have a nontrivial memory cost, especially at high compression levels. (Don't set the compression window any larger than it needs to be!)

It sounds like these contexts should be cleared when they reach a certain memory limit, or maybe reset periodically, e.g. every N messages. Is there another way to manage the memory cost?
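zlib-style codecs do expose exactly this kind of periodic reset: a full flush discards the history window, so nothing after the flush may reference anything before it. A small sketch with Python's zlib (the message text is invented):

```python
import zlib

msg = b"the quick brown fox jumps over the lazy dog " * 4

co = zlib.compressobj()
co.compress(msg)
co.flush(zlib.Z_SYNC_FLUSH)  # emit, but keep history: primes the window

# Repeat while history is intact: the whole message becomes back-references.
with_history = len(co.compress(msg) + co.flush(zlib.Z_SYNC_FLUSH))

# Z_FULL_FLUSH empties the window. The stream stays valid, but the
# compressor has to re-learn the data from scratch afterwards.
co.flush(zlib.Z_FULL_FLUSH)
after_reset = len(co.compress(msg) + co.flush(zlib.Z_SYNC_FLUSH))

print(with_history, after_reset)  # repeats cost more once history is dropped
```

So periodic resets bound how much accumulated state matters, at the price of a worse ratio right after each reset.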

treyd•1mo ago
That's a misunderstanding. Compression algorithms are typically designed with a tunable state-size parameter. The issue is that if you have a large transfer where one side might crash and resume, you need some way to persist the state so you can pick up where you left off.
michaelt•1mo ago
LZ77 compression (a key part of gzip and zip compression) uses a 'sliding window' where the compressor can tell the decompressor 'repeat the n bytes that appeared in the output stream m bytes ago'. The most widely used implementation uses a 15 bit integer for m - so the decompressor never needs to look more than 32,768 bytes back in its output stream.

Many compression standards include memory limits, to guarantee compatibility, and the older the standard the lower that limit is likely to be. If the standards didn't dictate this stuff, DVD sellers could release a DVD that needed a 4MB decompression window, and it'd fail to play on players that only had 2MB of memory - setting a standard and following it avoids this happening.

efitz•1mo ago
When I worked at Microsoft years ago, my team (a developer and a tester) and I built a high-volume log collector.

We used a streaming compression format that was originally designed for IBM tape drives.

It was fast as hell, worked really well, was gentle on CPU, and made it easy to control memory usage.

In the early 2000s, on a modest 2-proc AMD64 machine, we ran out of Fast Ethernet long before we felt CPU pressure.

We got hit by the SOAP mafia during Longhorn; we couldn’t convince the web services to adopt it; instead they made us enshittify our “2 bytes length, 2 bytes msgtype, structs-on-the-wire” speed demon with their XML crap.

masklinn•1mo ago
Surely that is obvious to anyone who has compared zip and tgz?
skulk•1mo ago
MUD clients and servers use MCCP, which is essentially keeping a zlib stream open, adding text to it, and flushing it whenever something is received. I think this has been around since 2000.

https://tintin.mudhalla.net/protocols/mccp/
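The MCCP trick, sketched in Python zlib terms (the game text is invented): one compressor lives for the whole connection, a sync flush marks each message boundary, and the receiver can decode every chunk the moment it arrives:

```python
import zlib

lines = [b"You see a goblin.\n",
         b"The goblin attacks!\n",
         b"You see a goblin.\n"]  # repeated text gets cheap fast

co = zlib.compressobj()    # one per connection, server side
do = zlib.decompressobj()  # one per connection, client side

received = []
for line in lines:
    chunk = co.compress(line) + co.flush(zlib.Z_SYNC_FLUSH)  # one write()
    received.append(do.decompress(chunk))  # complete text, no buffering

print(received == lines)
```

The sync flush pads the output to a byte boundary, so each network chunk carries everything needed to print that message immediately.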

vlovich123•1mo ago
Using zstd with a tuned small file custom dictionary probably gets you most of the benefit without giving up independence of compression.
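For a feel of what a shared dictionary buys, here is the same idea using zlib's preset-dictionary support (zstd's trained dictionaries are the industrial-strength version; the dictionary and message below are invented):

```python
import zlib

# Both sides ship with bytes typical of the traffic; no shared stream state.
dictionary = b'{"event": "metric", "host": "web-01", "value": 0}'
msg = b'{"event": "metric", "host": "web-01", "value": 42}'

plain = zlib.compress(msg)  # small messages barely compress on their own

co = zlib.compressobj(zdict=dictionary)
with_dict = co.compress(msg) + co.flush()  # most of msg is a dictionary match

do = zlib.decompressobj(zdict=dictionary)
roundtrip = do.decompress(with_dict)

print(len(plain), len(with_dict))
```

Each message stays independently decodable, so the fan-out optimization mentioned upthread still works; you just have to distribute (and version) the dictionary out of band.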
bob1029•1mo ago
There is a proposal out there for serving & using custom compression dictionaries over HTTP:

https://www.ietf.org/archive/id/draft-ietf-httpbis-compressi...

almaight•1mo ago
mwss https://github.com/go-gost/x/blob/master/dialer/mws/dialer.g...