frontpage.

Tiny C Compiler

https://bellard.org/tcc/
52•guerrilla•1h ago•20 comments

You Are Here

https://brooker.co.za/blog/2026/02/07/you-are-here.html
37•mltvc•1h ago•33 comments

SectorC: A C Compiler in 512 bytes

https://xorvoid.com/sectorc.html
148•valyala•5h ago•25 comments

The F Word

http://muratbuffalo.blogspot.com/2026/02/friction.html
76•zdw•3d ago•31 comments

Speed up responses with fast mode

https://code.claude.com/docs/en/fast-mode
82•surprisetalk•5h ago•89 comments

LLMs as the new high level language

https://federicopereiro.com/llm-high/
20•swah•4d ago•12 comments

Software factories and the agentic moment

https://factory.strongdm.ai/
119•mellosouls•8h ago•232 comments

Hoot: Scheme on WebAssembly

https://www.spritely.institute/hoot/
157•AlexeyBrin•11h ago•28 comments

OpenCiv3: Open-source, cross-platform reimagining of Civilization III

https://openciv3.org/
864•klaussilveira•1d ago•264 comments

Stories from 25 Years of Software Development

https://susam.net/twenty-five-years-of-computing.html
113•vinhnx•8h ago•14 comments

GitBlack: Tracing America's Foundation

https://gitblack.vercel.app/
17•martialg•50m ago•3 comments

FDA intends to take action against non-FDA-approved GLP-1 drugs

https://www.fda.gov/news-events/press-announcements/fda-intends-take-action-against-non-fda-appro...
29•randycupertino•58m ago•29 comments

Show HN: A luma dependent chroma compression algorithm (image compression)

https://www.bitsnbites.eu/a-spatial-domain-variable-block-size-luma-dependent-chroma-compression-...
21•mbitsnbites•3d ago•1 comment

Al Lowe on model trains, funny deaths and working with Disney

https://spillhistorie.no/2026/02/06/interview-with-sierra-veteran-al-lowe/
73•thelok•7h ago•13 comments

First Proof

https://arxiv.org/abs/2602.05192
75•samasblack•7h ago•57 comments

Brookhaven Lab's RHIC concludes 25-year run with final collisions

https://www.hpcwire.com/off-the-wire/brookhaven-labs-rhic-concludes-25-year-run-with-final-collis...
36•gnufx•4h ago•40 comments

Vocal Guide – belt sing without killing yourself

https://jesperordrup.github.io/vocal-guide/
253•jesperordrup•15h ago•82 comments

I write games in C (yes, C) (2016)

https://jonathanwhiting.com/writing/blog/games_in_c/
156•valyala•5h ago•136 comments

Start all of your commands with a comma (2009)

https://rhodesmill.org/brandon/2009/commands-with-comma/
532•theblazehen•3d ago•197 comments

Show HN: I saw this cool navigation reveal, so I made a simple HTML+CSS version

https://github.com/Momciloo/fun-with-clip-path
38•momciloo•5h ago•5 comments

Reinforcement Learning from Human Feedback

https://rlhfbook.com/
98•onurkanbkrc•10h ago•5 comments

Selection rather than prediction

https://voratiq.com/blog/selection-rather-than-prediction/
19•languid-photic•3d ago•5 comments

Italy Railways Sabotaged

https://www.bbc.co.uk/news/articles/czr4rx04xjpo
69•vedantnair•1h ago•55 comments

The AI boom is causing shortages everywhere else

https://www.washingtonpost.com/technology/2026/02/07/ai-spending-economy-shortages/
212•1vuio0pswjnm7•12h ago•323 comments

72M Points of Interest

https://tech.marksblogg.com/overture-places-pois.html
42•marklit•5d ago•6 comments

A Fresh Look at IBM 3270 Information Display System

https://www.rs-online.com/designspark/a-fresh-look-at-ibm-3270-information-display-system
52•rbanffy•4d ago•14 comments

Unseen Footage of Atari Battlezone Arcade Cabinet Production

https://arcadeblogger.com/2026/02/02/unseen-footage-of-atari-battlezone-cabinet-production/
129•videotopia•4d ago•40 comments

Coding agents have replaced every framework I used

https://blog.alaindichiappari.dev/p/software-engineering-is-back
273•alainrk•10h ago•452 comments

France's homegrown open source online office suite

https://github.com/suitenumerique
649•nar001•9h ago•284 comments

Microsoft account bugs locked me out of Notepad – Are thin clients ruining PCs?

https://www.windowscentral.com/microsoft/windows-11/windows-locked-me-out-of-notepad-is-the-thin-...
51•josephcsible•3h ago•67 comments

Debugging a mysterious HTTP streaming issue

https://mintlify.com/blog/debugging-a-mysterious-http-streaming-issue-when-cloudflare-compression-breaks-everything
16•skeptrune•6mo ago

Comments

Charon77•6mo ago
I don't quite get it.

cURL works because it doesn't accept compression.

Browsers don't work because they accept compression, and Cloudflare silently enables compression for any browser that advertises it can accept it.

So cloudflare's compression is just flawed?

01HNNWZ0MV43FF•6mo ago
I guess their compression buffers maybe a few KB of data before sending any output, and for streaming individual words from a chatbot, a few KB looks like no streaming at all
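The buffering behavior described above is easy to reproduce with Python's zlib (a minimal sketch of the general DEFLATE/gzip mechanism, not of Cloudflare's actual pipeline):

```python
import zlib

# Sketch: why proxy-side compression can stall word-by-word streaming.
# A DEFLATE/gzip compressor holds input in an internal buffer until it
# has enough data, or until it is explicitly flushed.
comp = zlib.compressobj(wbits=31)          # 31 selects the gzip container

buffered = b"".join(comp.compress(w) for w in [b"hello ", b"world "])
# For tiny inputs, compress() typically emits little or nothing yet;
# behind a buffering proxy, the client would still be staring at silence.

flushed = buffered + comp.flush(zlib.Z_SYNC_FLUSH)
# A sync flush forces the buffered bytes out so the client sees them now.

recovered = zlib.decompressobj(wbits=31).decompress(flushed)
print(recovered)
```

If the intermediary never issues those flushes until its buffer fills, a trickle of chatbot tokens looks exactly like "no streaming at all".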
deathanatos•6mo ago
> One thing that stuck out was that our egress systems (ALB and Cloudflare) were stripping these headers:

  'Transfer-Encoding': 'chunked',
  'Connection': 'keep-alive',
  'Content-Encoding': 'none',
Transfer-Encoding and Connection are both hop-by-hop headers.

> Unlike Content-Encoding (Section 8.4.1 of [HTTP]), Transfer-Encoding is a property of the message, not of the representation. Any recipient along the request/response chain MAY decode the received transfer coding(s) or apply additional transfer coding(s) to the message body, assuming that corresponding changes are made to the Transfer-Encoding field value.

(https://www.rfc-editor.org/rfc/rfc9112#section-6.1)

> Intermediaries MUST parse a received Connection header field before a message is forwarded and, for each connection-option in this field, remove any header or trailer field(s) from the message with the same name as the connection-option, and then remove the Connection header field itself (or replace it with the intermediary's own control options for the forwarded message).

(https://datatracker.ietf.org/doc/html/rfc9110#section-7.6.1-...)

> Furthermore, intermediaries SHOULD remove or replace fields that are known to require removal before forwarding, whether or not they appear as a connection-option, after applying those fields' semantics. […] Transfer-Encoding

(https://www.rfc-editor.org/rfc/rfc9110#section-7.6.1-7)

I.e., it is spec-legal for an intermediary to remove these headers; it should be obvious that these are a property of the hop if you consider their purpose.

E.g., say your load-balancer is maintaining a keep-alive with the backend; a client sending Connection: close is not having its header "stripped" when the LB proxies the request to the backend without forwarding the header; the header is a property of the client<->LB connection, not of the LB<->BE connection.

Same for Transfer-Encoding: consider an HTTP/1.1 connection hitting an intermediary that will upgrade it to HTTP/2; Transfer-Encoding: chunked makes no sense in h2 (it's an innate property of h2), and the header will be removed from the proxied request.
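The removal rule those RFC quotes describe can be sketched mechanically (a toy sketch; the header set and function name here are illustrative, not any particular proxy's API):

```python
# Sketch of the RFC 9110 §7.6.1 rule quoted above: an intermediary drops
# every header named in Connection, then Connection itself, plus headers
# that are always hop-by-hop (Transfer-Encoding among them).
ALWAYS_HOP_BY_HOP = {"connection", "keep-alive", "transfer-encoding",
                     "te", "trailer", "upgrade", "proxy-connection"}

def strip_hop_by_hop(headers: dict) -> dict:
    # Connection lists extra per-hop options, e.g. "Connection: keep-alive"
    listed = {opt.strip().lower()
              for opt in headers.get("connection", "").split(",") if opt.strip()}
    drop = ALWAYS_HOP_BY_HOP | listed
    return {k: v for k, v in headers.items() if k.lower() not in drop}

forwarded = strip_hop_by_hop({
    "content-type": "text/event-stream",
    "transfer-encoding": "chunked",
    "connection": "keep-alive",
})
print(forwarded)  # only content-type survives the hop
```

Run against the headers from the post, this drops exactly the two hop-by-hop fields the egress systems were "stripping".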

Now, obviously, if an intermediary receives a "streaming" response, one hopes a "streaming" response goes out. (But I've written what amounts to an intermediary; it would de-stream responses sometimes b/c the logic it was implementing as an intermediary required it to. So … I also know "it depends", sometimes.)

> Compression breaks HTTP streaming - This is now permanently etched in my brain

It shouldn't.

But that leaves one more header:

  'Content-Encoding': 'none',
That's not a hop-by-hop header, and I don't think intermediaries should generally screw with it; I can't find a good clear "don't" in the spec, but an intermediary changing the Content-Encoding header would have to be very careful; e.g., the ETag header notes:

> Content codings are a property of the representation data, so a strong entity tag for a content-encoded representation has to be distinct from the entity tag of an unencoded representation to prevent potential conflicts during cache updates and range requests.

(I.e., if you changed the Content-Encoding header, either by removing it & decompressing the message, or by adding it & compressing the message, you would be corrupting the/sending a wrong ETag.)

But also … "none" (the string literal?) is not a valid Content-Encoding. (The header would be omitted, typically, if no content-coding is applied.)

This could just be CF idiosyncrasies, or bugs. I don't see why compression being supported by the client should de-stream the response. One could stream the compression on the fly (and inform the client of the coding used via a Transfer-Encoding to the downstream client; if the protocol doesn't support that, e.g., h2, then probably one should just forward the message without mucking with it…).
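"Stream the compression on the fly" can be sketched as a generator that sync-flushes after every chunk (gzip_stream is a hypothetical helper under that assumption, not a description of what CF actually does):

```python
import zlib
from typing import Iterable, Iterator

# Sketch: compress each chunk as it arrives and sync-flush immediately,
# so compression and streaming coexist instead of conflicting.
def gzip_stream(chunks: Iterable[bytes]) -> Iterator[bytes]:
    comp = zlib.compressobj(wbits=31)   # gzip container
    for chunk in chunks:
        data = comp.compress(chunk) + comp.flush(zlib.Z_SYNC_FLUSH)
        if data:
            yield data                  # decodable as soon as it is sent
    yield comp.flush()                  # final block + gzip trailer

# A client can decompress each piece incrementally as it lands:
decomp = zlib.decompressobj(wbits=31)
pieces = [decomp.decompress(p) for p in gzip_stream([b"token ", b"by ", b"token"])]
print(pieces)
```

The cost is a slightly worse compression ratio (each flush emits a byte-aligned block), which is the usual trade-off for latency.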

Given that the browsers are probably doing at least h2 with CF (i.e., a browser is not speaking HTTP/1.x) … there wouldn't be a Transfer-Encoding. (I don't know if QUIC changes the situation here any. Perhaps I'll assume QUIC works like h2, in that there is only Content-Encoding.) So that would mean that if CF is compressing the response, and there's no Transfer-Encoding header … then it would be doing so & setting the Content-Encoding header, which smells wrong to me. So with curl, setting & not setting the --compressed flag, how do the responses differ? (And perhaps also control/vary the HTTP version.)

nly•6mo ago
I have a system at work that relies on HTTP long polling + chunked transfer encoding to do "streaming"

It took me a while to realise you can't observe this properly through "mitmproxy" during dev like you can with a regular request because the proxy by default effectively tries to wait until the entire response is in before forwarding it.

cURL, at least, does minimal to no buffering when processing chunked responses.

refulgentis•6mo ago
"This isn't just about compression—it's about"

This isn't just about a technical issue—it's about...

...the effort one puts into their writing and how it affects perception of content, i.e. here's some extremely common LLM slop the writer couldn't be bothered to edit; what else did they miss? Does it affect anything I gleaned from the article?

procaryote•6mo ago
> The symptoms were confusing: streaming worked perfectly with cURL and Postman, but failed completely with node-fetch and browser fetch.

It would have been helpful to mention what "failed completely" means. Did you get garbage data? Did the connection close abruptly? Did the connection hang and not deliver data? Did it deliver the data, just with a significant delay?

Paying attention to these things also tends to make it easier to debug.
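One way to make "failed completely" concrete is to timestamp each chunk as it arrives. A minimal sketch (toy local server, stdlib only; the handler and port are mine, not from the article): the server streams three words 0.2 s apart, and a buffering intermediary would instead deliver everything in one burst at the end.

```python
import http.client
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

class StreamHandler(BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"       # chunked requires HTTP/1.1

    def do_GET(self):
        self.send_response(200)
        self.send_header("Transfer-Encoding", "chunked")
        self.send_header("Connection", "close")
        self.end_headers()
        for word in (b"one ", b"two ", b"three "):
            # hand-rolled chunk framing: size in hex, CRLF, data, CRLF
            self.wfile.write(b"%x\r\n%s\r\n" % (len(word), word))
            self.wfile.flush()
            time.sleep(0.2)
        self.wfile.write(b"0\r\n\r\n")  # terminating zero-length chunk

    def log_message(self, *args):       # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), StreamHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/")
resp = conn.getresponse()
start = time.monotonic()
arrivals = []
while chunk := resp.read1(64):          # read1: whatever is available now
    arrivals.append((round(time.monotonic() - start, 2), chunk))
server.shutdown()
print(arrivals)                         # spread-out timestamps = streaming
```

Pointing the same loop at the real endpoint would immediately distinguish a hang (no entries), garbage (wrong bytes), and buffering (all entries at one late timestamp).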