Front page

Open models by OpenAI

https://openai.com/open-models/
1364•lackoftactics•8h ago•531 comments

Genie 3: A new frontier for world models

https://deepmind.google/discover/blog/genie-3-a-new-frontier-for-world-models/
1109•bradleyg223•11h ago•403 comments

Spotting base64 encoded JSON, certificates, and private keys

https://ergaster.org/til/base64-encoded-json/
219•jandeboevrie•6h ago•99 comments

Ollama Turbo

https://ollama.com/turbo
243•amram_art•6h ago•146 comments

Create personal illustrated storybooks in the Gemini app

https://blog.google/products/gemini/storybooks/
73•xnx•4h ago•25 comments

Consider using Zstandard and/or LZ4 instead of Deflate

https://github.com/w3c/png/issues/39
127•marklit•8h ago•71 comments

Claude Opus 4.1

https://www.anthropic.com/news/claude-opus-4-1
640•meetpateltech•9h ago•241 comments

Things that helped me get out of the AI 10x engineer imposter syndrome

https://colton.dev/blog/curing-your-ai-10x-engineer-imposter-syndrome/
699•coltonv•11h ago•535 comments

Scientific fraud has become an 'industry,' analysis finds

https://www.science.org/content/article/scientific-fraud-has-become-industry-alarming-analysis-finds
274•pseudolus•14h ago•236 comments

What's wrong with the JSON gem API?

https://byroot.github.io/ruby/json/2025/08/02/whats-wrong-with-the-json-gem-api.html
37•ezekg•4h ago•8 comments

The First Widespread Cure for HIV Could Be in Children

https://www.wired.com/story/the-first-widespread-cure-for-hiv-could-be-in-children/
64•sohkamyung•3d ago•12 comments

Ask HN: Have you ever regretted open-sourcing something?

114•paulwilsonn•3d ago•148 comments

Quantum machine learning via vector embeddings

https://arxiv.org/abs/2508.00024
9•adbabdadb•2h ago•0 comments

Kyber (YC W23) is hiring enterprise account executives

https://www.ycombinator.com/companies/kyber/jobs/6RvaAVR-enterprise-account-executive-ae
1•asontha•4h ago

uBlock Origin Lite now available for Safari

https://apps.apple.com/app/ublock-origin-lite/id6745342698
964•Jiahang•16h ago•383 comments

Build Your Own Lisp

https://www.buildyourownlisp.com/
219•lemonberry•13h ago•58 comments

Show HN: Stagewise (YC S25) – Front end coding agent for existing codebases

https://github.com/stagewise-io/stagewise
32•juliangoetze•10h ago•34 comments

US reportedly forcing TSMC to buy 49% stake in Intel to secure tariff relief

https://www.notebookcheck.net/Desperate-measures-to-save-Intel-US-reportedly-forcing-TSMC-to-buy-49-stake-in-Intel-to-secure-tariff-relief-for-Taiwan.1079424.0.html
296•voxadam•7h ago•347 comments

Los Alamos is capturing images of explosions at 7 millionths of a second

https://www.lanl.gov/media/publications/1663/dynamics-of-dynamic-imaging
104•LAsteNERD•10h ago•85 comments

Injecting Java from native libraries on Android

https://octet-stream.net/b/scb/2025-08-03-injecting-java-from-native-libraries-on-android.html
4•todsacerdoti•2d ago•0 comments

Cow vs. Water Buffalo Mozzarella

http://itscheese.com/reviews/mozzarella
19•indigodaddy•3d ago•19 comments

Under the Hood of AFD.sys Part 1: Investigating Undocumented Interfaces

https://leftarcode.com/posts/afd-reverse-engineering-part1/
24•omegadev•2d ago•6 comments

AI is propping up the US economy

https://www.bloodinthemachine.com/p/the-ai-bubble-is-so-big-its-propping
114•mempko•6h ago•135 comments

Cannibal Modernity: Oswald de Andrade's Manifesto Antropófago (1928)

https://publicdomainreview.org/collection/manifesto-antropofago/
20•Thevet•2d ago•3 comments

Tell HN: Anthropic expires paid credits after a year

177•maytc•23h ago•89 comments

No Comment (2010)

https://prog21.dadgum.com/57.html
60•ColinWright•10h ago•50 comments

Eleven Music

https://elevenlabs.io/blog/eleven-music-is-here
164•meetpateltech•9h ago•206 comments

Apache ECharts 6

https://echarts.apache.org/handbook/en/basics/release-note/v6-feature/
261•makepanic•18h ago•30 comments

The mystery of Winston Churchill's dead platypus was finally solved

https://www.bbc.com/news/articles/cglzl1ez283o
43•benbreen•2d ago•8 comments

GitHub pull requests were down

https://www.githubstatus.com/incidents/6swp0zf7lk8h
113•lr0•9h ago•151 comments

Consider using Zstandard and/or LZ4 instead of Deflate

https://github.com/w3c/png/issues/39
127•marklit•8h ago

Comments

zX41ZdbW•7h ago
Very reasonable.

I've recently experimented with methods of serving bitmaps out of the database in my project[1]. One option was to generate PNG on the fly, but simply outputting an array of pixel color values over HTTP with Content-Encoding: zstd won out over PNG.

Combined with 2D delta encoding as in PNG, it would be even better.

[1] https://adsb.exposed/
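
A minimal sketch of that approach, assuming the third-party python-zstandard package; the tile renderer, endpoint, and sizes are hypothetical stand-ins, not adsb.exposed's actual code:

  import zstandard
  from http.server import BaseHTTPRequestHandler, HTTPServer

  WIDTH, HEIGHT = 256, 256  # hypothetical tile size

  def render_tile() -> bytes:
      # Stand-in for the real renderer: a flat array of RGBA pixel values.
      return bytes(4 * WIDTH * HEIGHT)

  class TileHandler(BaseHTTPRequestHandler):
      def do_GET(self):
          raw = render_tile()
          # Compress the raw pixels instead of encoding a PNG; a client that
          # advertises zstd in Accept-Encoding decompresses transparently.
          body = zstandard.ZstdCompressor(level=3).compress(raw)
          self.send_response(200)
          self.send_header("Content-Type", "application/octet-stream")
          self.send_header("Content-Encoding", "zstd")
          self.send_header("Content-Length", str(len(body)))
          self.end_headers()
          self.wfile.write(body)

  if __name__ == "__main__":
      HTTPServer(("127.0.0.1", 8000), TileHandler).serve_forever()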

privatelypublic•7h ago
Does Deflate lead the pack in any metric at all anymore? The only one I can think of is compression on extremely low-spec hardware (microcontrollers).
adgjlsfhk1•7h ago
Even there, LZ4 is probably better.
hinkley•5h ago
You think LZ4 is more portable than zlib? I'm gonna need some citations on that.

zlib is 30 years old, according to Wikipedia. And that's technically wrong since 'zlib' was factored out of gzip (nearly 33 years old) for use in libpng, which is also 30 years old.

adgjlsfhk1•4h ago
Not more portable, but probably faster in resource-constrained environments.
duskwuff•4h ago
A basic LZ4 decompressor is on the order of a few dozen lines of code. It's exceptionally easy to implement.
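
For a sense of scale, a decoder for the raw LZ4 block format (no frame header or checksums) really does fit in a few dozen lines; a sketch in Python:

  def lz4_decompress_block(src: bytes) -> bytes:
      """Decode one raw LZ4 block (no frame header, no checksums)."""
      out = bytearray()
      i = 0
      while i < len(src):
          token = src[i]; i += 1
          # Literal length: token's high nibble, extended while bytes are 255.
          lit_len = token >> 4
          if lit_len == 15:
              while True:
                  b = src[i]; i += 1
                  lit_len += b
                  if b != 255:
                      break
          out += src[i:i + lit_len]; i += lit_len
          if i >= len(src):
              break  # the last sequence contains literals only
          # Match: 2-byte little-endian offset, then length (low nibble + 4).
          offset = src[i] | (src[i + 1] << 8); i += 2
          match_len = (token & 15) + 4
          if match_len == 19:
              while True:
                  b = src[i]; i += 1
                  match_len += b
                  if b != 255:
                      break
          # Copy byte by byte: a match may overlap its own output.
          pos = len(out) - offset
          for _ in range(match_len):
              out.append(out[pos]); pos += 1
      return bytes(out)
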
JoshTriplett•7h ago
The only metric deflate leads on is widespread support. By any other metric, it has been superseded.
atiedebee•6h ago
I'd assume memory usage as well, because it has a tiny context window compared to zstd
JoshTriplett•6h ago
You can change the context window of zstd if you want. But yes, the default context window size for zstd is 8 MB, versus Deflate's 32 kB.
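
For illustration, with the third-party python-zstandard bindings the window can be capped at Deflate's 32 kB (a sketch; the level and sample data are arbitrary):

  import zstandard

  # Cap the match window at 2**15 = 32 KiB, the same as Deflate's,
  # instead of zstd's much larger default.
  params = zstandard.ZstdCompressionParameters.from_level(3, window_log=15)
  cctx = zstandard.ZstdCompressor(compression_params=params)

  data = b"example data " * 10_000
  compressed = cctx.compress(data)

  # A decoder now needs at most 32 KiB of history in memory.
  assert zstandard.ZstdDecompressor().decompress(compressed) == data
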
arp242•7h ago
Comparison of "zpng" (PNG with zstd) and lossless WebP against current PNG, from https://github.com/WangXuan95/Image-Compression-Benchmark:

  Compressed format     Compressed size (bytes)   Compress time   Decompress time
  WEBP (lossless m5)    1,475,908,700             1,112           49
  WEBP (lossless m1)    1,496,478,650               720           37
  ZPNG (-19)            1,703,197,687             1,529           20
  ZPNG                  1,755,786,378                26           24

  PNG (optipng -o5)     1,899,273,578            27,680           26
  PNG (optipng -o2)     1,905,215,734             4,395           27
  PNG (optimize=True)   1,935,713,540             1,120           29
  PNG (optimize=False)  2,003,016,524               335           34
Doesn't really seem worth it? It doesn't compress better, and it's only slightly faster at decompression.
bobmcnamara•7h ago
Am I reading those numbers right? That's like 25x faster compression than WEBP m1; there's probably a use case for that.
arp242•7h ago
The numbers seem small enough that it will rarely matter, but I suppose there might be a use case somewhere?

But let's be real here: this is basically just a new image format. With more code to maintain, fresh new exciting zero-days, and all of that. You need a strong use case to justify that, and "already fast encode is now faster" is probably not it.

scott_w•4h ago
I don’t think it’s quite as bad, though? It’s using a known compression library that (from reading other comments) has seen use and testing. The rest of PNG would remain unchanged, since the compression method slots in as a replaceable component.

I know it needs to be battle tested as a single entity but it’s not the same as writing a new image format from scratch.

realityking•3h ago
Considering both zstandard and PNG are already web facing technologies, would the combination of both really increase the attack surface?
stephencanon•7h ago
"Only slightly faster in decompression time."

m5 vs -19 is nearly 2.5x faster to decompress; given that most data is decompressed many, many more times than it is compressed (often thousands or millions of times more, often by devices running on small batteries), that's an enormous win, not "only slightly faster".

The way in which it might not be worth it is the larger size, which is a real drawback.

arp242•6h ago
The difference is barely noticeable in real-world cases, in terms of performance or battery. Decoding images is a small part of loading an entire webpage from the internet. And transferring data isn't free either, so any benefits need to be offset against the larger file size and increased network usage.
fmbb•6h ago
Win how?

More efficiency will inevitably only lead to increased usage of the CPU and in turn batteries draining faster.

https://en.wikipedia.org/wiki/Jevons_paradox

hcs•6h ago
So someone is going to load 2.5x as many images because they can be decoded 2.5x faster? The paradox isn't a law of physics; it's an interesting observation about markets. (If this was a joke, it was too subtle for me.)
snickerdoodle12•6h ago
Might as well just shoot yourself if that's how you look at improvements. The only way to do something good is to stop existing. (This is a general statement, not aimed at you or anyone in particular.)
m463•2h ago
You have to do the math: do you have more bandwidth, storage, or CPU?

Not related to images, but I remember compressing packages of executables and zstd was a clear winner over other compression standards.

Some compression algorithms can run in parallel, and on a system with lots of CPUs that can be a big factor.

out_of_protocol•3h ago
Is WebP really lossless here? As far as I remember it's capped at 4:2:0 and can't do 4:4:4 without losing some of the color data.
e-topy•7h ago
Instead of using a new PNG standard, I'd still rather use JPEG XL, just because it has progressive decoding. And, you know, while looking like PNG, being as small as WebP, supporting HDR and animations, and having even faster decoding speed.

https://dennisforbes.ca/articles/jpegxl_just_won_the_image_w...

jchw•6h ago
JPEG XL definitely has advantages over PNG, but there is one serious, seemingly insurmountable obstacle:

https://caniuse.com/jpegxl

Nothing really supports it. The latest Safari at least supports it without a feature flag, but it doesn't support JPEG XL animations.

To be fair, nothing supports a theoretical PNG with Zstandard compression either. While that would be an obstacle to using PNG with Zstandard for a while, I kinda suspect it wouldn't be that long of a wait because many things that support PNG today also support Zstandard anyways, so it's not a huge leap for them to add Zstandard support to their PNG codecs. Adding JPEG-XL support is a relatively bigger ticket that has struggled to cross the finish line.

The thing I'm really surprised about is that you still can't use arithmetic coding with JPEG. I think the original reason was patents, but I don't think there have been active patents on it in years now.

bawolff•6h ago
> The thing I'm really surprised about is that you still can't use arithmetic coding with JPEG.

I was under the impression libjpeg added support in 2009 (in v7). I'd assume most things support it by now.

jchw•6h ago
Believe it or not, last I checked, many browsers and some other software (file managers, etc.) still couldn't do anything with JPEG files that have arithmetic coding. Apparently, although I haven't tried this myself, Adobe Photoshop also specifically doesn't support it.
superjan•6h ago
Arithmetic coding decodes 1 bit at a time, usually in such a way that you can’t do two bits or more with SIMD instructions. So it will be slow and energy inefficient.
adgjlsfhk1•4h ago
This isn't necessarily true: zstd uses ANS (asymmetric numeral systems), a type of arithmetic coding that is very efficient to decode.
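
A toy sketch of the rANS flavor of ANS, using Python bignums to sidestep renormalization (real coders such as zstd's FSE work in fixed-width registers with table lookups, but the state update is the same): decoding costs one div/mod pair per symbol rather than a bit-by-bit loop.

  FREQ = {"a": 3, "b": 1}              # symbol frequencies
  CUM = {"a": 0, "b": 3}               # cumulative frequencies
  M = sum(FREQ.values())               # total = 4

  def encode(symbols):
      x = 1
      for s in reversed(symbols):      # rANS is LIFO: encode in reverse
          f, c = FREQ[s], CUM[s]
          x = (x // f) * M + c + (x % f)
      return x

  def decode(x, n):
      out = []
      for _ in range(n):
          slot = x % M                 # one mod selects the symbol...
          s = "a" if slot < CUM["b"] else "b"
          f, c = FREQ[s], CUM[s]
          x = f * (x // M) + (slot - c)  # ...one div restores the state
          out.append(s)
      return "".join(out)

  msg = "aababaaa"
  assert decode(encode(msg), len(msg)) == msg
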
kps•6h ago
> Nothing really supports it.

Everything supports it, except web browsers.

jchw•6h ago
JPEG-XL is supported by a lot of the most important parts of the ecosystem (image editors and the major desktop operating systems) but it is a long way away from "everything". Browsers are the most major omission, but given their relative importance here it is not a small one. JPEG-XL is dead in the water until that problem can be resolved.

If Firefox is anything to go off of, the most rational explanation here seems to just be that adding a >100,000 line multi-threaded C++ codebase as a dependency for something that parses untrusted user inputs in a critical context like a web browser is undesirable at this point in the game (other codecs remain a liability but at least have seen extensive battle-testing and fuzzing over the years.) I reckon this is probably the main reason why there has been limited adoption so far. Apple seems not to mind too much, but I am guessing they've just put so much into sandboxing Webkit and image codecs already that they are relatively less concerned with whether or not there are memory safety issues in the codec... but that's just a guess.

swiftcoder•5h ago
Apple also adopted JPEG-XL across their entire software stack. It's supported throughout the OS and by pretty much every application they develop, so I'm guessing they've sunk a fair bit of time/money into hardening their codec.
floxy•3h ago
> >100,000 line multi-threaded C++

W. T. F. Yeah, if this is the state of the reference implementation, then I'm against JPEG-XL just on moral grounds.

lifthrasiir•2h ago
Only because it's both the reference encoder and decoder, and the encoder tends to be a lot more complex than the decoder. (Source: I have developed a partial JPEG XL decoder in the past, and it was <10K lines of C code.)
bravetraveler•2h ago
> reference

They aren't going to give you two problems to solve/consider: clever code and novel design.

Zardoz84•6h ago
You can use a polyfill.
greenavocado•4h ago
That's because people have allowed the accumulation of power and control by Big Tech. Features in and capabilities of end-user operating systems and browsers are gatekept by a handful of people in Big Tech. There is no free market there. Winners are picked by politics, not merit. Switching costs are extreme due to vendor lock-in and carefully engineered friction.

The justification for WebP in Chrome over JPEG-XL was pure hand waving nonsense not technical merit. The reality is they would not dare cede any control or influence to the JPEG-XL working group.

Hell, the EU is CONSIDERING mandatory attestation driven by whitelisted, signed phone firmware for certain essential activities. Freedom of choice is an illusion.

google234123•2h ago
WebP is a lot older than JPEG XL, right?
01HNNWZ0MV43FF•2h ago
It's also because supporting features is work that takes time away from bug fixes and other features.
IshKebab•4h ago
> I kinda suspect it wouldn't be that long of a wait

Yeah... guess again. It took Chrome 13 years to support animated PNG - the last major change to PNG.

edoceo•4h ago
Maybe they were focused on WebP?
jchw•4h ago
APNG wasn't part of PNG itself until very recently, so I'd argue it's kind-of neither here nor there.
Scaevolus•4h ago
Every new image codec faces this challenge. PNG + Zstandard would look similar. The ones that succeeded have managed it by piggybacking off a video codec, like https://caniuse.com/avif.
jchw•4h ago
Why would PNG + ZStandard have a harder time than AVIF? In practice, AVIF needs more new code than PNG + ZStandard would.
junon•4h ago
I'm just guessing, but bumping a library version to include new code versus integrating a separate library might be the differentiating factor.
jchw•4h ago
The zstd library is already included by most major browsers since it is a supported content encoding. Though I guess that does leave out Safari, but Safari should probably support Zstd for that, too. (I would've preferred that over Brotli, but oh well.)
breve•42m ago
> but there is one serious seemingly insurmountable obstacle

It can be surmounted with WebAssembly: https://github.com/niutech/jxl.js/

Single thread demo: https://niutech.github.io/jxl.js/

Multithread demo: https://niutech.github.io/jxl.js/multithread/

fluidcruft•2m ago
As I understand it JPEG XL has a lot of interest in medical imaging and is coming to DICOM. After it's in DICOM, whichever browser supports it best will rule hospitals.
bawolff•6h ago
Doesn't PNG have progressive decoding, i.e. the Adam7 algorithm?
layer8•6h ago
It does, using Adam7: https://en.wikipedia.org/wiki/Adam7_algorithm

The recently released PNG 3 also supports HDR and animations: https://www.w3.org/TR/png-3/

bawolff•6h ago
> The recently released PNG 3 also supports HDR and animations: https://www.w3.org/TR/png-3/

APNG isn't recent so much as the specs were merged together. APNG will be 21 years old in a few weeks.

layer8•6h ago
True, but https://news.ycombinator.com/item?id=44802079 presumably holds the opinion that APNG != PNG, so I mentioned PNG 3 to counteract that. Animated PNGs being officially PNG is recent.
duskwuff•5h ago
Adam7 is interlacing, not progressive decoding (i.e. it cannot be used to selectively decode a part of the image). It also interacts extremely poorly with compression; there is no good reason to ever use it.
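
For reference, the seven Adam7 passes claim the pixels of each 8x8 tile in this fixed pattern; pixels that are adjacent within a pass can be far apart in the image, which is part of why interlacing hurts compression:

  # Which Adam7 pass (1-7) owns each pixel of an 8x8 tile (per the PNG spec).
  ADAM7 = [
      [1, 6, 4, 6, 2, 6, 4, 6],
      [7, 7, 7, 7, 7, 7, 7, 7],
      [5, 6, 5, 6, 5, 6, 5, 6],
      [7, 7, 7, 7, 7, 7, 7, 7],
      [3, 6, 4, 6, 3, 6, 4, 6],
      [7, 7, 7, 7, 7, 7, 7, 7],
      [5, 6, 5, 6, 5, 6, 5, 6],
      [7, 7, 7, 7, 7, 7, 7, 7],
  ]
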
adzm•5h ago
Web browsers already have code in place for WebP (lossless, VP8) and AVIF (AV1, which also supports animation), as well as classic JPEG and PNG, and maybe also HEIC (HEVC/H.265)... what benefit do we get by adding yet another file format if all the use cases are already covered by the existing formats? That said, I do like JPEG-XL, but I also kind of understand the hesitation to adopt it. I imagine if Apple's push for it continues, it's just a matter of time before it's supported more broadly in Chrome etc.
Dylan16807•2h ago
Avif is cute but using that as an excuse to not add jxl is a travesty. At the time either one of those could have been added, jxl should have been the choice.

The biggest benefit is that it's actually designed as an image format. All the video offshoots have massive compromises made so they can be decoded in 15 milliseconds in hardware.

The ability to shrink old jpegs with zero generation loss is pretty good too.

bawolff•6h ago
I think there is a benefit to knowing that if you have a png file it works everywhere that supports png.

Better to make backwards-compatibility breaks entirely new formats.

encom•6h ago
(2021)

In my opinion PNG doesn't need fixing. Being ancient is a feature. Everything supports it. As much as I appreciate the nerdy exercise, PNG is fine as it is. My only gripe is that some software writes needlessly bloated files (like adding an alpha channel when it's not needed). I wish we didn't need tools like OptiPNG etc.

heinrich5991•5h ago
Most of the comments on that issue are from this year.
ori_b•4h ago
Yes. One of the best features of png is that I don't have to wonder if it's going to work somewhere. Throwing that away in favor of a bit of premature optimization seems like a big loss. Especially as this wouldn't be the only modernized image compression format out there. Why use this over, e.g., lossless webp?

I don't think I have ever noticed the decode time of a png.

willvarfar•5h ago
We ought to consider using QOI instead.

QOI often achieves compression equivalent to or better than PNG, _before_ you even compress it with something like LZ4.

Compressing QOI with something like LZ4 would generally outperform PNG.

adgjlsfhk1•2h ago
QOI has some pretty major downsides: it only supports 8-bit sRGB, and is optimized for images with 8-bit transparency. Also, the hashing it uses seems to harm compression when entropy compression is applied on top, and the focus on streaming means the algorithm can't take advantage of 2D locality.

QOI is really cool, but I think the author cut the final version of the spec too early, and intentionally closed it off to a future version with more improvements. With another year or two of development, I think it probably would have become ~10% more efficient and suitable for more use cases.

nigeltao•1h ago
> Compressing QOI with something like LZ4 would generally outperform PNG.

https://github.com/nigeltao/qoir has some numbers comparing QOIR (which is QOI-inspired-with-LZ4) vs PNG.

QOIR has better decode speed and comparable compression ratio (depending on which PNG encoder you use).

QOIR's numbers are also roughly similar to ZPNG.

HocusLocus•5h ago
The reason we have a world full of .gif today is that the .png committee rejected animation back when everyone was saying PNG would be the "GIF killer". Just sayin'. Don't hold your breath.
edoceo•4h ago
Remember this: https://burnallgifs.org/
hughw•4h ago
Related: what's the status of content negotiation? Do any browsers use it seriously, and has it been successful? If so, then why not ZPNG?
jasonthorsness•4h ago
One of the interesting features of ZStandard is the support for external dictionaries. It supports "training" a dictionary on a set of samples, of whatever size (16KiB, 64 KiB, etc.), then applying that dictionary as a separate input file for compression and decompression. This lets you compress short content much more effectively.

I doubt it would apply to PNG, because the length and content don't seem dictionary-friendly, but it would be interesting to try on some giant collection of scraped PNGs. This approach was important enough for Brotli to include a "built-in" dictionary covering HTML.
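
With the third-party python-zstandard bindings, the train/apply cycle looks roughly like this (a sketch; the samples and the 16 KiB dictionary size are arbitrary, and real training wants a much larger corpus):

  import zstandard

  # Train a dictionary on a corpus of short, similar documents...
  samples = [('{"id": %d, "name": "user%d", "active": true}' % (i, i)).encode()
             for i in range(1000)]
  dictionary = zstandard.train_dictionary(16 * 1024, samples)

  # ...then both sides apply it for compression and decompression.
  cctx = zstandard.ZstdCompressor(dict_data=dictionary)
  dctx = zstandard.ZstdDecompressor(dict_data=dictionary)

  doc = b'{"id": 1001, "name": "user1001", "active": true}'
  compressed = cctx.compress(doc)
  assert dctx.decompress(compressed) == doc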

DefineOutside•4h ago
This has been applied to Minecraft region files in a fork of Paper, a Minecraft server.

https://github.com/UltraVanilla/paper-zstd/blob/main/patches...

From the author of this patch on Discord: compression level 9 isn't practical and is too slow for a real production server, but it does show the effectiveness of zstd with a shared dictionary.

  So you start off with a 755.2 MiB world (in this test, it is a section of an existing DEFLATE-compressed world that has been lived in for a while). If you recreate its regions it will compact it down to 695.1 MiB

  You set region-file-compression=lz4 and run --recreateRegionFiles and it turns into a 998.9 MiB world. Makes sense, worse compression ratios but less CPU is what mojang documented in the changelog. Neat, but I'm confused as to what the benefits are as I/O increasingly becomes the more constrained thing nowadays. This is just a brief detour from what I'm really trying to test

  You set region-file-compression=none and it turns into a 3583.0 MiB world. The largest region file in this sample was 57 MiB

  Now, you take this world, and compress each of the region files individually using zstd -9, so that the region files are now .mca.zst files. And you get a world that is 390.2 MiB
immibis•3h ago
Note that each region file contains 1024 chunks that are designed to be (but probably aren't) accessed at random, so compressing a region file is like a solid archive with a solid block size of 1024 files.
lordpipe•41m ago
Author here -- the solution I discussed in that message isn't quite the same solution as the one linked. The `paper-zstd` repository is the one using dictionary compression on individual chunks. But in the `.mca.zst` solution I'm not using dictionaries at all. It's more like a glorified LinearPaper -- just take the region file, decompress the chunks, and recompress the entire container. It breaks random access to individual chunks, but it's great for archival or cloud storage offloading of infrequently visited parts of a MC world, which is what I'm using it for.

I don't remember the exact compression ratios for the dictionary solution in that repo, but it wasn't quite as impressive (IIRC around a 5% reduction compared to non-dictionary zstd at the same level). And the padding inherent to the region format takes away a lot of the ratio benefit right off the bat, though it may have worked better in conjunction with the PaperMC SectorFile proposal, which has less padding, or by rewriting the storage using some sort of LSM tree library that knows how to compactly store blobs of varying size. I've dropped the dictionary idea for now, but it definitely could be useful. More research is needed.

duskwuff•14m ago
> I doubt it would apply to PNG because of the length and content doesn't seem to be dictionary-friendly

Correct - I wouldn't expect this to be useful for PNG. Compression dictionaries are applicable in situations where a group of documents contain shared patterns of literal content, like snippets of HTML. This is very uncommon in PNG image data, especially since any difference in compression settings, like the use of a different color palette, or different row filtering algorithms, will make the pattern unrecognizable.

citrin_ru•3h ago
zstd is a great compression algorithm, but an important advantage of PNG (v1.2) is that implementations are available on almost all actively used operating systems and in most popular languages. The same cannot be said about zstd, which has very few implementations apart from https://github.com/facebook/zstd

I'm not even sure there are good pure Java (no JNI) or Go (without cgo) implementations of zstd. And it definitely would require more powerful hardware: some microcontrollers that can use PNG are too small for zstd.

jonathanoliver•3h ago
For Go, we've been using this library, which supports zstd: https://github.com/klauspost/compress
pornel•2h ago
The developer who asked for faster compression formats later solved the problem himself:

https://github.com/richgel999/fpng

It turns out that deflate can be much faster when implemented specifically for PNG data, instead general-purpose compression (while still remaining 100%-standard-compatible).