Sharp Bilinear Filters: Big Clean Pixels for Pixel Art

https://bumbershootsoft.wordpress.com/2025/10/11/sharp-bilinear-filters-big-clean-pixels-for-pixel-art/
42•todsacerdoti•3mo ago

Comments

jan_Inkepa•3mo ago
Huh (having scanned but not read the post in detail), interesting approach. I'm not that well-versed in this area (as a game developer, I tend to jump straight to nearest-neighbour), but hadn't come across this before. I love the pathological example of a checkerboard pattern - a very pleasing worst-case scenario, where I suspect it would just be a grey blur. However, the developer doesn't show us the equivalent for the suggested filter - systematically showing side-by-side comparisons of the different filters would be useful. I suspect the resulting artefacts would be randomly blurry lines, which could also stand out. But nice to see people thinking about these things...

Here's a related discussion on what 'pixelated' should mean from the CSS working group:

https://github.com/w3c/csswg-drafts/issues/5837

(Every so often browsers break/rename how nearest-neighbour filtering works. I hope at some point it stabilizes lol - I note that in the linked discussion nobody else cares about backwards compatibility...)

scheeseman486•3mo ago
For as long as emulators have supported shaders, I've gotten into the habit of configuring them to scale output 4x nearest neighbor and then downscale that to the display resolution using bilinear, which has roughly the same results; it gets rid of shimmering without blurring everything to a smudge. On any 1080p display with lower-resolution content it looks great, but the method starts to fall apart once you try to scale anything higher than 480p.

With a 4K display the pixel density is high enough that virtually everything looks good scaled this way, though once you go higher than SD content you're usually dealing with 720p and 1080p, both of which 2160p divides into evenly anyway.

It's surprising how often I see bad pixel art scaling given how easy it is to fix.
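The prescale-then-bilinear trick described above is easy to sketch in one dimension. This is a minimal illustration in pure Python, not anyone's actual shader; all function names here are my own:

```python
def nearest_upscale(row, factor):
    # Integer nearest-neighbour upscale: repeat each pixel `factor` times.
    return [p for p in row for _ in range(factor)]

def linear_resample(row, out_len):
    # 1-D bilinear resample with pixel-center mapping and clamp-to-edge.
    n = len(row)
    out = []
    for i in range(out_len):
        # Map the output pixel center back into input coordinates.
        x = (i + 0.5) * n / out_len - 0.5
        x = min(max(x, 0.0), n - 1)
        lo = int(x)
        hi = min(lo + 1, n - 1)
        t = x - lo
        out.append(row[lo] * (1 - t) + row[hi] * t)
    return out

def sharp_bilinear(row, scale, prescale=4):
    # "Sharp bilinear": integer nearest-neighbour prescale, then a
    # single linear resample down to the final size.
    big = nearest_upscale(row, prescale)
    return linear_resample(big, round(len(row) * scale))

# A two-pixel edge scaled 1.5x: only the seam pixel gets blended.
print([round(v, 4) for v in sharp_bilinear([0, 255], 1.5)])
# -> [0.0, 127.5, 255.0]

# At an integer factor the result is plain nearest-neighbour.
print([round(v, 4) for v in sharp_bilinear([0, 255], 3)])
# -> [0.0, 0.0, 0.0, 255.0, 255.0, 255.0]
```

The prescale keeps the bilinear footprint narrow relative to the fat pixels, so only the one output pixel straddling each source-pixel boundary gets blended; everything else stays hard, which is the "sharp" part.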

d_tr•3mo ago
Sounds like exactly the same thing since bilinear filtering in the upscaled image only has an effect near the edges of the fat pixels.
CyberDildonics•3mo ago
Downscaling using bilinear interpolation doesn't really make sense, since what you want is a weighted average of pixels to make one new pixel at the lower resolution.

Single bilinear samples can lose information and leave out pixels of the higher res image, it's essentially a worse triangle filter.

TuxSH•3mo ago
> Single bilinear samples can lose information and leave out pixels of the higher res image, it's essentially a worse triangle filter.

Can you do [A B] -> [A 0.5*(A+B) B] 1.5x upscaling with a triangle filter? (I think this is not possible, but I might be wrong).

Also triangle filter samples too many pixels and makes a blurry mess of pixel-art images/sprites/...

Linear downscaling under the assumptions of pixel-center mapping and clamp-to-edge always simplifies into a polyphase filter with position-independent coefficients using at most the current input pixel and the previous one; and integer upscaling obviously does too.

Therefore any form of "sharp bilinear" that does not use bilinear upscaling reduces into such a polyphase filter. [A B] -> [A 0.5*(A+B) B] is equivalent to 2x integer upscale -> 0.75 bilinear scale (= 1.5x of input), and works on GPUs without fragment shaders too.
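The claimed equivalence is easy to check numerically in one dimension. A minimal sketch (pure Python, helper names are my own):

```python
def repeat2(row):
    # 2x integer (nearest-neighbour) upscale.
    return [p for p in row for _ in range(2)]

def linear_resample(row, out_len):
    # 1-D bilinear resample with pixel-center mapping and clamp-to-edge.
    n = len(row)
    out = []
    for i in range(out_len):
        x = (i + 0.5) * n / out_len - 0.5
        x = min(max(x, 0.0), n - 1)
        lo = int(x)
        hi = min(lo + 1, n - 1)
        t = x - lo
        out.append(row[lo] * (1 - t) + row[hi] * t)
    return out

A, B = 10, 200
direct = [A, 0.5 * (A + B), B]             # the [A B] -> [A 0.5*(A+B) B] filter
# 2x integer upscale (2 -> 4 samples), then 0.75x bilinear (4 -> 3 samples),
# i.e. 1.5x of the original input.
two_step = linear_resample(repeat2([A, B]), 3)
print(direct)                               # [10, 105.0, 200]
print([round(v, 4) for v in two_step])      # [10.0, 105.0, 200.0] - same result
```

The middle output pixel lands exactly halfway between the duplicated A and B samples, so the bilinear tap reproduces the 0.5*(A+B) coefficient directly.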

CyberDildonics•3mo ago
I think you're confusing a few things.

First, upscaling with a filter kernel (weighted average) doesn't make as much sense because you aren't weighting multiple pixels to make a single pixel, you are interpolating, so "upscaling with a triangle filter" isn't something practical.

Second, lots of signal processing techniques that can technically be applied to pixels on a row-by-row basis don't work well visually and don't make a lot of sense when trying to get useful results. This is why things like the Fourier transform are not the backbone of image processing.

Polyphase filtering doesn't make any sense here, you have access to all the data verbatim, and you want to use it all when you upscale or downsample. There is no compression and no analog signal that needs to be sampled.

Third, any filter kernel is going to use the pixels under its width/support. Using 'too many pixels' isn't something that makes sense and isn't the problem. How they are weighted when scaling an image down is what matters. If you want a sharper filter you can always use one. What I actually said was that linearly interpolating samples to downsample an image doesn't make sense and is like using a triangle filter or half of a triangle filter.

This all seems to be a workaround for what people probably actually want if they are trying to get some sharpness, which is something like a bilateral filter, one that weights similar pixels more.

TuxSH•3mo ago
You are correct in assuming that I'm not as familiar with these topics as you seem to be.

> Polyphase filtering (...) There is no compression and no analog signal that needs to be sampled.

The term "polyphase scaling" is used at least by AMD: https://docs.amd.com/r/en-US/pg325-v-multi-scaler/Polyphase-... , that's why I used the term.

> What I actually said was that linear interpolating samples to downsample an image doesn't make sense and is like using a triangle filter or half of a triangle filter.

In isolation yes it doesn't make sense, but linear downsampling is a mere implementation detail here: "4x nearest neighbor and then downscaling that to the display resolution using bilinear" is an upscaling filter (unless the output resolution is lower) that doesn't discard any pixel of the initial input.

CyberDildonics•3mo ago
> The term "polyphase scaling" is used at least by AMD: https://docs.amd.com/r/en-US/pg325-v-multi-scaler/Polyphase-... , that's why I used the term.

This looks like a custom video scaler, and the context is custom filtering when one image is only slightly different in dimensions from another.

How does that apply here?

> In isolation yes it doesn't make sense, but linear downsampling is a mere implementation detail here

There is no such thing as "linear downsampling". There are box filters, triangle filters and other weighted averages, then there are more sophisticated weighting schemes that take into account more than just distance.

> that doesn't discard any pixel of the initial input.

It creates more data then discards it by sampling too sparsely, but the sparse samples get linearly interpolated so you don't notice the aliasing as much. This is not a technically sound way to upscale an image; it can only seem to work if you compare it to a poor enough example.

If you want to see a better example, look at bilateral upscaling, which weights similar pixels more heavily when interpolating and should keep edges sharper. You can probably see this in motion with the right settings on a recent TV.

scheeseman486•3mo ago
I don't doubt there are ways that are more technically correct or efficient, but given the limited configuration options a lot of emulators offer it works well enough.
ack_complete•3mo ago
While true in the general case, in this case the bilinear downscaling pass is being applied to a signal already previously upsampled using a known specific filter (box filter). Aliasing is therefore more limited and controlled.
CyberDildonics•3mo ago
A filter is a weighted average and a box filter treats all pixels the same. Upscaling with a box filter barely makes sense because it will either end up sampling a single pixel (impulse filter) or blurring the image even more than normal upscaling.

"Bilinear downscaling" also doesn't make sense because scaling an image down means doing a weighted average of the multiple pixels going into a single pixel. Pixels being weighted linearly based on distance would be a triangle filter.

> Aliasing is therefore more limited and controlled.

Aliasing doesn't need to happen at all with a reasonable filter width. If someone is interpolating between four pixels, that's a triangle filter with four samples.
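The distinction being drawn here - a weighted average of all covered pixels versus a single sparse bilinear tap - shows up clearly on an impulse. A 1-D sketch in pure Python (function names are my own, not from the thread):

```python
def box_downscale(row, factor):
    # Area average: each output pixel is the mean of `factor` input
    # pixels, so every input sample contributes to the result.
    return [sum(row[i:i + factor]) / factor
            for i in range(0, len(row), factor)]

def sparse_bilinear_downscale(row, factor):
    # One bilinear tap per output pixel: only the two inputs nearest
    # the sample point contribute; the other `factor - 2` are dropped.
    n = len(row)
    out = []
    for i in range(n // factor):
        x = (i + 0.5) * factor - 0.5
        lo = int(x)
        hi = min(lo + 1, n - 1)
        t = x - lo
        out.append(row[lo] * (1 - t) + row[hi] * t)
    return out

impulse = [0, 0, 0, 255, 0, 0, 0, 0]   # a single bright pixel
print(box_downscale(impulse, 4))              # [63.75, 0.0]: energy preserved
print(sparse_bilinear_downscale(impulse, 4))  # [0.0, 0.0]: bright pixel lost
```

The sparse tap at 4x reduction falls between two dark pixels and misses the bright one entirely, which is the information loss being described; the box filter's full-support average cannot drop it.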

TuxSH•3mo ago
> that to the display resolution using bilinear

On that topic, Pillow's so-called bilinear isn't actually bilinear interpolation [1][2]; same with Magick IIRC (but Magick at least gives you -define filter:blur=<value> to counteract this)

[1] https://pillow.readthedocs.io/en/stable/releasenotes/2.7.0.h...

[2] https://github.com/python-pillow/Pillow/blob/main/src/libIma...

smallerize•3mo ago
Ok but what does that image at the top look like with this new filter applied?
tobr•3mo ago
An easy way to do this that I’ve used when resizing images in photoshop is to first scale it to the closest larger integer scaling factor of the target output using nearest neighbor, and then scale that down to the final result with bilinear or bicubic.