
The radix 2^51 trick (2017)

https://www.chosenplaintext.ca/articles/radix-2-51-trick.html
229•blobcode•7h ago•35 comments

Radio Astronomy Software Defined Radio (Rasdr)

https://radio-astronomy.org/rasdr
24•zeristor•2h ago•4 comments

Tokenization for language modeling: BPE vs. Unigram Language Modeling (2020)

https://ndingwall.github.io/blog/tokenization
16•phewlink•2h ago•0 comments

Bridged Indexes in OrioleDB: architecture, internals and everyday use?

https://www.orioledb.com/blog/orioledb-bridged-indexes
16•pella•1h ago•1 comment

Atomics and Concurrency

https://redixhumayun.github.io/systems/2024/01/03/atomics-and-concurrency.html
20•LAC-Tech•2d ago•1 comment

What Happens When AI-Generated Lies Are More Compelling Than the Truth?

https://behavioralscientist.org/what-happens-when-ai-generated-lies-are-more-compelling-than-the-truth/
13•the-mitr•1h ago•4 comments

Turn a Tesla into a mapping vehicle with Mapillary

https://blog.mapillary.com/update/2020/12/09/map-with-your-tesla.html
42•faebi•1d ago•15 comments

Practical SDR: Getting started with software-defined radio

https://nostarch.com/practical-sdr
164•teleforce•10h ago•43 comments

Triangle splatting: radiance fields represented by triangles

https://trianglesplatting.github.io/
93•ath92•7h ago•38 comments

WeatherStar 4000+: Weather Channel Simulator

https://weatherstar.netbymatt.com/
622•adam_gyroscope•19h ago•115 comments

FLUX.1 Kontext

https://bfl.ai/models/flux-kontext
395•minimaxir•17h ago•99 comments

Why do we get earworms?

https://theneuroscienceofeverydaylife.substack.com/p/mahna-mahna-do-doo-be-do-do-why-do
8•lentoutcry•2h ago•8 comments

Show HN: MCP Server SDK in Bash (~250 lines, zero runtime)

https://github.com/muthuishere/mcp-server-bash-sdk
74•muthuishere•7h ago•20 comments

Printing metal on glass with lasers [video]

https://www.youtube.com/watch?v=J0NNO91WyXM
6•surprisetalk•2d ago•1 comment

Dr John C. Clark, a scientist who disarmed atomic bombs twice

https://daxe.substack.com/p/disarming-an-atomic-bomb-is-the-worst
98•vinnyglennon•2d ago•63 comments

OpenBAO (Vault open-source fork) Namespaces

https://openbao.org/blog/namespaces-announcement/
44•gslin•8h ago•19 comments

The atmospheric memory that feeds billions of people: Monsoon rainfall mechanism

https://phys.org/news/2025-05-atmospheric-memory-billions-people-monsoon.html
28•PaulHoule•2d ago•6 comments

Buttplug MCP

https://github.com/ConAcademy/buttplug-mcp
184•surrTurr•4h ago•98 comments

Show HN: I wrote a modern Command Line Handbook

https://commandline.stribny.name/
353•petr25102018•20h ago•92 comments

Smallest Possible Files

https://github.com/mathiasbynens/small
43•yread•2d ago•16 comments

Player Piano Rolls

https://omeka-s.library.illinois.edu/s/MPAL/page/player-piano-rolls-landing
46•brudgers•8h ago•30 comments

How to Do Ambitious Research in the Modern Era [video]

https://www.youtube.com/watch?v=w7DVlI_Ztq8
32•surprisetalk•6h ago•1 comment

Superauthenticity: Computer Game Aspect Ratios

https://datadrivengamer.blogspot.com/2025/05/superauthenticity-computer-game-aspect.html
15•msephton•3d ago•5 comments

Show HN: templUI – The UI Kit for templ (CLI-based, like shadcn/UI)

https://templui.io/
37•axadrn•7h ago•20 comments

Show HN: Donut Browser, a Browser Orchestrator

https://donutbrowser.com/
43•andrewzeno•7h ago•20 comments

Making C and Python Talk to Each Other

https://leetarxiv.substack.com/p/making-c-and-python-talk-to-each
121•muragekibicho•3d ago•75 comments

Why is everybody knitting chickens?

https://ironicsans.ghost.io/why-is-everybody-knitting-chickens/
139•mooreds•2d ago•104 comments

I'm starting a social club to solve the male loneliness epidemic

https://wave3.social
215•nswizzle31•11h ago•406 comments

Notes on Tunisia

https://mattlakeman.org/2025/05/29/notes-on-tunisia/
85•returningfory2•14h ago•41 comments

Human coders are still better than LLMs

https://antirez.com/news/153
528•longwave•18h ago•613 comments

Civil War in 3D: Stereographs from the New-York Historical Society (2015)

https://www.nyhistory.org/blogs/civil-war-in-3d-stereographs-from-the-new-york-historical-society
49•LorenDB•19h ago

Comments

ramesh31•19h ago
You can see the effect in these images directly without a device, by simply crossing your eyes and focusing on the third central image that appears, similar to those 3D optical illusion books: https://youtu.be/zBa-bCxsZDk
JKCalhoun•18h ago
The cross-eyed method requires the images be swapped left-for-right.
kazinator•17h ago
Not sure why you are downvoted; that is correct.
kazinator•17h ago
This gallery presents the original stereograms in their stare-into-distance configuration (left image goes with left eye, right with right), not the cross-eyed configuration (left image goes with right eye and vice versa).
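The swap described above is mechanical: cut the side-by-side scan down the middle and reattach the halves in the opposite order. A minimal sketch using Pillow (a third-party library; the function name and file paths are placeholders, not anything from the gallery):

```python
from PIL import Image

def parallel_to_crossed(path_in: str, path_out: str) -> None:
    """Swap the two halves of a side-by-side stereograph so a
    parallel-view pair (left eye's image on the left) becomes a
    cross-view pair (left eye's image on the right)."""
    img = Image.open(path_in)
    w, h = img.size
    half = w // 2
    left = img.crop((0, 0, half, h))
    right = img.crop((half, 0, w, h))
    out = Image.new(img.mode, (w, h))
    out.paste(right, (0, 0))    # right-eye image now on the left
    out.paste(left, (half, 0))  # left-eye image now on the right
    out.save(path_out)
```

Running the same function twice restores the original parallel-view ordering.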
JeremyHerrman•19h ago
Is it just me or are some of these examples not actually stereo image pairs?

I'm just crossing my eyes to see the "negative" depth image but some like "McLean’s House" and "Lincoln visits General McClellan at Antietam" don't appear to have any depth changes between them.

JKCalhoun•18h ago
You need to swap the left and right images to use the cross-eyed method on these. You can try downloading one as an image and using an app like Preview to Flip Horizontal (that will work).

Otherwise you're seeing a kind of inverse stereo image.

(EDIT: Having said that, I tried a few of the images and the stereo effect is subtle. The soldier on the horse — I was not even able to get that to "snap" for me. I am not great with cross-eyed stereo though.)

JeremyHerrman•17h ago
Yes, understood that the cross-eyed method inverts the depth. My point was that some of the image pairs are from the exact same perspective, so there is no stereo depth whether you view them wall-eyed or cross-eyed.
JKCalhoun•14h ago
Yeah, if there is depth, it was pretty subtle on the few I got to work.
kazinator•17h ago
These images were prepared for insertion into a stereogram in which the left eye looks at the left image and right eye looks at the right image, through a magnifying lens. When viewing with the naked eye, you must stare past the images into the distance to get them to converge that way.
JeremyHerrman•17h ago
Thanks, I understand how stereograms work and have quite a few of these IRL. I use cross-eyed method to quickly view them (albeit inverted depth) when shown on screen.

I've tried to show my point in these videos: when the two images are overlaid and crossfaded, there is basically no difference between them. https://imgur.com/a/RMy3QA3

kazinator•17h ago
I agree that particular image is a dud; I was not able to perceive any depth.

The creator mistakenly used the same image twice.

The two men in a tent image is likewise a dud. If we look at the pole at the tent entrance, there is no difference in parallax between that and objects at the back wall.

The Abe Lincoln doesn't pop out much for me.

The dead soldiers in the field also seems to be identical images.

The clearly genuine ones are the horse-drawn carriage in the forest, and the horseman in front of the cannon.

JeremyHerrman•17h ago
Here are some videos trying to show what I mean. I overlaid the two images and crossfaded between them. Aside from some minor distortion, I don't see any of the major differences normally found between stereo pairs.

https://imgur.com/a/RMy3QA3

saddat•18h ago
Create two pictures from it and use https://huggingface.co/spaces/cavargas10/TRELLIS-Multiple3D
kazinator•17h ago
For casual viewing with the unaided eye, you want to present stereograms in cross-your-eyes order not stare-into-distance order.

Most people are not able to cause their eyes to diverge, so the scale of images in a stare-into-distance stereogram is limited by the interocular distance.

In cross-eye configuration, larger images can be used.

(Of course, the use of magnification in stereoscopes relieves the issue, as well as making it easier for the eyes to focus, since the magnified virtual images appear farther away. Viewing stare-into-distance stereograms requires the eyes to believe they are looking far away due to the parallel gaze, while simultaneously focusing near on the images; magnification brings the images farther out.)

LorenDB•17h ago
I personally find the cross-eyed type to be nearly impossible, while the parallel type is pretty easy for me. So I think it really depends on the person. Additionally, most stereograms I've seen (e.g. coffee-table books) have been parallel type.
kazinator•16h ago
The parallel types are also very easy for me, but they are always small.

If the spacing between them is wider than my inter-ocular distance, I find them impossible to converge.

I made stereograms in the past and wanted to see larger images with the naked eye, so I had no choice but to swap the images and cross my eyes.

6yyyyyy•8h ago
I flipped them all, enjoy:

https://imgur.com/a/OOiQ5AK

(FYI: -vf stereo3d=in=sbsl:out=sbsr in ffmpeg.)
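Spelled out as a full command, that filter converts a left-first side-by-side frame (sbsl) into a right-first one (sbsr), i.e. parallel-view to cross-view. A sketch that first synthesizes a test frame so it runs standalone (file names are arbitrary placeholders):

```shell
# Generate a tiny side-by-side test frame as a stand-in for a scanned stereograph.
ffmpeg -y -f lavfi -i testsrc=size=64x32:rate=1 -frames:v 1 parallel.png
# sbsl = side by side, left image first; sbsr = right image first.
# Swapping the halves turns a parallel-view pair into a cross-view one.
ffmpeg -y -i parallel.png -vf stereo3d=in=sbsl:out=sbsr crossed.png
```

The same filter works on video, so a whole batch or a slideshow can be converted in one pass.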

ge96•15h ago
For an example that works, see this squirrel (sorry, Reddit link):

https://www.reddit.com/media?url=https%3A%2F%2Fpreview.redd....

Crazy, but I feel sick now, ha. I had a VR headset before and I'd get super sick trying to play FO4; VRChat wasn't bad.

bredren•15h ago
Would be cool to get these converted into spatial photos for Vision Pro.
mdswanson•14h ago
Not too many steps away from this: https://blog.mikeswanson.com/spatial/