frontpage.

Start all of your commands with a comma

https://rhodesmill.org/brandon/2009/commands-with-comma/
51•theblazehen•2d ago•10 comments

OpenCiv3: Open-source, cross-platform reimagining of Civilization III

https://openciv3.org/
636•klaussilveira•13h ago•188 comments

The Waymo World Model

https://waymo.com/blog/2026/02/the-waymo-world-model-a-new-frontier-for-autonomous-driving-simula...
935•xnx•18h ago•549 comments

What Is Ruliology?

https://writings.stephenwolfram.com/2026/01/what-is-ruliology/
35•helloplanets•4d ago•30 comments

How we made geo joins 400× faster with H3 indexes

https://floedb.ai/blog/how-we-made-geo-joins-400-faster-with-h3-indexes
113•matheusalmeida•1d ago•28 comments

Jeffrey Snover: "Welcome to the Room"

https://www.jsnover.com/blog/2026/02/01/welcome-to-the-room/
13•kaonwarb•3d ago•11 comments

Unseen Footage of Atari Battlezone Arcade Cabinet Production

https://arcadeblogger.com/2026/02/02/unseen-footage-of-atari-battlezone-cabinet-production/
44•videotopia•4d ago•1 comment

Show HN: Look Ma, No Linux: Shell, App Installer, Vi, Cc on ESP32-S3 / BreezyBox

https://github.com/valdanylchuk/breezydemo
222•isitcontent•13h ago•25 comments

Monty: A minimal, secure Python interpreter written in Rust for use by AI

https://github.com/pydantic/monty
214•dmpetrov•13h ago•106 comments

Show HN: I spent 4 years building a UI design tool with only the features I use

https://vecti.com
323•vecti•15h ago•142 comments

Sheldon Brown's Bicycle Technical Info

https://www.sheldonbrown.com/
373•ostacke•19h ago•94 comments

Microsoft open-sources LiteBox, a security-focused library OS

https://github.com/microsoft/litebox
359•aktau•19h ago•181 comments

Hackers (1995) Animated Experience

https://hackers-1995.vercel.app/
478•todsacerdoti•21h ago•237 comments

Show HN: If you lose your memory, how to regain access to your computer?

https://eljojo.github.io/rememory/
278•eljojo•16h ago•165 comments

An Update on Heroku

https://www.heroku.com/blog/an-update-on-heroku/
407•lstoll•19h ago•273 comments

Dark Alley Mathematics

https://blog.szczepan.org/blog/three-points/
85•quibono•4d ago•21 comments

PC Floppy Copy Protection: Vault Prolok

https://martypc.blogspot.com/2024/09/pc-floppy-copy-protection-vault-prolok.html
57•kmm•5d ago•4 comments

Delimited Continuations vs. Lwt for Threads

https://mirageos.org/blog/delimcc-vs-lwt
26•romes•4d ago•3 comments

Vocal Guide – belt sing without killing yourself

https://jesperordrup.github.io/vocal-guide/
16•jesperordrup•3h ago•10 comments

How to effectively write quality code with AI

https://heidenstedt.org/posts/2026/how-to-effectively-write-quality-code-with-ai/
245•i5heu•16h ago•193 comments

Was Benoit Mandelbrot a hedgehog or a fox?

https://arxiv.org/abs/2602.01122
14•bikenaga•3d ago•2 comments

Introducing the Developer Knowledge API and MCP Server

https://developers.googleblog.com/introducing-the-developer-knowledge-api-and-mcp-server/
54•gfortaine•11h ago•22 comments

I spent 5 years in DevOps – Solutions engineering gave me what I was missing

https://infisical.com/blog/devops-to-solutions-engineering
143•vmatsiiako•18h ago•64 comments

Understanding Neural Network, Visually

https://visualrambling.space/neural-network/
284•surprisetalk•3d ago•38 comments

I now assume that all ads on Apple news are scams

https://kirkville.com/i-now-assume-that-all-ads-on-apple-news-are-scams/
1061•cdrnsf•22h ago•438 comments

Why I Joined OpenAI

https://www.brendangregg.com/blog/2026-02-07/why-i-joined-openai.html
136•SerCe•9h ago•124 comments

Learning from context is harder than we thought

https://hy.tencent.com/research/100025?langVersion=en
178•limoce•3d ago•96 comments

Show HN: R3forth, a ColorForth-inspired language with a tiny VM

https://github.com/phreda4/r3
70•phreda4•12h ago•14 comments

Female Asian Elephant Calf Born at the Smithsonian National Zoo

https://www.si.edu/newsdesk/releases/female-asian-elephant-calf-born-smithsonians-national-zoo-an...
28•gmays•8h ago•11 comments

FORTH? Really!?

https://rescrv.net/w/2026/02/06/associative
63•rescrv•21h ago•23 comments

Show HN: Eyesite – Experimental website combining computer vision and web design

https://blog.andykhau.com/blog/eyesite
138•akchro•8mo ago
I wanted Apple Vision Pros, but I don’t have $3,500 in my back pocket. So I made Apple Vision Pros at home.

This was just a fun little project I made. Currently, the website doesn't work on screens smaller than 1200x728 (sorry, mobile users!). It also might struggle on lower-end devices.

For best results, have a webcam pointing right at you. I tested my website with a MacBook camera.

Any comments, questions, or suggestions are greatly appreciated!

blog: https://blog.andykhau.com/blog/eyesite

check it out: https://eyesite.andykhau.com/

github: https://github.com/akchro/eyesite
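
For anyone curious how a gaze-driven cursor like this can be wired up in the browser, here is a minimal sketch built on the open-source webgazer.js library (which, per the comments below, the site uses). It is not Eyesite's actual code; the "gaze-cursor" element is a hypothetical overlay.

    // Minimal gaze loop using webgazer.js, loaded as a global script.
    declare const webgazer: any;

    webgazer.setGazeListener((data: { x: number; y: number } | null) => {
      if (!data) return; // no prediction for this frame
      const cursor = document.getElementById("gaze-cursor"); // hypothetical overlay element
      if (cursor) cursor.style.transform = `translate(${data.x}px, ${data.y}px)`;
    });
    webgazer.begin(); // asks for webcam access and starts tracking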

Comments

cwmoore•8mo ago
I may be the result of some evolutionary bottleneck, but wherever there is a camera lens, I assume eye tracking (and sentiment prediction) is at least possible, and at most globally always on.
jeffhuys•8mo ago
Next time you see a digital ad display, look for the little black circle on top of it. Yes, they do. Feel free to put a sticker over it.
noduerme•8mo ago
Just to play devil's advocate here... I'm about as extreme a privacy absolutist as you can find. But given that we're all on camera all the time in public spaces, like it or not, I don't consider tracking on digital billboards to be inherently evil. It could be used for evil (like - an authoritarian government tracking who looks at a billboard for an opposition political leader, for instance). But it could just be innocent data gathering. If you ran a plumbing business and paid $10k for a billboard, wouldn't you want to know if it was worth it or not? It's not as if decades of focus groups and hand-wavey feelings about what is or isn't effective advertising didn't already steer us into a society entirely dominated by big loud ads everywhere.

People who make products and sell services need to advertise. They in turn pay taxes. The many layers of parasitism in the advertising world historically relied on conning these people and taking their money in exchange for an unprovable proposition, namely that if you run this ad we tell you to run, right here, your sales will go up - but we'll never be able to actually tell you for sure by how much, or whether it was a good deal for you. From that perspective, more and better viewership data helps undermine the advertising bullshit machine and close the gap between people who run businesses and the people they're trying to sell their services to.

jll29•8mo ago
Remember this incident?

Italians deploy fearsome SPY MANNEQUINS to win Fashion Wars (2012) https://www.theregister.com/2012/11/22/bionic_mannequins_are...

According to the media at the time, the mannequins quickly disappeared again after the scandal broke. If you are more cynical, you might question that narrative.

noduerme•8mo ago
I never heard of this, but it's hilarious.
catlifeonmars•8mo ago
> People who make products and sell services need to advertise.

Do they though?

No one is entitled to better data. It should be on the advertisers to figure out more privacy preserving ways of getting feedback.

Ubiquitous technical surveillance is, in economics parlance, a negative externality.

jeffhuys•7mo ago
We're just going deeper and deeper into attention-grabbing displays; we now even have ANIMATED screens next to the ROADS in the Netherlands. Flashing, showing texts like "WATCH OUT! We have a new product!".

I have epilepsy. It's managed, I mean, I'm allowed to drive again. But what about people who don't have it managed (for some no meds will work)? They're just f'd when they walk down the street and Nike NEEDED to switch the screen every 0.2s?

If I can in _any_ way inhibit their ability to grow, I will.

If I need your product, I'll find it. I'm against any and all ads - I know it's unrealistic, but I'll do everything in my power to lower the amount of ads I see or to be a nuisance to them.

Really don't care about morality in this case.

pixl97•7mo ago
>we now even have ANIMATED screens next to the ROADS

Heh, only 30 years behind the US.

Of course our rate of people dying at younger ages over here seems to be a lot higher too.

pzo•7mo ago
I'm in Malaysia right now, and the lifts in my residential building have TVs playing ads that use many tricks to grab attention:

- the sound of a phone ringing, to pull you out of your zone

- loud, annoying music with annoying lyrics

- putting a cat in the advertisement and having it meow

holografix•8mo ago
Love this. Any experimental HCI project inspires me to think differently about computers and tech.
naveen_k•8mo ago
Ha! The timing is impeccable. This is a great demo. I've been experimenting with using gaze and eye tracking for cursor prediction as a side project. I like the idea of pressing 'space' for each dot. I just had the 9-dot movement going from one point to another. I'm using Mediapipe's face landmarks model (I wasn't aware of WebGazer). I'll continue to work on it, but it's great to see a similar thought process.
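
A rough sketch of that nine-dot, press-space-to-record calibration flow; getGazeSample() is a hypothetical stand-in for whatever the underlying model (MediaPipe landmarks, WebGazer, etc.) returns, not a real API:

    type Point = { x: number; y: number };

    // Nine calibration targets on a 3x3 grid across the viewport.
    const dots: Point[] = [];
    for (const fy of [0.1, 0.5, 0.9]) {
      for (const fx of [0.1, 0.5, 0.9]) {
        dots.push({ x: fx * innerWidth, y: fy * innerHeight });
      }
    }

    declare function getGazeSample(): Point; // hypothetical: raw gaze estimate from the model

    const samples: { target: Point; gaze: Point }[] = [];
    let current = 0;

    document.addEventListener("keydown", (e) => {
      if (e.code !== "Space" || current >= dots.length) return;
      samples.push({ target: dots[current], gaze: getGazeSample() }); // pair target with estimate
      current += 1; // advance the dot to the next calibration position
      // the (target, gaze) pairs can later fit a simple correction of the raw gaze
    });
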
jll29•8mo ago
The WebGazer software used on this page was introduced by Papoutsaki and co-workers, a team from Brown University and Georgia Tech, in 2016:

- Papoutsaki, A. et al. (2016). "WebGazer: Scalable Webcam Eye Tracking Using User Interactions." Presented at the International Joint Conference on Artificial Intelligence (IJCAI). https://www.ijcai.org/Proceedings/16/Papers/540.pdf

While we are at it, you may also find the following research publications relevant to this discussion:

- "Improving User Perceived Page Load Times Using Gaze" (USENIX NSDI 2017) https://www.usenix.org/conference/nsdi17/technical-sessions/...

- "No clicks, no problem: using cursor movements to understand and improve search" (Huang, White, Dumais from Microsoft Research, ACM SIG CHI '11) https://dl.acm.org/doi/abs/10.1145/1978942.1979125

- Virtual gazing in video surveillance (SMVC '10) https://dl.acm.org/doi/abs/10.1145/1878083.1878089

weeb•8mo ago
Nice to see people getting interested in eye gaze. There are two things that you might like to look at that can help the UX.

1 - Calibration. Looking at static dots is BORING. The best idea I've seen is Tobii's gaming calibration where you look at dots to make them wobble and pop. This makes the whole process feel like a game, even when you've done it a hundred times before. I would love to see more ideas in this space to give a much more natural-feeling calibration process - even better if you can improve the calibration over time with a feedback loop, when users interact with an element.

2 - Gaze feedback. You are absolutely right that seeing a small, inaccurate and jumpy dot does more harm than good. Again, Tobii have led the way with their 'ghost overlay' for streamers.

For an example, see the following video. After calibration the ghost overlay is used to give approximate feedback. This is enough that some naive users are able to make small adjustments to a constant calibration error, or at least give feedback that the gaze is wrong, not that the UI is not responding.

https://youtu.be/mgQY4dL-09E?feature=shared&t=36
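
A rough sketch of the ghost-overlay idea: a large, heavily smoothed blob instead of a raw prediction point. It assumes WebGazer's setGazeListener callback and plain exponential smoothing; Tobii's own implementation will differ.

    declare const webgazer: any;

    // Large translucent blob that ignores pointer events.
    const ghost = document.createElement("div");
    ghost.style.cssText =
      "position:fixed;left:0;top:0;width:120px;height:120px;border-radius:50%;" +
      "background:rgba(255,255,255,0.15);pointer-events:none;";
    document.body.appendChild(ghost);

    let sx = innerWidth / 2, sy = innerHeight / 2;
    const alpha = 0.1; // low alpha = heavy smoothing, so the blob stays calm rather than jumpy

    webgazer.setGazeListener((data: { x: number; y: number } | null) => {
      if (!data) return;
      sx += alpha * (data.x - sx); // exponential moving average of the gaze estimate
      sy += alpha * (data.y - sy);
      ghost.style.transform = `translate(${sx - 60}px, ${sy - 60}px)`; // centre the 120px blob
    });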

akchro•8mo ago
Thank you for the feedback!

1 - I experimented with some calibration involving staring at a point, but I found it troublesome, as blinking would lead to inaccurate calibration data (WebGazer doesn't have blink detection). It was also a little more fatiguing, since the user would have to really focus on staring the entire time. I found that it was less mentally fatiguing if the user could control their own calibration and give themselves room to blink or just rest their eyes for a second.

2 - Ghost overlays are a really good idea. I'll see what I can do to implement that feature.

I really appreciate you taking the time to write this!

naveen_k•7mo ago
I have been experimenting with using five phases of movement, each covering a different area of the screen while the dot is actively moving. The last phase makes the dot move in a Lissajous-like motion, which is more fluid, like you are suggesting.

The challenge is recording and syncing the motion at a higher frequency and saving it without much drift, and the performance of these landmark/gaze models is often slow.

One more option to speed it up is not to do the eye tracking at record time: just record a cropped video of the face and the screen at 60 Hz first, then run the model on each frame and update the dataset's metadata.
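
For illustration, a small sketch of a Lissajous-style moving target of the kind described above; the frequencies and the "calib-dot" element are placeholder assumptions:

    // x and y follow sinusoids at different frequencies, so the dot
    // sweeps the screen in a smooth Lissajous figure.
    function lissajousTarget(tMs: number): { x: number; y: number } {
      const t = tMs / 1000;
      const fx = 0.11, fy = 0.17; // Hz; placeholder sweep rates
      return {
        x: innerWidth / 2 + (innerWidth / 2 - 40) * Math.sin(2 * Math.PI * fx * t),
        y: innerHeight / 2 + (innerHeight / 2 - 40) * Math.sin(2 * Math.PI * fy * t + Math.PI / 2),
      };
    }

    function step(tMs: number) {
      const { x, y } = lissajousTarget(tMs);
      const dot = document.getElementById("calib-dot"); // hypothetical target element
      if (dot) dot.style.transform = `translate(${x}px, ${y}px)`;
      // logging (tMs, x, y) here gives a ground-truth trace to sync with the recorded video
      requestAnimationFrame(step);
    }
    requestAnimationFrame(step);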

renierbotha•8mo ago
This is very cool!! Have you considered making a WebGL game that uses eye tracking for things like aiming? It could be very cool and one of the very few accessible games.
akchro•7mo ago
Thanks for your interest! An eye tracking game is an interesting concept. My concern is that eye tracking already has a lot of overhead, so running an entire game on top of it would be too laggy to be playable. Also, the eye tracking software right now doesn't seem precise enough for things like aiming. I could totally see a pure standalone game that utilizes eye tracking, though.
mdrzn•8mo ago
Very cool demo! Once this becomes good enough (right now it's very wobbly and requires a huge UI), I'd love to be able to read articles and navigate just using my eyes. It feels very natural.
akchro•7mo ago
I'm glad it feels natural! I wanted it to feel similar to the Apple Vision Pro, where the eye tracking just works. Hopefully in the future I can just curl up in a blanket and let my eyes do all the scrolling when I'm reading.
estsauver•8mo ago
Hey, I just wanted to say this is one of the coolest demos I've seen all year. This is really, really fun.
akchro•7mo ago
Thanks! It means a lot that people also find my stuff cool
BugsJustFindMe•7mo ago
> Screen Too Small This application requires a minimum screen size to function properly.

Why?

akchro•7mo ago
All the components needed to be big enough for eye tracking "selection" to work consistently. It's frustrating when you are trying to look at a button to press it, but the eye tracking thinks you are looking elsewhere. If the screen were any smaller, the components wouldn't fit. I believe the dimensions should work on most computer screens and mostly just bar out mobile devices. This was also intentional, since the eye tracking does not work on mobile.
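
To illustrate why target size matters, a hedged sketch of dwell-based gaze selection: the noisy gaze estimate has to stay inside one element's bounds for a whole dwell period before it activates, so larger components tolerate more jitter. The data-gaze-button attribute and timings are made-up placeholders, not Eyesite's actual code.

    const DWELL_MS = 800;            // how long gaze must stay on one button (placeholder)
    let dwellTarget: Element | null = null;
    let dwellStart = 0;

    // Call with each (smoothed) gaze estimate, e.g. from a WebGazer listener.
    function onGaze(x: number, y: number, now: number) {
      const el = document.elementFromPoint(x, y)?.closest("[data-gaze-button]") ?? null;
      if (el !== dwellTarget) {
        dwellTarget = el;            // gaze moved to a different (or no) button: restart the timer
        dwellStart = now;
        return;
      }
      if (el && now - dwellStart >= DWELL_MS) {
        (el as HTMLElement).click(); // gaze held long enough: activate the button
        dwellStart = now;            // avoid immediately re-triggering
      }
    }
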
BugsJustFindMe•7mo ago
> the eye tracking does not work on mobile.

Isn't that a function of platform and not screen size?

> All the components needed to be big enough for eye tracking "selection" to consistently work.

Isn't this also a function more of angles and not pixels? A phone close to my face has the same apparent size as a computer screen farther away.
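
For reference, apparent size is a matter of visual angle, roughly 2·atan(w / 2d) for an object of width w at distance d; a quick sketch with made-up but plausible numbers:

    function angularSizeDeg(widthCm: number, distanceCm: number): number {
      return (2 * Math.atan(widthCm / (2 * distanceCm)) * 180) / Math.PI;
    }

    // A ~7 cm-wide phone at 25 cm subtends about the same angle as a
    // 30 cm-wide region of a monitor viewed from roughly a metre away.
    console.log(angularSizeDeg(7, 25).toFixed(1));   // ~15.9 degrees
    console.log(angularSizeDeg(30, 107).toFixed(1)); // ~16.0 degrees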

venk12•7mo ago
Impressive work - I have experimented with Tobii trackers - they are pretty accurate to work with. But accomplishing this with a single camera is definitely something. Would love to follow your work further. Keep going :)