frontpage.

South Korean crypto firm accidentally sends $44B in bitcoins to users

https://www.reuters.com/world/asia-pacific/crypto-firm-accidentally-sends-44-billion-bitcoins-use...
1•layer8•1m ago•0 comments

Apache Poison Fountain

https://gist.github.com/jwakely/a511a5cab5eb36d088ecd1659fcee1d5
1•atomic128•3m ago•0 comments

Web.whatsapp.com appears to be having issues syncing and sending messages

http://web.whatsapp.com
1•sabujp•3m ago•1 comments

Google in Your Terminal

https://gogcli.sh/
1•johlo•4m ago•0 comments

Shannon: Claude Code for Pen Testing

https://github.com/KeygraphHQ/shannon
1•hendler•5m ago•0 comments

Anthropic: Latest Claude model finds more than 500 vulnerabilities

https://www.scworld.com/news/anthropic-latest-claude-model-finds-more-than-500-vulnerabilities
1•Bender•9m ago•0 comments

Brooklyn cemetery plans human composting option, stirring interest and debate

https://www.cbsnews.com/newyork/news/brooklyn-green-wood-cemetery-human-composting/
1•geox•9m ago•0 comments

Why the 'Strivers' Are Right

https://greyenlightenment.com/2026/02/03/the-strivers-were-right-all-along/
1•paulpauper•11m ago•0 comments

Brain Dumps as a Literary Form

https://davegriffith.substack.com/p/brain-dumps-as-a-literary-form
1•gmays•11m ago•0 comments

Agentic Coding and the Problem of Oracles

https://epkconsulting.substack.com/p/agentic-coding-and-the-problem-of
1•qingsworkshop•12m ago•0 comments

Malicious packages for dYdX cryptocurrency exchange empty user wallets

https://arstechnica.com/security/2026/02/malicious-packages-for-dydx-cryptocurrency-exchange-empt...
1•Bender•12m ago•0 comments

Show HN: I built a <400ms latency voice agent that runs on a 4GB VRAM GTX 1650

https://github.com/pheonix-delta/axiom-voice-agent
1•shubham-coder•12m ago•0 comments

Penisgate erupts at Olympics; scandal exposes risks of bulking your bulge

https://arstechnica.com/health/2026/02/penisgate-erupts-at-olympics-scandal-exposes-risks-of-bulk...
4•Bender•13m ago•0 comments

Arcan Explained: A browser for different webs

https://arcan-fe.com/2026/01/26/arcan-explained-a-browser-for-different-webs/
1•fanf2•15m ago•0 comments

What did we learn from the AI Village in 2025?

https://theaidigest.org/village/blog/what-we-learned-2025
1•mrkO99•15m ago•0 comments

An open replacement for the IBM 3174 Establishment Controller

https://github.com/lowobservable/oec
1•bri3d•17m ago•0 comments

The P in PGP isn't for pain: encrypting emails in the browser

https://ckardaris.github.io/blog/2026/02/07/encrypted-email.html
2•ckardaris•20m ago•0 comments

Show HN: Mirror Parliament where users vote on top of politicians and draft laws

https://github.com/fokdelafons/lustra
1•fokdelafons•20m ago•1 comments

Ask HN: Opus 4.6 ignoring instructions, how to use 4.5 in Claude Code instead?

1•Chance-Device•22m ago•0 comments

We Mourn Our Craft

https://nolanlawson.com/2026/02/07/we-mourn-our-craft/
1•ColinWright•24m ago•0 comments

Jim Fan calls pixels the ultimate motor controller

https://robotsandstartups.substack.com/p/humanoids-platform-urdf-kitchen-nvidias
1•robotlaunch•28m ago•0 comments

Exploring a Modern SMPTE 2110 Broadcast Truck with My Dad

https://www.jeffgeerling.com/blog/2026/exploring-a-modern-smpte-2110-broadcast-truck-with-my-dad/
1•HotGarbage•28m ago•0 comments

AI UX Playground: Real-world examples of AI interaction design

https://www.aiuxplayground.com/
1•javiercr•29m ago•0 comments

The Field Guide to Design Futures

https://designfutures.guide/
1•andyjohnson0•29m ago•0 comments

The Other Leverage in Software and AI

https://tomtunguz.com/the-other-leverage-in-software-and-ai/
1•gmays•31m ago•0 comments

AUR malware scanner written in Rust

https://github.com/Sohimaster/traur
3•sohimaster•33m ago•1 comments

Free FFmpeg API [video]

https://www.youtube.com/watch?v=6RAuSVa4MLI
3•harshalone•33m ago•1 comments

Are AI agents ready for the workplace? A new benchmark raises doubts

https://techcrunch.com/2026/01/22/are-ai-agents-ready-for-the-workplace-a-new-benchmark-raises-do...
2•PaulHoule•38m ago•0 comments

Show HN: AI Watermark and Stego Scanner

https://ulrischa.github.io/AIWatermarkDetector/
1•ulrischa•39m ago•0 comments

Clarity vs. complexity: the invisible work of subtraction

https://www.alexscamp.com/p/clarity-vs-complexity-the-invisible
1•dovhyi•40m ago•0 comments

Eye prosthesis is the first to restore sight lost to macular degeneration

https://med.stanford.edu/news/all-news/2025/10/eye-prosthesis.html
259•gmays•3mo ago

Comments

flobosg•3mo ago
Discussion from last week: https://news.ycombinator.com/item?id=45653639
oulipo2•3mo ago
That's so cool.
meindnoch•3mo ago
Resolution, color depth?
buran77•3mo ago
> Resolution is limited by the size of pixels on the chip. Currently, the pixels are 100 microns wide, with 378 pixels on each chip.

> the PRIMA device provides only black-and-white vision

ddingus•3mo ago
378 pixels, 1bpp

The phosphenes [0] patients sense will depend on what is left of the retina. People using earlier systems reported that some interpolation happened. Maybe that is true of this device too.

[0] - that is the name for the image the brain manifests in response to signals received by the visual cortex. Most of us experience them when we close our eyes and rub them, or maybe just see stuff that is unreal.

ggm•3mo ago
The interpolation would tend to at best half a pixel? And the phosphor lag (like on a tube) would be an issue surely?

Are there instances of single-eye outcomes where the subject has drawn the perceived image, so we can understand how this translates into conscious visual stimuli?

Even just a flash on the left == left object vs flash on the right == right object would be a useful signal compared to zero. But describing it as "vision" would be stretching it. 378 pixels is a few letters at 10x18, so it's 2-3 words. Again, massive gains on nothing, but it's beyond "large print"; it's "large print with a magnifying glass", and it might be phosphor-burn colour against black, or a foggy field, or a number of things.

To be clear, this is amazing stuff and hats off to anyone who helped make it happen, but let's not assume we're in "snow crash" territory just yet.
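
For scale, a quick back-of-envelope sketch in Python of the character-cell arithmetic above; the 10x18 glyph size is taken from the comment, and the rest is illustrative rather than from the article:

    # How many 10x18-pixel character cells fit in 378 pixels,
    # and how much raw data is that at 1 bit per pixel?
    total_pixels = 378            # pixel count on the PRIMA chip, per the article
    glyph_w, glyph_h = 10, 18     # assumed character cell, as in the comment above
    pixels_per_glyph = glyph_w * glyph_h

    glyphs_per_frame = total_pixels / pixels_per_glyph
    bits_per_frame = total_pixels * 1   # on/off only, so 1 bit per pixel

    print(f"~{glyphs_per_frame:.1f} character cells per frame")
    print(f"{bits_per_frame} bits (~{bits_per_frame // 8} bytes) per frame")
    # ~2.1 character cells per frame
    # 378 bits (~47 bytes) per frame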

ddingus•3mo ago
The lag would be in signal processing external to the user.

Interpolation would be more transparent, much like it is for you right now. There are no phosphors or tubes in any of this.

I made no such "snow crash" assumption.

Users of devices like this have described their experiences and those are not generally big square pixels.

Think of those more like points the brain can do something with.

The chip stimulates the remaining neuro-signal entities present in the damaged retina. I doubt there is a 1:1 relationship between those and the signaling points on the chip.

When the company can do better than on/off bright/contrast, the overall experience should improve dramatically. There will be more signal points (1024 ish?) and those having variable output levels will give the user's visual cortex a whole lot more to work with.

About the only analogous thing I can come up with is cochlear implants. Those have a number of signal points that is a lot smaller than you might expect; that was certainly my take. The more of those there are, the more concurrent sounds can be differentiated. A greater sense of timbre, in other words, becomes possible.
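
A hedged sketch of the raw-signal jump being described: the ~1024-point count is the guess above, and the 16-level grey scale is an assumption for illustration; neither figure comes from the article.

    import math

    # Raw signal per frame: the current 378 on/off points vs. a hypothetical
    # next-generation chip with ~1024 points at 16 grey levels (4 bits each).
    def bits_per_frame(points: int, levels: int) -> float:
        """Raw capacity in bits: each point carries log2(levels) bits."""
        return points * math.log2(levels)

    print(f"current chip: {bits_per_frame(378, 2):.0f} bits/frame")
    print(f"next chip:    {bits_per_frame(1024, 16):.0f} bits/frame")
    # current chip: 378 bits/frame
    # next chip:    4096 bits/frame  -- roughly 10x more raw signal to work with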

aetherspawn•3mo ago
Any chance you could explain why this can only send black and white? Is colour a capability that could be added in the future?
ddingus•3mo ago
I am speculating here.

The reason it's a two-level device at present is likely that this is still mostly research and not so much engineering.

They say their next chip will deliver grey scale and many more signal points.

My guess on color is one or more of the following is true:

[0] The color info is normally sent via the color-sensitive cells that are now damaged, and we have yet to understand how that signal enters the nerves we can send a signal to.

[1] It may be that we need a far smaller, more precise signal point to achieve color. Current tech stimulates many nerve endings. This was the basis for my interpolation comment above. Basically, each pixel stimulates an area of the damaged retina which contains a great many possible signal points, if it were possible to stimulate them individually. Because so many are stimulated all at once, the subject perceives white phosphenes rather than colored ones.

An analogy would be the colors on a CRT. A broad beam would light them all up, yielding monochrome vision. A narrow beam can light up a few or just one, yielding color.

One thing I just realized writing this is our blue sensor cells are scattered about, not well clustered like the green and red ones are.

Maybe current users see a bit of color at the very extent of the artificial visual field due to a failure to hit the necessary blue cells...

[2] It may be some sort of pulse is needed to encode colors. And perhaps the current signaling is continuous.

Hopefully, we get an answer from the team.

ZebusJesus•3mo ago
This is cool, glad to see people doing awesome things like this
fatyorick•3mo ago
I'm imagining a hacker sending infrared signals to a user to upload whatever image straight to their brain.
fragmede•3mo ago
And not a corporation to send advertising? What kind of cyberpunk dystopia is that?
tombakt•3mo ago
Yes, we can even call it “snow crash” ;)
ricardobeat•3mo ago
And so the cyborg era begins.
almosthere•3mo ago
https://static.wikia.nocookie.net/star-trek-universe-rpg/ima...
michael1999•3mo ago
We're well into it already. Corneal implants, pacemakers, titanium hips, closed-loop insulin pumps, cochlear implants, etc. We've been using glasses and wooden teeth for centuries, but we're really getting going now.
smath•3mo ago
Very very cool. I have this condition - I got it randomly ("idiopathic" as opposed to age-related) when I was 22. At the time it wreaked havoc on my mental health.
Quizzical4230•3mo ago
I am so sorry you went through this. How are you doing now?

I got an ICL (Implantable Collamer Lens) implant at 22 (25 now) and that ruined my night vision with ghosting and glare.

tux•3mo ago
This is incredible technology. Now do computer glasses with AI and you won’t need a separate device like a phone.
frays•3mo ago
This is the sort of technology that will actually bring benefit into our lives.

Can't wait to see what advancements will be made in vision-related healthcare over the next 20 years.

bayesnet•3mo ago
> Two-thirds [of participants] reported medium to high user satisfaction with the device.

I don’t know much about medical trials but this seems surprisingly low to me, especially given that the study population is presumably predisposed to liking the device (since they opted-in to an experimental study).

Did the implant not work in these cases or were there other quality-of-life issues? I wish university press releases on science were less rah-rah and presented more factual information. I guess that’s what the NEJM article is for.

kace91•3mo ago
It seems that it’s black-on-white forms, and reading involves adjusting zoom and brightness until you can focus on a single word at a time?

And it still uses your regular peripheral vision, so the experience of merging the two might be uncomfortable.

Not discounting the success at all, but anything messing with your senses is probably very hard to adapt to unless it’s pretty much a perfect match with the experience you’re used to.

Rainwulf•3mo ago
The fact that this thing is basically a passive IR solar panel in the back of the eye is wild.

It also means that there is the potential here for night vision.