frontpage.

Start all of your commands with a comma

https://rhodesmill.org/brandon/2009/commands-with-comma/
142•theblazehen•2d ago•42 comments

OpenCiv3: Open-source, cross-platform reimagining of Civilization III

https://openciv3.org/
668•klaussilveira•14h ago•202 comments

The Waymo World Model

https://waymo.com/blog/2026/02/the-waymo-world-model-a-new-frontier-for-autonomous-driving-simula...
949•xnx•19h ago•551 comments

How we made geo joins 400× faster with H3 indexes

https://floedb.ai/blog/how-we-made-geo-joins-400-faster-with-h3-indexes
122•matheusalmeida•2d ago•32 comments

Unseen Footage of Atari Battlezone Arcade Cabinet Production

https://arcadeblogger.com/2026/02/02/unseen-footage-of-atari-battlezone-cabinet-production/
53•videotopia•4d ago•2 comments

Show HN: Look Ma, No Linux: Shell, App Installer, Vi, Cc on ESP32-S3 / BreezyBox

https://github.com/valdanylchuk/breezydemo
229•isitcontent•14h ago•25 comments

Jeffrey Snover: "Welcome to the Room"

https://www.jsnover.com/blog/2026/02/01/welcome-to-the-room/
16•kaonwarb•3d ago•19 comments

Monty: A minimal, secure Python interpreter written in Rust for use by AI

https://github.com/pydantic/monty
222•dmpetrov•14h ago•117 comments

Vocal Guide – belt sing without killing yourself

https://jesperordrup.github.io/vocal-guide/
27•jesperordrup•4h ago•16 comments

Show HN: I spent 4 years building a UI design tool with only the features I use

https://vecti.com
330•vecti•16h ago•143 comments

Hackers (1995) Animated Experience

https://hackers-1995.vercel.app/
494•todsacerdoti•22h ago•243 comments

Sheldon Brown's Bicycle Technical Info

https://www.sheldonbrown.com/
381•ostacke•20h ago•95 comments

Microsoft open-sources LiteBox, a security-focused library OS

https://github.com/microsoft/litebox
359•aktau•20h ago•181 comments

Show HN: If you lose your memory, how to regain access to your computer?

https://eljojo.github.io/rememory/
288•eljojo•17h ago•169 comments

An Update on Heroku

https://www.heroku.com/blog/an-update-on-heroku/
412•lstoll•20h ago•278 comments

Was Benoit Mandelbrot a hedgehog or a fox?

https://arxiv.org/abs/2602.01122
19•bikenaga•3d ago•4 comments

PC Floppy Copy Protection: Vault Prolok

https://martypc.blogspot.com/2024/09/pc-floppy-copy-protection-vault-prolok.html
63•kmm•5d ago•6 comments

Dark Alley Mathematics

https://blog.szczepan.org/blog/three-points/
90•quibono•4d ago•21 comments

How to effectively write quality code with AI

https://heidenstedt.org/posts/2026/how-to-effectively-write-quality-code-with-ai/
256•i5heu•17h ago•196 comments

Delimited Continuations vs. Lwt for Threads

https://mirageos.org/blog/delimcc-vs-lwt
32•romes•4d ago•3 comments

What Is Ruliology?

https://writings.stephenwolfram.com/2026/01/what-is-ruliology/
43•helloplanets•4d ago•42 comments

Where did all the starships go?

https://www.datawrapper.de/blog/science-fiction-decline
12•speckx•3d ago•4 comments

Introducing the Developer Knowledge API and MCP Server

https://developers.googleblog.com/introducing-the-developer-knowledge-api-and-mcp-server/
59•gfortaine•12h ago•25 comments

Female Asian Elephant Calf Born at the Smithsonian National Zoo

https://www.si.edu/newsdesk/releases/female-asian-elephant-calf-born-smithsonians-national-zoo-an...
33•gmays•9h ago•12 comments

I now assume that all ads on Apple news are scams

https://kirkville.com/i-now-assume-that-all-ads-on-apple-news-are-scams/
1066•cdrnsf•23h ago•446 comments

I spent 5 years in DevOps – Solutions engineering gave me what I was missing

https://infisical.com/blog/devops-to-solutions-engineering
150•vmatsiiako•19h ago•67 comments

Why I Joined OpenAI

https://www.brendangregg.com/blog/2026-02-07/why-i-joined-openai.html
149•SerCe•10h ago•138 comments

Understanding Neural Network, Visually

https://visualrambling.space/neural-network/
287•surprisetalk•3d ago•43 comments

Learning from context is harder than we thought

https://hy.tencent.com/research/100025?langVersion=en
182•limoce•3d ago•98 comments

Show HN: R3forth, a ColorForth-inspired language with a tiny VM

https://github.com/phreda4/r3
73•phreda4•13h ago•14 comments

$5 whale listening hydrophone making workshop

https://exclav.es/2025/08/03/dinacon-2025-passive-acoustic-listening/
105•gsf_emergency_6•1mo ago

Comments

iefbr14•1mo ago
Nice, though the mentioned GitHub link ( https://github.com/loganwilliams/passive-acoustic-listening ) does not exist (anymore?).
turtleyacht•1mo ago
The only repo that appears when searching "passive acoustic listening" is

https://github.com/Vivek-Tate/IDS-Detection-and-Exploiting-V...

throw310822•1mo ago
That's a cheap whale. I wonder how they managed to fit it in.
actionfromafar•1mo ago
Cheap workshop!
blitzar•1mo ago
The elites don't want you to know this but the whales in the ocean are free. You can take them home. I have 458 whales.
gehsty•1mo ago
A note to be a bit careful when passively monitoring ocean acoustics: it's easy to fall foul of military/security forces, who don't like anything that can fingerprint a vessel.

I worked on DAS acoustic monitoring for subsea power cables (to monitor cable health!); it turns out they are basically a submarine detection system.

sigmoid10•1mo ago
Reminds me of how the Navy heard the OceanGate submersible implode the moment it lost contact en route to the Titanic, but waited several days before admitting it, because at the time no one even knew they had such a system of hydrophones in place. I wonder what else they have that we don't know about. The oceans are not just unexplored as a habitat, but also as an intelligence theater.
defrost•1mo ago
Pretty sure a fair number of people knew the US Navy and others had hydrophones in place; they've always been coy about it, though.

For interest:

* it's one reason we know so much about ocean temperatures and, tangentially, have great data on climate change being real, and

* they had some cool R&D vessels:

  FLIP was originally built to support research into the fine-scale phase and amplitude fluctuations in undersea sound waves caused by thermal gradients and sloping ocean bottoms. This acoustic research was conducted as a portion of the Navy's SUBROC program. 
~ https://en.wikipedia.org/wiki/RP_FLIP
zipy124•1mo ago
The systems are pretty public; for instance, the UK tender for Atlantic Net is easy to read. And the Russians have Bastion, which is also well known.
hencq•1mo ago
I remember that at the time it felt a little bit suspicious to me. Only after everyone already knew it had imploded did the Navy come out to say that their hyper-advanced detection system for enemy submarines had, of course, also detected it.
sandworm101•1mo ago
That was not the first time such data was used to find a wreck. They have released locations for things like downed airliners for years, decades even. Everyone knows about SOSUS. The classified bits are its locations and exact capabilities.

https://en.wikipedia.org/wiki/SOSUS

sandworm101•1mo ago
Even building the equipment. There are rules about hydrophones at certain frequencies. Just putting the plans online might run afoul of export rules. Beware of stringing multiple hydrophones together as this article suggests. Put too many on a system and you are into possible beamforming territory ... the tech used for geolocating noises underwater. The USN gets kinda twitchy about such things.

https://www.ecfr.gov/current/title-22/chapter-I/subchapter-M...

(Search for hydrophone)
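
For a rough illustration of the beamforming idea mentioned above: the sketch below estimates a bearing from the time-difference-of-arrival between two hydrophones, using NumPy. Everything in it (array spacing, sample rate, synthetic signals) is made up for the example; it is not from the article or the linked regulations.

  # Hypothetical sketch: estimate the bearing of a sound source from the
  # time-difference-of-arrival (TDOA) between two hydrophones, the basic
  # ingredient of beamforming / acoustic geolocation.
  import numpy as np

  FS = 96_000        # sample rate, Hz (assumed)
  SPACING_M = 1.0    # distance between the two hydrophones, metres (assumed)
  C_WATER = 1500.0   # rough speed of sound in seawater, m/s

  def bearing_from_tdoa(sig_a, sig_b):
      """Estimated source angle (degrees) relative to broadside; the sign
      says which hydrophone the sound reached first."""
      # Cross-correlate the channels; the lag of the peak is the TDOA in samples.
      corr = np.correlate(sig_a, sig_b, mode="full")
      lag = np.argmax(corr) - (len(sig_b) - 1)
      tdoa = lag / FS
      # Far-field approximation: tdoa = spacing * sin(theta) / c
      sin_theta = np.clip(tdoa * C_WATER / SPACING_M, -1.0, 1.0)
      return np.degrees(np.arcsin(sin_theta))

  # Synthetic test: the same short tone burst arrives 5 samples later at B.
  t = np.arange(0, 0.05, 1 / FS)
  burst = np.sin(2 * np.pi * 2000 * t) * np.hanning(t.size)
  a = np.concatenate([burst, np.zeros(100)])
  b = np.concatenate([np.zeros(5), burst, np.zeros(95)])
  print(f"estimated bearing: {bearing_from_tdoa(a, b):.1f} degrees")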

emsign•1mo ago
as if they own the oceans
andai•1mo ago
I know a man who built a fish shaped vehicle, and was immediately approached by the men in black...
quietbritishjim•1mo ago
Good advice, but there's a bit of a difference between a device (or even several) you can knock together yourself and throw over the side of a (surface) boat vs. access to a whole undersea cable, which (I have just learned) is what you need for DAS. Plus, if you can do it yourself with virtually no resources, it's a safe bet that any potential adversaries are already doing something many orders of magnitude greater.

Supposedly new submarines are so quiet that they can't be detected anyway. I'm sure there's a large element of exaggerating abilities here, but there's definitely an element of truth: in 2009, two submarines carrying nuclear weapons (not just nuclear powered) collided [1], presumably because they couldn't detect each other. If a nuclear submarine cannot detect another nuclear submarine right next to it, then it's unlikely your $5 hydrophone will detect one at a distance.

Of course, none of this means that the military will be rational enough not to be annoyed with you.

[1] https://en.wikipedia.org/wiki/HMS_Vanguard_and_Le_Triomphant...

seydar•1mo ago
I used to be a submariner and now work in an unrelated acoustic space (acoustic analysis of the electric grid), but I'd love to learn more about the DAS world — my email is in my profile.
anfractuosity•1mo ago
Acoustic analysis of the electric grid sounds interesting; is that to detect things like sparks?
psc•1mo ago
DAS has really been taking off in the marine bioacoustics world!

https://www.birds.cornell.edu/home/deep-listening/

https://depts.washington.edu/uwb/revolutionizing-marine-cons...

Very cool and very powerful technology; it'll be interesting to see how fiber sensing progresses, especially given how much undersea fiber already exists. For subsea power cables, is there a parallel fiber dedicated just for DAS monitoring? Do these get bundled in with data fiber runs as well? I've been curious how well DAS can work over actively lit / in-service fiber.

gehsty•1mo ago
On the cables I worked on they would use a separate fibre, but power cables tended to massively overspec the number of fibres, so it was never an issue. Some even have two bundles of fibres.

A supplier played whale song they recorded from cables, and said they repackage and sell the same product to defense contractors.

wzdd•1mo ago
Last Chance to See has a fun bit about listening for dolphins in the Yangtze by taking a regular microphone and putting a condom over it. Always wondered how they sealed the end.
micw•1mo ago
Must just be long enough ^^
bcraven•1mo ago
Information here from a superb podcast:

https://www.hbmpodcast.com/podcast/hbm088-riptides-and-a-sin...

gnatman•1mo ago
What if you tied it tightly beneath the mic with a rubber band, and then attached a weight 2-3 feet down the cord? That way the condom/mic floats end-down and keeps things dry.
unwind•1mo ago
TIL about "plug-in power", which seems to be something that some sound recording devices with 3.5 mm "phono" jacks can provide.

Here [1] is a page at Klover, and here [2] is one at Shure. Not sure if there's a formal specification for this, or if it's just something that manufacturers started doing.

[1]: https://www.kloverproducts.com/blog/what-is-plugin-power

[2]: https://service.shure.com/s/article/difference-between-bias-...

emsign•1mo ago
Røde has the VXLR series of 3.5 mm jack to XLR adaptors; one of them supports plug-in power.
micw•1mo ago
Cool, I wish I had seen this before we went whale watching in the Azores last year.

Can we now get lots of audio recordings, together with documentation of whale behavior, to train an AI and end up with a whale translator?

fleahunter•1mo ago
The most interesting bit here to me isn’t the $5 or the DIY, it’s that this is quietly the opposite of how we usually “do” sensing in 2025.

Most bioacoustics work now is: deploy a recorder, stream terabytes to the cloud, let a model find “whale = 0.93” segments, and then maybe a human listens to 3 curated clips in a slide deck. The goal is classification, not experience. The machines get the hours-long immersion that Roger Payne needed to even notice there was such a thing as a song, and humans get a CSV of detections.

A $5 hydrophone you built yourself flips that stack. You’re not going to run a transformer on it in real time, you’re going to plug it into a laptop or phone and just…listen. Long, boring, context-rich listening, exactly the thing the original discovery came from and that our current tooling optimizes away as “inefficient”.

If this stuff ever scales, I could imagine two very different futures: one is “citizen-science sensor network feeding central ML pipelines”, the other is “cheap instruments that make it normal to treat soundscapes as part of your lived environment”. The first is useful for papers. The second actually changes what people think the ocean is.

The $5 is important because it makes the second option plausible. You don’t form a relationship with a black-box $2,000 research hydrophone you’re scared to break. You do with something you built, dunked in a koi pond, and used to hear “fish kisses”. That’s the kind of interface that quietly rewires people’s intuitions about non-human worlds in a way no spectrogram ever will.

jasonjayr•1mo ago
Cheap sensors, used by many, are how we get more reproducibility, more citizen science, and more understanding of the world around us.

RTL-SDR is another area like this: there is so much to see 'hidden' in the electromagnetic radio frequency space.

zimpenfish•1mo ago
> You’re not going to run a transformer on it in real time

Why not? You can run BirdNET's model live in your browser[0]. Listen live and let the machine do the hard work of finding interesting bits[1] for later.

[0] https://birdnet-team.github.io/real-time-pwa/about/

[1] Including bits that you may have missed, obvs.
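
A rough sketch of the "listen live and let the machine flag the interesting bits" idea, for illustration only: it reads the hydrophone or mic input in chunks with the sounddevice package and logs anything a classifier scores highly. classify() here is a stand-in placeholder (plain RMS energy), not BirdNET or any real model.

  # Rough sketch: read the hydrophone (or mic) input in fixed-length chunks
  # and log any chunk the model scores above a threshold. classify() is a
  # placeholder standing in for a real model.
  import queue

  import numpy as np
  import sounddevice as sd

  FS = 48_000          # sample rate, Hz
  CHUNK_SECONDS = 3.0  # analysis window length
  THRESHOLD = 0.8      # confidence needed to log a detection

  def classify(chunk: np.ndarray) -> float:
      """Placeholder 0-1 'interestingness' score: normalised RMS energy."""
      rms = np.sqrt(np.mean(chunk ** 2))
      return float(min(1.0, rms * 20))

  audio_q = queue.Queue()

  def callback(indata, frames, time_info, status):
      # Runs on the audio thread: just hand the samples to the main loop.
      audio_q.put(indata[:, 0].copy())

  with sd.InputStream(samplerate=FS, channels=1, callback=callback,
                      blocksize=int(FS * CHUNK_SECONDS)):
      print("listening... Ctrl-C to stop")
      while True:
          chunk = audio_q.get()
          score = classify(chunk)
          if score >= THRESHOLD:
              print(f"possible event (score {score:.2f}); "
                    f"worth saving this {CHUNK_SECONDS:.0f}s clip for review")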

chankstein38•1mo ago
This is what I was going to say. My whole goal when setting up sensing projects is to eventually get them to a point where I can automate them. And I'm just a DIY dude in his house. I've been working on detecting cars via vibrations picked up by dual MPUs as they resonate through my house. I don't mean to imply I've had great success: I can see the pattern of an approaching car, but I'm struggling to get it recognized as a car reliably and to not overcount.

But yeah, I've totally been doing projects like this for a long time, lol; not sure why OP implies you wouldn't do that. The first thing I thought was "Oh man, I want to put it in the lake near me and see if I can't get it detecting fish or something!"
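
On the over-counting problem mentioned above: one common trick, sketched below with made-up thresholds and synthetic data, is to threshold a smoothed energy envelope with hysteresis plus a minimum gap between events, so one passing car can't trigger twice.

  # Sketch of one way to avoid over-counting: smooth an energy envelope,
  # then only start a new event above a HIGH threshold, end it below a LOW
  # threshold (hysteresis), and require a minimum gap between events.
  import numpy as np

  FS = 100                 # accelerometer sample rate, Hz (assumed)
  HIGH, LOW = 0.6, 0.3     # enter/exit thresholds on the smoothed envelope
  MIN_GAP_S = 5.0          # ignore re-triggers within this many seconds

  def count_events(signal: np.ndarray) -> int:
      envelope = np.abs(signal)
      smooth = np.convolve(envelope, np.ones(FS) / FS, mode="same")  # 1 s average
      events, in_event, last_start = 0, False, -np.inf
      for i, v in enumerate(smooth):
          if not in_event and v > HIGH and (i - last_start) / FS > MIN_GAP_S:
              events, in_event, last_start = events + 1, True, i
          elif in_event and v < LOW:
              in_event = False
      return events

  # Synthetic test: two noisy "car" bursts, 30 seconds apart, in 60 s of data.
  rng = np.random.default_rng(0)
  sig = rng.normal(0, 0.05, FS * 60)
  sig[FS * 10 : FS * 14] += rng.normal(0, 1.0, FS * 4)
  sig[FS * 40 : FS * 44] += rng.normal(0, 1.0, FS * 4)
  print(count_events(sig))   # expect 2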

zimpenfish•1mo ago
> First thing I thought was "Oh man I want to put it in the lake near me and see if I can't get it detecting fish or something!"

Same. Although my first effort with my hydrophone (in my parents' pond) was stymied because they live on a main road and all I picked up was car vibrations.

Maybe that's your solution - get a fish tank/pond and hydrophone!

asdhtjkujh•1mo ago
Slight tangent, but does anyone have experience with hydrophones that record at sample rates in excess of 192 kHz? Last I checked, most of these are specialty devices with high price tags.

Recording full-fidelity whale or dolphin sounds (amongst others) requires using a higher sample rate than is available in most consumer-grade equipment. There's a lot more information down there!

hecanjog•1mo ago
I've used AudioMoths for this. They can be configured to record at sampling rates up to 384 kHz and be deployed with waterproof cases: https://www.openacousticdevices.info/audiomoth
DoctorOetker•1mo ago
Why not use the ADCs used for CIS (contact image sensors, the linear sensors in consumer-grade scanners / photocopiers)?

For example:

12 MHz (i.e. 12 000 kHz) sample rate per channel, 4 channels, 16 bit ADC:

https://www.akm.com/eu/en/products/mfp-lbp/ak8471vn/

single digit dollar unit prices:

https://octopart.com/search?q=AK8471VN&currency=USD&specs=0
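
On the sample-rate point in this sub-thread: the sampling rate has to be at least twice the highest frequency you want to capture. The quick check below uses rough, illustrative kHz figures, not measurements.

  # Quick Nyquist check: sampling rate must be at least twice the highest
  # frequency of interest. The kHz figures here are rough and illustrative.
  SIGNALS_KHZ = {
      "whale song (roughly)": 10,
      "dolphin echolocation clicks (roughly)": 150,
  }
  for name, f_max in SIGNALS_KHZ.items():
      needed = 2 * f_max
      for rate in (48, 96, 192, 384):   # common recorder rates, kHz
          verdict = "ok" if rate >= needed else "too low"
          print(f"{name}: needs >= {needed} kHz; sampling at {rate} kHz is {verdict}")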

Agingcoder•1mo ago
This title is very hard to parse
psc•1mo ago
I've been working on a citizen-science version of this; we have 7 hydrophones deployed that anyone can listen to live:

https://live.orcasound.net/

These hydrophones are a bit more expensive (~$1k per deployment) but still very accessible compared to how much it usually costs. And the goal is to bring the cost down to the ~$100 range (so $5 is very impressive!):

https://experiment.com/projects/can-low-cost-diy-hydrophones...

All the data is being saved (used for scientific research & ML training), with some of the hydrophones going back to 2017, and yes it's quite difficult to listen to and review so much audio. Better tools like the hydrophone explorer UI are much needed (been working on something similar).

One of the things that's surprised me the most is how difficult it is to keep hydrophones up and running. I can sympathize with both the technical and social challenges: underwater is not a friendly environment for electronics, and it can be difficult to get permission to deploy hydrophones. But it's incredibly rewarding when it works and you capture some cool sounds.

For anyone interested, all the code is open source and acoustic data is freely available:

Code: https://github.com/orcasound/

Data: https://registry.opendata.aws/orcasound/

Community: https://orcasound.zulipchat.com/
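
For anyone who wants to poke at the open data: a common pattern for AWS Open Data buckets is anonymous (unsigned) access with boto3, sketched below. The bucket name is a placeholder; the registry page above lists the real ones.

  # Sketch: browse an AWS Open Data bucket with anonymous (unsigned) access.
  # BUCKET is a placeholder; check the registry page for the actual buckets.
  import boto3
  from botocore import UNSIGNED
  from botocore.config import Config

  BUCKET = "example-orcasound-bucket"   # placeholder name

  s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))
  resp = s3.list_objects_v2(Bucket=BUCKET, MaxKeys=20)
  for obj in resp.get("Contents", []):
      print(obj["Key"], obj["Size"])

  # To pull one file down for local listening/analysis:
  # s3.download_file(BUCKET, "path/to/recording.wav", "recording.wav")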

alexpotato•1mo ago
> Better tools like the hydrophone explorer UI are much needed (been working on something similar).

Where could I learn more about the requirements for this? I love building tools like this.

psc•1mo ago
It's actually partly built and still in beta testing (it has some bugs), but here's a nice example: https://live.orcasound.net/bouts/bout_031YiYO7OqPbPhcmP3mQeb

Requirements are pretty flexible, but the inspiration is largely iNaturalist, and also this very cool project put together by Google Creative Lab back in 2019: https://patternradio.withgoogle.com/

The best place to learn more is to stop by the community Zulip chat (https://orcasound.zulipchat.com/) and ask questions; it's full of really knowledgeable people. You can also explore the entire codebase here: https://github.com/orcasound/orcasite