frontpage.

Made with ♥ by @iamnishanth

Open Source @Github


Show HN: LoKey Typer – A calm typing practice app with ambient soundscapes

https://mcp-tool-shop-org.github.io/LoKey-Typer/
1•mikeyfrilot•1m ago•0 comments

Long-Sought Proof Tames Some of Math's Unruliest Equations

https://www.quantamagazine.org/long-sought-proof-tames-some-of-maths-unruliest-equations-20260206/
1•asplake•2m ago•0 comments

Hacking the last Z80 computer – FOSDEM 2026 [video]

https://fosdem.org/2026/schedule/event/FEHLHY-hacking_the_last_z80_computer_ever_made/
1•michalpleban•2m ago•0 comments

Browser-use for Node.js v0.2.0: TS AI browser automation parity with PY v0.5.11

https://github.com/webllm/browser-use
1•unadlib•3m ago•0 comments

Michael Pollan Says Humanity Is About to Undergo a Revolutionary Change

https://www.nytimes.com/2026/02/07/magazine/michael-pollan-interview.html
1•mitchbob•4m ago•1 comments

Software Engineering Is Back

https://blog.alaindichiappari.dev/p/software-engineering-is-back
1•alainrk•4m ago•0 comments

Storyship: Turn Screen Recordings into Professional Demos

https://storyship.app/
1•JohnsonZou6523•5m ago•0 comments

Reputation Scores for GitHub Accounts

https://shkspr.mobi/blog/2026/02/reputation-scores-for-github-accounts/
1•edent•8m ago•0 comments

A BSOD for All Seasons – Send Bad News via a Kernel Panic

https://bsod-fas.pages.dev/
1•keepamovin•12m ago•0 comments

Show HN: I got tired of copy-pasting between Claude windows, so I built Orcha

https://orcha.nl
1•buildingwdavid•12m ago•0 comments

Omarchy First Impressions

https://brianlovin.com/writing/omarchy-first-impressions-CEEstJk
2•tosh•17m ago•0 comments

Reinforcement Learning from Human Feedback

https://arxiv.org/abs/2504.12501
2•onurkanbkrc•18m ago•0 comments

Show HN: Versor – The "Unbending" Paradigm for Geometric Deep Learning

https://github.com/Concode0/Versor
1•concode0•19m ago•1 comments

Show HN: HypothesisHub – An open API where AI agents collaborate on medical res

https://medresearch-ai.org/hypotheses-hub/
1•panossk•22m ago•0 comments

Big Tech vs. OpenClaw

https://www.jakequist.com/thoughts/big-tech-vs-openclaw/
1•headalgorithm•24m ago•0 comments

Anofox Forecast

https://anofox.com/docs/forecast/
1•marklit•24m ago•0 comments

Ask HN: How do you figure out where data lives across 100 microservices?

1•doodledood•24m ago•0 comments

Motus: A Unified Latent Action World Model

https://arxiv.org/abs/2512.13030
1•mnming•25m ago•0 comments

Rotten Tomatoes Desperately Claims 'Impossible' Rating for 'Melania' Is Real

https://www.thedailybeast.com/obsessed/rotten-tomatoes-desperately-claims-impossible-rating-for-m...
3•juujian•27m ago•2 comments

The protein denitrosylase SCoR2 regulates lipogenesis and fat storage [pdf]

https://www.science.org/doi/10.1126/scisignal.adv0660
1•thunderbong•28m ago•0 comments

Los Alamos Primer

https://blog.szczepan.org/blog/los-alamos-primer/
1•alkyon•31m ago•0 comments

NewASM Virtual Machine

https://github.com/bracesoftware/newasm
2•DEntisT_•33m ago•0 comments

Terminal-Bench 2.0 Leaderboard

https://www.tbench.ai/leaderboard/terminal-bench/2.0
2•tosh•33m ago•0 comments

I vibe coded a BBS bank with a real working ledger

https://mini-ledger.exe.xyz/
1•simonvc•33m ago•1 comments

The Path to Mojo 1.0

https://www.modular.com/blog/the-path-to-mojo-1-0
1•tosh•36m ago•0 comments

Show HN: I'm 75, building an OSS Virtual Protest Protocol for digital activism

https://github.com/voice-of-japan/Virtual-Protest-Protocol/blob/main/README.md
5•sakanakana00•39m ago•1 comments

Show HN: I built Divvy to split restaurant bills from a photo

https://divvyai.app/
3•pieterdy•42m ago•0 comments

Hot Reloading in Rust? Subsecond and Dioxus to the Rescue

https://codethoughts.io/posts/2026-02-07-rust-hot-reloading/
3•Tehnix•42m ago•1 comments

Skim – vibe review your PRs

https://github.com/Haizzz/skim
2•haizzz•44m ago•1 comments

Show HN: Open-source AI assistant for interview reasoning

https://github.com/evinjohnn/natively-cluely-ai-assistant
4•Nive11•44m ago•6 comments

Show HN: I built a WebMIDI sequencer to control my hardware synths

https://www.simplychris.ai/droplets
43•simplychris•1mo ago
Hey HN,

I’m an ex-Google engineer trying to get back into music production.

I needed a way to sequence my hardware synths using AI contexts without constantly switching windows, so I built this.

It runs entirely in the browser using WebMIDI. No login required. It connects to your local MIDI devices (if you're on Chrome/Edge) and lets you generate patterns.
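For readers who haven't used WebMIDI: device access in Chrome/Edge boils down to one permission prompt and raw byte arrays. A minimal sketch of the idea (the helper below is illustrative, not the app's actual code):

```javascript
// Build raw MIDI messages for a note. Illustrative helper, not the app's code.
// Note On = 0x90 | channel, Note Off = 0x80 | channel (channels 0-15).
function noteMessages(channel, note, velocity) {
  return {
    on: [0x90 | channel, note, velocity],
    off: [0x80 | channel, note, 0],
  };
}

// In the browser (Chrome/Edge), these bytes go out via the Web MIDI API:
//   navigator.requestMIDIAccess().then((midi) => {
//     const out = midi.outputs.values().next().value;
//     const { on, off } = noteMessages(0, 60, 100);    // middle C
//     out.send(on);                                    // play now
//     out.send(off, window.performance.now() + 500);   // release in 500 ms
//   });

console.log(JSON.stringify(noteMessages(0, 60, 100).on)); // [144,60,100]
```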

Tech stack: [React / WebMIDI API / etc].

Link: www.simplychris.ai/droplets

Code is a bit messy, but it works. Feedback welcome.

Comments

vermon•1mo ago
Vibe coded? Asking because it looks very similar to my vibe coded WebMIDI project, which is a beatmatching practice app for DJs :) https://beat.maido.io/
Mattrou•1mo ago
It was definitely made with Gemini; you can tell by the fact that Gemini shoehorned in AI features that only work with a Google API key.
rcarmo•1mo ago
This is pretty cool in concept. Need to go and get stuff to plug into my laptop to test :)
thenthenthen•1mo ago
Does WebMIDI work over USB-OTG? Then maybe it could run from a phone or tablet!
piltdownman•1mo ago
Yeah, you can connect via USB MIDI using an OTG adapter by enabling "USB MIDI Peripheral mode" in Developer Options. There are plenty of videos on how to set it up from the Android MIDI Arranger app community - just N.B. you may need a powered USB hub depending on your use-case.
gbraad•1mo ago
I use my tools from a Linux machine (reliable) and Android (OK). I got a h4midi wc to improve the setup. WebMIDI and JS are not ideal, as a wakelock is needed and JavaScript is actually slow.
gbraad•1mo ago
I wrote mine also, integrating an Akai Fire, at https://music.gbraad.nl/meister as part of a tool to do live performances. This controls some of my remix tools, mixxx and vj tools too.

Edit: my usecase is more integrating different tools and devices, Bitwig, Electribe, mixxx, my mod/protracker remix tool, etc. I guess your usecase is more to generate music, less my thing, but possible. I just have a particular sequencer/tracker use. Generation happens in bitwig

Aldipower•1mo ago
This does not solve the underlying problem at all, which makes today's MIDI, coming from a normal computer, almost unusable for serious sequencing: timing and jitter issues! So, may I ask, what is the actual use-case for this sequencer? I would like to see/hear some music you made with it. Or is this just for the sake of using AI?
Libidinalecon•1mo ago
If you have hardware synths, you are going to have a decent MIDI and audio interface, so this is not a problem. It wasn't even a problem 25 years ago. There is no reason for consumer-grade audio to be able to do this, because most people will never use it.
Aldipower•1mo ago
I have maybe 20 hardware synths and I do a lot of sequencing. And yes, it wasn't a problem 25 years ago; that is exactly why I still use an Atari STe! :-) But today it is a problem. It is just not possible to do complex and tight sequencing today with a normal Windows, Mac or Linux computer. Even with my RME PCIe card. Your argument, "it wasn't a problem decades ago, so it cannot be one today", is simply not correct.
titzer•1mo ago
From what I understand, MIDI messages can have timestamps into the future, but that implies buffering on the receiver end. Do most MIDI instruments not support enough buffering to overcome lag? Because in sequencing, the future is pretty well known.
Aldipower•1mo ago
Yes, they have timestamps. But if you buffer (or better to say, delay it), you introduce latency, which is even worse than jitter. The ideal is 0 latency. And another downside with buffering: you would need to set the buffer time to be the same on every device you trigger, otherwise you do not stay in sync.

Edit: Actually, MIDI note-on events that are being sent to devices do _not_ have a timestamp! Only events that are persisted in a file may have timestamps.
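One nuance worth separating: the wire protocol indeed carries no timestamps, but the Web MIDI API does let the sender schedule ahead - `output.send(data, when)` accepts a DOMHighResTimeStamp, so the browser queues the bytes and hands them to the interface at the requested time. A small lookahead-scheduling sketch (hypothetical helper, not taken from any app in this thread):

```javascript
// Compute absolute send times for a run of sequencer steps. Hypothetical
// helper for lookahead scheduling; not from any app in this thread.
function stepTimes(startMs, bpm, steps, stepsPerBeat = 4) {
  const stepMs = 60000 / bpm / stepsPerBeat; // 120 BPM, 16ths -> 125 ms/step
  return Array.from({ length: steps }, (_, i) => startMs + i * stepMs);
}

console.log(stepTimes(0, 120, 4)); // steps at 0, 125, 250, 375 ms

// Browser usage: schedule a bar ~100 ms ahead so OS/JS jitter is absorbed
// by the queue rather than heard (the tradeoff is exactly the added latency):
//   const start = window.performance.now() + 100;
//   for (const t of stepTimes(start, 120, 16)) {
//     output.send([0x99, 42, 100], t); // e.g. channel-10 hi-hat hits
//   }
```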

TheOtherHobbes•1mo ago
MIDI 1.0 messages do not have timestamps. (Sys Real Time does, but notes and controllers don't.) Timing is managed by the MIDI sender, and any buffering happens in the interface.

MIDI over MIDI cables is fundamentally not a tight protocol. If you play a four note chord there's a significant time offset between the first and last note, even with running status.

With early MIDI you had a lot of information going down a single cable, so you might have a couple of drum hits, a chord, maybe a bass and lead note all at the same moment.

Cabled MIDI can't handle that. It doesn't have the bandwidth.

Traditional 80s/90s hardware was also slow to respond because the microprocessors were underpowered. So you often had timing slop on both send and receive.

MIDI over USB should be much tighter because the bandwidth is a good few orders of magnitude higher. Receive slop can still be a problem, but much less than it used to be.

MIDI in a DAW sent directly to VSTs should be sample-accurate, but not everyone manages that. You'll often get a pause at the loop point in Ableton, for example.

The faster the CPU, the less of a problem this is.

If you're rendering to disk instead of playing live it shouldn't be a problem at all.
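For concreteness, the chord-spread claim can be computed from the DIN MIDI electrical spec: 31,250 baud with 10 wire bits per byte (start + 8 data + stop). A back-of-envelope sketch:

```javascript
// Back-of-envelope DIN MIDI serialization times (31,250 baud, 10 bits/byte).
const usPerByte = 1e6 / (31250 / 10); // 320 microseconds per byte
const noteOnBytes = 3;                // status + note number + velocity
const chordBytes = 1 + 4 * 2;         // 4-note chord with running status

console.log(usPerByte * noteOnBytes / 1000); // -> 0.96 ms per Note On
console.log(usPerByte * chordBytes / 1000);  // -> 2.88 ms to serialize the chord
```

So even with running status, the last note of a four-note chord completes nearly 2 ms after the first, before any receiver-side processing slop is added.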

Aldipower•1mo ago
Bandwidth never was the problem with MIDI; that is actually enough. But you're right that with _some_ devices in the 80s/90s the processor was under-powered for the bandwidth. For example, my Roland Alpha Juno 2 from 1986 is somewhat under-powered and not the tightest, but my Casio CZ-5000, also from 1986, does just fine! I mean, this is almost 40 years ago and there were devices that could handle it without problems. The problem with USB, though, is that it buffers in a "non real time safe" way, which leads to unpredictable jitter and interrupts. That means that, for MIDI, USB is worse than the original DIN connection.

I am not talking about MIDI in a DAW, without any physical connections; that works just fine.

bitwize•1mo ago
> MIDI over USB should be much tighter because the bandwidth is a good few orders of magnitude higher.

"should be" != "is". The Atari ST had a ROCK SOLID MIDI clock and direct, bare-metal hardware access that meant the CPU could control the signals directly, with known precise timing. This is simply not possible with modern operating systems and hardware interfaces because of all the abstraction layers, with attendant time indeterminacy, that have been inserted in between. It's physically impossible to match the low latency and jitter of an Atari ST doing MIDI with a modern system.

gbraad•1mo ago
MIDI from a browser suffers from slowdowns because JavaScript is just too slow and non-threaded. There are ways around it, but those are all workarounds.

Just move out of focus, and you will see how it handles sending clock. I went to a hardware-based external clock signal, using SPP to force syncs between my tools, and rtmidi+C.

johnwheeler•1mo ago
These opinions are not helpful.
jauntywundrkind•1mo ago
PipeWire with rtkit works incredibly stably with wildly short buffer lengths (low latency). Given the short buffer size, there's not much chance for big timing issues to arise (unless there are underruns with dead air, which doesn't seem to be the case).

This was a surprising assertion to hear. Maybe on some OS, doing reliable timing is a problem. But with modern audio pipelines, things feel like they are in an extremely good state.
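The "short buffer lengths" map to latency as quantum over sample rate; a quick calculation with illustrative numbers (the quantum and rate values are assumptions, not from the comment):

```javascript
// Period latency of one audio buffer: quantum (frames) / sample rate (Hz).
const latencyMs = (quantum, rate) => (quantum / rate) * 1000;

console.log(latencyMs(64, 48000).toFixed(2));   // -> 1.33 ms per cycle
console.log(latencyMs(1024, 48000).toFixed(2)); // -> 21.33 ms per cycle
```

At a 64-frame quantum there is very little room for scheduling error to accumulate before the next cycle, which is the point being made.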

Aldipower•1mo ago
Actually, I am using PipeWire with rtkit on Debian. But somehow it does not solve my MIDI problems. An "audio pipeline" is not "MIDI". Nevertheless, I do all my _audio_ (not MIDI) work on Debian and I am very happy with it.
srameshc•1mo ago
Thanks, this looks interesting and I am going to try it later. I have an old Axiom 49 and it really doesn't work that well with modern DAWs, as it is assumed to be old and outdated. But I like the form factor and it is solid. I hope I can make it work with this one?
0x20cowboy•1mo ago
Neat - here is another one you might find helpful (chrome only) https://cdn.robrohan.com/seq/index.html

And the source: https://github.com/robrohan/r2_seq

johnwheeler•1mo ago
It's kind of annoying when someone shows what they're working on and the first comment is always, "Oh yeah, here's some alternatives." It feels less like you're trying to be helpful and more like you're just kind of cheekily crapping on them. 2 cents. Maybe it would be more helpful if you were to ask how it could be different, or what it improves upon. Ask if they've seen this. Something more than, "Oh yeah, here's something in addition to what this person is trying to show."