frontpage.

GPT-5.3-Codex System Card [pdf]

https://cdn.openai.com/pdf/23eca107-a9b1-4d2c-b156-7deb4fbc697c/GPT-5-3-Codex-System-Card-02.pdf
1•tosh•1m ago•0 comments

Atlas: Manage your database schema as code

https://github.com/ariga/atlas
1•quectophoton•4m ago•0 comments

Geist Pixel

https://vercel.com/blog/introducing-geist-pixel
1•helloplanets•6m ago•0 comments

Show HN: MCP to get latest dependency package and tool versions

https://github.com/MShekow/package-version-check-mcp
1•mshekow•14m ago•0 comments

The better you get at something, the harder it becomes to do

https://seekingtrust.substack.com/p/improving-at-writing-made-me-almost
2•FinnLobsien•16m ago•0 comments

Show HN: WP Float – Archive WordPress blogs to free static hosting

https://wpfloat.netlify.app/
1•zizoulegrande•17m ago•0 comments

Show HN: I Hacked My Family's Meal Planning with an App

https://mealjar.app
1•melvinzammit•18m ago•0 comments

Sony BMG copy protection rootkit scandal

https://en.wikipedia.org/wiki/Sony_BMG_copy_protection_rootkit_scandal
1•basilikum•20m ago•0 comments

The Future of Systems

https://novlabs.ai/mission/
2•tekbog•21m ago•1 comments

NASA now allowing astronauts to bring their smartphones on space missions

https://twitter.com/NASAAdmin/status/2019259382962307393
2•gbugniot•26m ago•0 comments

Claude Code Is the Inflection Point

https://newsletter.semianalysis.com/p/claude-code-is-the-inflection-point
3•throwaw12•27m ago•1 comments

Show HN: MicroClaw – Agentic AI Assistant for Telegram, Built in Rust

https://github.com/microclaw/microclaw
1•everettjf•27m ago•2 comments

Show HN: Omni-BLAS – 4x faster matrix multiplication via Monte Carlo sampling

https://github.com/AleatorAI/OMNI-BLAS
1•LowSpecEng•28m ago•1 comments

The AI-Ready Software Developer: Conclusion – Same Game, Different Dice

https://codemanship.wordpress.com/2026/01/05/the-ai-ready-software-developer-conclusion-same-game...
1•lifeisstillgood•30m ago•0 comments

AI Agent Automates Google Stock Analysis from Financial Reports

https://pardusai.org/view/54c6646b9e273bbe103b76256a91a7f30da624062a8a6eeb16febfe403efd078
1•JasonHEIN•33m ago•0 comments

Voxtral Realtime 4B Pure C Implementation

https://github.com/antirez/voxtral.c
2•andreabat•36m ago•1 comments

I Was Trapped in Chinese Mafia Crypto Slavery [video]

https://www.youtube.com/watch?v=zOcNaWmmn0A
2•mgh2•42m ago•0 comments

U.S. CBP Reported Employee Arrests (FY2020 – FYTD)

https://www.cbp.gov/newsroom/stats/reported-employee-arrests
1•ludicrousdispla•44m ago•0 comments

Show HN: I built a free UCP checker – see if AI agents can find your store

https://ucphub.ai/ucp-store-check/
2•vladeta•49m ago•1 comments

Show HN: SVGV – A Real-Time Vector Video Format for Budget Hardware

https://github.com/thealidev/VectorVision-SVGV
1•thealidev•50m ago•0 comments

Study of 150 developers shows AI generated code no harder to maintain long term

https://www.youtube.com/watch?v=b9EbCb5A408
1•lifeisstillgood•51m ago•0 comments

Spotify now requires premium accounts for developer mode API access

https://www.neowin.net/news/spotify-now-requires-premium-accounts-for-developer-mode-api-access/
1•bundie•54m ago•0 comments

When Albert Einstein Moved to Princeton

https://twitter.com/Math_files/status/2020017485815456224
1•keepamovin•55m ago•0 comments

Agents.md as a Dark Signal

https://joshmock.com/post/2026-agents-md-as-a-dark-signal/
2•birdculture•57m ago•0 comments

System time, clocks, and their syncing in macOS

https://eclecticlight.co/2025/05/21/system-time-clocks-and-their-syncing-in-macos/
1•fanf2•58m ago•0 comments

McCLIM and 7GUIs – Part 1: The Counter

https://turtleware.eu/posts/McCLIM-and-7GUIs---Part-1-The-Counter.html
2•ramenbytes•1h ago•0 comments

So what's the next word, then? Almost-no-math intro to transformer models

https://matthias-kainer.de/blog/posts/so-whats-the-next-word-then-/
1•oesimania•1h ago•0 comments

Ed Zitron: The Hater's Guide to Microsoft

https://bsky.app/profile/edzitron.com/post/3me7ibeym2c2n
2•vintagedave•1h ago•1 comments

UK infants ill after drinking contaminated baby formula of Nestle and Danone

https://www.bbc.com/news/articles/c931rxnwn3lo
1•__natty__•1h ago•0 comments

Show HN: Android-based audio player for seniors – Homer Audio Player

https://homeraudioplayer.app
3•cinusek•1h ago•2 comments

Time travel? Or, just clever technology

https://www.syncdna.com/blog/time-travel-or-just-clever-tech
75•yabones•4mo ago

Comments

AshamedCaptain•4mo ago
Latency compensation. This is why most people do not perceive any latency or lipsync issue when browsing YouTube on their phones with their expensive Bluetooth headsets that have ridiculous amounts of latency (>250ms).
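The mechanism is simple: the OS reports the audio output path's latency, and the player presents each video frame that much later. A minimal sketch of the idea in Python (the constant, the frame queue, and the display sink are illustrative, not any real player's API):

```python
import time
from collections import deque

# Reported output latency of the audio path (a Bluetooth headset here).
# Real players query this from the OS audio session; 0.250 s is illustrative.
AUDIO_OUTPUT_LATENCY = 0.250

frame_queue = deque()  # (presentation_time, frame) pairs, in decode order

def display(frame):
    print(f"showing {frame}")  # stand-in for the real video sink

def on_decoded_frame(pts, frame):
    # Present every video frame later by the audio path's latency, so the
    # picture lines up with what the ear actually hears.
    frame_queue.append((pts + AUDIO_OUTPUT_LATENCY, frame))

def render_loop(start):
    while frame_queue:
        pts, frame = frame_queue[0]
        now = time.monotonic() - start
        if now >= pts:
            frame_queue.popleft()
            display(frame)
        else:
            time.sleep(min(pts - now, 0.001))
```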
kgwxd•4mo ago
It destroys game sounds and music-making apps, though. Guess these things are designed for pure consumption: no fun or creativity allowed.
toast0•4mo ago
Large delays in audio/video aren't conducive to interactive applications. But chat up someone who plays a pipe organ; they often play in ensembles and have to deal with serious latency (though probably they just worry about hitting their notes in time, and the ensemble has to match up with them).
vunderba•4mo ago
Great point. I'd love to see a chart of the respective "attack" values for classical instruments. My experience with organs is limited to smaller electric organs like the Hammond B-3, so it's not really as much of a factor there.
ricardobeat•4mo ago
> these things are designed for pure consumption

This is an old trope that needs to die. I used to play synths on a 1st gen iPad back in 2010, it had amazing <5ms latency at a time when you'd struggle to hit that on a PC using external hardware.

Wired headphones have always been around; even now, all it takes is a $5 adapter. Bluetooth "aptX Low Latency" has also been around for years, though adoption has been a bit slow; it is quite standard on Android phones. On the Apple side, AirPods have had decent ~100ms latency for a few years (enough for casual gaming), more recently got a custom wireless, low-latency lossless connection (Apple Vision only atm), and reach <20ms latency on the new iPhone 17 using Bluetooth 6.

It really is a wireless technology problem. Bluetooth LE Audio only came around in 2020 and is barely adopted. Bluetooth 6 was announced late last year and is only just starting to show up in devices now.

forrestthewoods•4mo ago
End-to-end audio latency is sooooo bad in games. Even on a desktop with wired headphones, a lot of AAA games will have 150ms of audio latency. Effectively no one measures this.

https://youtu.be/JTuZvRF-OgE

The Android audio stack is notoriously easy to make very, very bad. Too many layers of software abstraction. Layer upon layer of buffering. It's all so bad. :(
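To put rough numbers on the "layer upon layer" point, here's a back-of-the-envelope sketch in Python; the layer names and buffer sizes are invented for illustration, not measurements of any real stack:

```python
SAMPLE_RATE = 48_000  # Hz

# Hypothetical per-layer buffer sizes, in frames; invented numbers, not
# measurements. Each layer that buffers audio adds its buffer length
# to the end-to-end latency.
layers = {
    "game mixer":        1024,
    "middleware engine": 1024,
    "OS audio server":    960,
    "driver/DMA":         480,
}

total_frames = sum(layers.values())
print(f"total buffered: {total_frames} frames = "
      f"{1000 * total_frames / SAMPLE_RATE:.1f} ms")
# -> 3488 frames = 72.7 ms, before any codec or wireless transport delay
```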

AshamedCaptain•4mo ago
> On the Apple side, AirPods have had decent ~100ms latency for a few years (enough for casual gaming)

This needs a big citation. It has always been claimed that AirPods have no discernible latency and every time it is tested it is actually pretty subpar (>150ms).

> Bluetooth "aptX Low Latency" has also been around for years, though adoption has been a bit slow. It is quite standard on Android phones

Almost no Android phones support it. Anything Samsung for example is excluded, even if they use Snapdragon Sound chips.

It is not really a technology problem, since I was getting 50ms latency with the plain old HFP profile on BT 1.x back in 2002, with a Nokia and the cheapest headset. Latency has been going up even though nothing really changed in the underlying technology (Classic Bluetooth Audio HFP/A2DP is practically unchanged since Bluetooth 2.x times, while LE Audio, introduced in BT 5.x, is used by almost no one, and its codec selection can be considered sabotage).

The problems are (from an enthusiast & armchair analyst PoV):

- Consumers don't care (YouTube works well, after all) and can't even measure it correctly. Manufacturers don't report latency on specs.

- Everyone has an incentive to make it subpar so that they can promote their proprietary solutions with vendor lock-in. Qualcomm/CSR is _especially_ guilty of this, and they dominate the BT headset industry. But literally everyone is doing it these days (Samsung, Sony, Apple, etc.). And even then, most of the time these techs provide negligible improvements in latency or quality (since, per #1, customers can't measure).

- The Bluetooth SIG no longer has any remaining teeth (it never had many to begin with). They just rubber-stamp things, and devices barely interoperate with each other these days (e.g. the latest Sony headsets "support" LE Audio per the logo but on release could not talk to any of the existing LE Audio stacks).

CharlesW•4mo ago
> This needs a big citation.

Here's one from 2022 (so AirPods Pro 2 and iOS 15 or 16): https://stephencoyle.net/airpods-pro-2

"As you can see, the second-generation AirPods Pro perform about 40ms better than their predecessors, with an average latency of 126ms vs the original’s 167ms.

"Perhaps a more interesting point to note is that the second-generation AirPods Pro perform only 43ms worse than the built-in speakers (at 83ms). That suggests that up to two-thirds of the time between touching the screen and hearing a noise occurs before Bluetooth data leaves the device. I think there’s still too much latency for audio feedback to feel snappy and responsive over AirPods Pro 2, but maybe at this point there are easier gains to be made by working to reduce the device-side latency."

ricardobeat•4mo ago
On the last point: most of that latency is from the touchscreen response; the audio system is capable of single-digit latency.

Things may have gotten much worse recently, as I distinctly remember the iPhone around the 4s-6s era having <30ms latency, which was a huge advantage over Android.

The Apple Pencil can also go under 10ms latency when drawing, there must be a way of taking advantage of that for music apps?

AshamedCaptain•4mo ago
Is there really _any_ other measurement? This is the number quoted by a lot of PRs, but professional reviewers, for example, report much larger numbers:

e.g. rtings puts AirPods Pro 2 at 160ms https://www.computerbase.de/artikel/audio-video-foto/apple-a... (AirPods Pro 3 review is in progress)

and ComputerBase puts AirPods Pro 3 at 160-180ms. https://www.computerbase.de/artikel/audio-video-foto/apple-a...

vunderba•4mo ago
You can definitely do music recording on an iPad with WIRED headphones - I have an iPad Pro 10.5 that I'll use in conjunction with a 37-key MIDI keyboard, and it's great for travel.

But there is NO world where you are doing realtime playback/recording, even with very loose quantization, with an iPad and BT headphones. Maybe someday, but that day is not now.

Latency:

- under 30ms = acceptable

- between 30-50ms = irritating but you can work around it

- between 50-100ms = easily detectable when you press a key and almost unusable

- above 100ms = patently absurd

And these are the latency figures for laying down melodies; latency needs to be even tighter when laying down drums.

The only decent wireless headphones I've ever used in a recording setting were AIAIAI TMA-2 Wireless+ [1] which uses a dedicated radio transmitter.

[1] https://aiaiai.audio/stories/products/deep-dive-w-link
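For comparison, wired round-trip latency in a DAW is mostly a function of the audio interface's buffer size, which is why wired setups land comfortably in the "acceptable" band above. A rough sketch of the usual back-of-the-envelope calculation (it ignores converter and driver overhead, which add a few more ms):

```python
def buffer_latency_ms(buffer_frames: int, sample_rate: int = 48_000) -> float:
    """One-way latency contributed by a single audio buffer."""
    return 1000 * buffer_frames / sample_rate

for frames in (64, 128, 256, 512):
    one_way = buffer_latency_ms(frames)
    # A round trip through input and output buffers roughly doubles it.
    print(f"{frames:>4} frames: ~{one_way:.1f} ms out, "
          f"~{2 * one_way:.1f} ms round trip")
```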

zenmac•4mo ago
Caching in the headset, yeah, like the sibling comments say. People who make music and sound stuff hate Bluetooth! And hate the companies that took the mini jack out of the phone.
reactordev•4mo ago
Oh man, any guitarist that comes by with those wireless Bluetooth connectors is always an eighth of a beat behind… no matter how we tweak it, they just can't play on time when using them.

Plug them in directly, no problem.

jama211•4mo ago
Err, you know they match the video to the audio delay in modern smartphone apps, right? The only time you actually experience latency is when you attempt to pause/play the media, or you're trying to do something in real time (like record with people around you). If you use AirPods and an iPhone or something, the video will not be out of sync with the audio.

You can literally see the video lag when you hit play, as it ensures it syncs with the audio.

alganet•4mo ago
I don't know about podcasts and stuff, but for music, you can already OBS this puzzle out real quick and use the natural song bars as synchronization steps.

1. One musician plays simple bars, repetitive stuff, and streams it.

2. The second musician receives the audio from musician 1 and records a multi-track video: himself in one audio track and musician 1's output in another.

3. Stack indefinitely.

You play to what you hear, in real time. All tracks are recorded separately on separate computers and can be edited together quite easily.

Plus, this is already how most jams work in real life.
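For what it's worth, the DAW-side cleanup is just one constant offset per layer: each musician played in time with what they heard, so their clean local track lags the original by the cumulative upstream delay. A sketch with invented delays:

```python
# One-way stream delays between successive layers, in ms (invented).
hop_delay_ms = [180, 220]  # layer 1 -> layer 2, layer 2 -> layer 3

# Each layer played in time with what it heard, so its clean local track
# lags the original by the cumulative upstream delay. Shifting each track
# earlier by that amount lines everything up in the DAW.
offsets = [0]
for d in hop_delay_ms:
    offsets.append(offsets[-1] + d)

for layer, shift in enumerate(offsets, start=1):
    print(f"layer {layer}: shift local track {shift} ms earlier")
```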

> "now" isn't a single universal instant, it's a moving target

Rhythm is already a moving target, a delay cycle. Musicians just need to know the beat 1 for each bar (which they should already know, as it is their job).

curtisblaine•4mo ago
Yes, but musician 1 can't possibly react to musician 2's output meaningfully, because it happens after musician 2 listened to musician 1 and played their part. That's not how jams with musicians physically in the same room work.
alganet•4mo ago
Fair enough, you couldn't have something like the vocalist cueing the bassist, and the drummer picking it up out of thin air and doing an improvised roll, as happens here:

https://youtu.be/eg_rgm9VDAw?t=1597

The drummer takes a little more than a second to react to it (longer than a lot of stream delays, by the way, though I can see how the stacking could mess it up).

That is, however, a bunch of experienced jazz musicians improvising at a high level. In most jams, these conversations happen very often on the next bar (1 and 2 and 3 and 4 and you react on the next 1).

You can see a drummer using the end of a bar to cue the flourish used on the organ in this part, for example:

https://youtu.be/jhicDUgXyNg?t=587s

It takes multiple seconds for the organist to recognize the cue, which is actually for the next bar, and then he joins in. This sort of stuff is actually doable with just video chat and OBS.

Please also note that the product's example workflow is actually worse in that "reaction jamminess" regard than what I proposed:

> The performer receives the track early, and waits the rest of the delay period to play it

This is designed for recording. It sounds more like a studio arrangement in which you have to record your part than a jam session.

> The fidelity of the live stream isn't high enough to record

Seems like an incomplete product. OBS can already record multiple tracks and monitor them, which you can leverage to produce high-quality recordings. I used to sync them manually in a DAW, but with all those auto-timers, it's a surprise it doesn't do it automatically.

ghusbands•4mo ago
> This sort of stuff is actually doable just with just video chat and OBS.

If what each person is hearing is 100-400ms delayed from what each person is producing, how can they possibly mutually react or even get their music in time? If B plays in time with what they hear from C, C hears what B did 200-800ms later - that's far too much and will sound terrible.

Jamming would seem to require incredibly low latency audio just for the rhythm to work between two performers.

alganet•4mo ago
I just showed you, with examples. Musicians react to musical structure, which can be very loose compared to how engineers think about latency. A 12-bar blues can give lots of free time to improvise without feedback.

Also, the stacked delay is part of their product. My solution just does it for free, but it's the same idea.

belter•4mo ago
> Our Universe has one very inconvenient problem: it has an unbreakable speed limit. While it may seem instant, light takes quite a while to get around, traveling at just under 300,000 KM per second.

Google and Azure Availability Zones seem to break that limit daily ... ;-)

belter•4mo ago
This was a test and you failed...
curtisblaine•4mo ago
This works for a listener down the chain, but obviously can't work for performers playing together. The article mentions a producer listening to remotely located performers as they were playing together, but fails to mention how these remotely located performers can sync to each other in the first place.
FrancoisBosun•4mo ago
Explained in the article:

> A producer sends a backing track to a performer - SyncDNA adds a slight delay to the outbound feed

The "backing track" is probably the beat or something similar.

curtisblaine•4mo ago
I still fail to understand why this is a thing. Two possibilities:

1) the beat is created live by a human performer, who can't meaningfully hear the other performer(s) in time. He or she is stuck playing blind.

2) the beat is pre-recorded: sampled or electronically generated on a sequencer. Then what's the use case in the first place? The other performer can download it offline and play over it live.

All this is done to get something that mimics a live performance (but isn't one, because the band members can't hear each other in real time) to a listen-only audience at the end of the chain. What's the advantage in doing so? What's the use case?

Xmd5a•4mo ago
I think this would be easier to explain using strips of paper. Start by placing the strips horizontally and drawing a vertical line across all of them with a red pen. Then shift the strips to the right relative to each other to represent latency. Next, having folded a section of each strip onto itself beforehand to shorten it, unfold the strips to show an extended length representing delay. Finally, fold them back again to represent waiting out the remaining delay from the transmitted track.
Nurbek-F•4mo ago
No, it's not easier, mate. I don't know what I just read...
jama211•4mo ago
To be fair, you read a description of their method, rather than them using the method in front of you.
k__•4mo ago
Wouldn't this only work for DAGs?

For example, you got a drummer that does their thing.

The bass can react to the drummer.

The guitar and vocals can react to drummer and bass.

Each one could get a finished version, but with so much delay that they can't meaningfully react to anyone coming after them or at the same level.

supermatt•4mo ago
Each performer plays to a backing track, and their performances are synchronised by artificial delay. They don’t hear the composite performance in realtime while playing (although they can hear a delayed version, if desired).
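Concretely, the trick is to pad every outbound path to the same worst-case delay so the backing track starts at the same wall-clock moment for everyone. A tiny sketch with invented latencies:

```python
# Measured one-way network latency to each performer, in ms (invented).
latency_ms = {"drums": 35, "bass": 90, "vocals": 140}

# Pad every outbound feed so the backing track starts playing at the
# same wall-clock moment everywhere: pad = slowest path - this path.
slowest = max(latency_ms.values())
for who, lat in latency_ms.items():
    print(f"{who}: add {slowest - lat} ms of artificial delay")
```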
vivzkestrel•4mo ago
I request that you edit the article and add the point that the 30 minutes is only from the perspective of an observer on Earth or Mars. For the person actually making the voyage at the speed of light (impossible if you have mass, honestly), it is instantaneous.
verzali•4mo ago
...are you calling me fat?