Also, it's interesting that with ProMotion enabled it reports 16.67 ms per frame (indicating a 60 Hz redraw rate) in Safari, but in Chrome it's 8.33 ms (120 Hz).
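For what it's worth, here's a minimal sketch of how a page can measure that number itself. I'm assuming the tool derives it from requestAnimationFrame deltas, which would also explain the Safari reading if Safari caps the callback rate at 60 Hz there:

```ts
// Estimate the effective refresh interval from requestAnimationFrame deltas.
// Logs the average frame time (and the implied Hz) over ~240 frames.
function measureFrameInterval(sampleCount = 240): void {
  const deltas: number[] = [];
  let last: number | undefined;

  function tick(now: DOMHighResTimeStamp): void {
    if (last !== undefined) {
      deltas.push(now - last);
    }
    last = now;
    if (deltas.length < sampleCount) {
      requestAnimationFrame(tick);
    } else {
      const avg = deltas.reduce((a, b) => a + b, 0) / deltas.length;
      console.log(`avg frame time: ${avg.toFixed(2)} ms (~${(1000 / avg).toFixed(0)} Hz)`);
    }
  }
  requestAnimationFrame(tick);
}

measureFrameInterval();
```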
Although it's only for gamepads, it's pretty much indispensable for debugging gamepad-related latency issues. For example, I found that my presumably 1000 Hz controller can only do 500 Hz in ideal conditions, and it starts to drop at a much shorter distance from the computer than advertised. Neat stuff.
I’m curious whether there’s a higher-quality USB hub I could buy, since my Mac doesn’t have much I/O.
I’d love to be wrong on this but haven’t been so far.
I can’t buy this:
> I've also learnt I do benefit from the 8 kHz setting of my mouse, as even at 3200 DPI with fast & smooth motion, some frames still miss a pointer update
It may be true that pointer updates were being missed. But does that really affect anything?
It turns out that there’s a way to test this experimentally: run a double-blind experiment, just like in science. If you can tell which monitor is the 240 Hz one at better than chance, then it matters. Ditto for the pointer updates.
The corollary is that if you can’t tell with better than random chance, then none of this matters, no matter how much you think it does.
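If anyone wants to actually run this, here's a minimal sketch of the bookkeeping; the 120 vs 240 Hz conditions and the 20-trial count are just examples, and a helper has to apply each setting without telling the subject:

```ts
// Hidden sequence of conditions: which refresh rate the assistant sets on each trial.
function hiddenSequence(trials: number): number[] {
  return Array.from({ length: trials }, () => (Math.random() < 0.5 ? 120 : 240));
}

// Probability of scoring at least `hits` correct out of `trials` by pure guessing
// (one-sided binomial tail with p = 0.5).
function chanceOfAtLeast(hits: number, trials: number): number {
  const choose = (n: number, k: number): number =>
    k === 0 || k === n ? 1 : (choose(n - 1, k - 1) * n) / k;
  let tail = 0;
  for (let k = hits; k <= trials; k++) {
    tail += choose(trials, k) * Math.pow(0.5, trials);
  }
  return tail;
}

console.log(hiddenSequence(20));      // e.g. [240, 120, 120, 240, ...] -- keep this from the subject
console.log(chanceOfAtLeast(16, 20)); // ~0.006: 16/20 correct calls is hard to do by luck
```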
Experiments like this have decisively settled the “Does higher sampling rate matter when listening to music?” debate, among other questions. People still swear that they can tell that there’s a difference, but it’s expectation bias. They’re mistaken.
(10ms drops every few seconds would definitely be noticeable though; that wasn’t the point.)
I'm game for a randomized blinded test on 120 Hz refresh rate vs 240 Hz refresh rate. I would indeed be very curious to confirm I can tell the difference with a proper protocol.
Many years back (we were on CRTs), I was in similar shoes, convinced my friend couldn't tell the difference between 60 Hz and 90 Hz when playing video games.
Turns out he only needed to watch the pointer through one push of the mouse to tell right away, and he was correct 100% of the time in a blinded experiment.
It's like lightning strokes of tens of microseconds making a lasting impression on your perception of the scene. You don't "count" strokes over time, but in space.
When you make circles fast and large enough on screen, you can estimate the number of cursor images that appear before your eyes. At 4 circles per second, is each circle made of ~60 pointers or ~30? Belief, not fact: it's not hard to guess.
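To put numbers on the "count cursors in space" trick, a quick back-of-envelope, assuming the cursor is redrawn once per display refresh:

```ts
// Number of distinct cursor images visible per circle when swirling the mouse,
// assuming one cursor redraw per display refresh.
function cursorsPerCircle(refreshHz: number, circlesPerSecond: number): number {
  return refreshHz / circlesPerSecond;
}

console.log(cursorsPerCircle(240, 4)); // 60 images per circle
console.log(cursorsPerCircle(120, 4)); // 30 images per circle
```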
That’s a silly experiment. I could look at a CRT with a completely static image and tell almost immediately whether it was at 60Hz, 90Hz or 120Hz. Flicker at 60Hz was awful, 90Hz was clearly perceptible, and even 120Hz was often somewhat noticeable. And most CRT/graphics card combos would become perceptibly blurry in the horizontal direction at 120Hz at any reasonable desktop resolution, so you could never truly win. Interlaced modes made the flicker much less visible, but the crawling effect was easy to see and distracting.
There are videos on youtube showing people perceive differences at much higher framerates. e.g. https://www.youtube.com/watch?v=OX31kZbAXsA (long video, so you can skip to the end - they found that even casual players were performing measurably more consistently at 240Hz than even 144Hz.)
Anecdotally, I recently switched to playing racing games at 165FPS and the difference is massive!
Imagine 2 identical gaming setups with 2 players that have the same skill set. In an FPS game, you'd expect each of those players to win 50% of the games.
Now switch one monitor from 120 Hz to 240 Hz. On average, the player on the 240 Hz monitor will see their adversary about 2 ms earlier (up to ~4 ms in the worst case) than the player on the 120 Hz monitor, and thus be able to press the mouse button earlier too.
A pro FPS player might notice that they lose contests when peeking around corners more often. Obviously network latency in online games will be a factor as well, but since it likely averages out for both players over time, I would guess you can mostly discount it, along with alternating who’s doing the peeking.
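A back-of-envelope for those numbers, assuming unsynchronized frame clocks and an event landing at a uniformly random point within the frame (a sketch only; it ignores render, input, and network latency):

```ts
// Extra display latency comes from waiting for the next frame to be presented.
const frameMs = (hz: number): number => 1000 / hz;
const avgWaitMs = (hz: number): number => frameMs(hz) / 2; // average wait is half a frame

const slow = avgWaitMs(120); // ~4.17 ms average wait at 120 Hz
const fast = avgWaitMs(240); // ~2.08 ms average wait at 240 Hz

console.log(`average advantage: ${(slow - fast).toFixed(2)} ms`);                      // ~2.08 ms
console.log(`worst-case advantage: ${(frameMs(120) - frameMs(240)).toFixed(2)} ms`);   // ~4.17 ms
```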
I don’t think anyone could look at a scene on a 120 Hz vs 240 Hz display and tell the difference; there needs to be some indirect clue.
If I’m just watching, I’m not sure I could even tell the difference between 60hz and 144hz.
Not really relevant. Music is experienced after a Fourier transform, in frequency space.
The more telling example is that experienced drummers get frustrated by lag of 2 ms from computer-generated effects. That's 500 Hz.
From this one paper alone, humans can perceive information from a single frame at 2000 Hz.
https://doi.org/10.1080/00223980.1945.9917254
Humans can read and immediately reproduce a 5-digit number displayed for a single frame at 400 fps (a 2.5 ms exposure). This is a single exposure; it is not a looping thing relying on persistence of vision or anything like that. 7-digit numbers required the frame rate to be 333 fps. Another student reproduced a 9-digit number from a single frame at 300 fps. These were the average results. The record was a correct reproduction of a 7-digit number from a single viewing of a single frame at 2000 Hz (a 0.5 ms exposure), which was the limit, within 2% accuracy, of the tachistoscopic equipment in question. From the progression of the students chasing records, no slowing of their progress was ever in sight. The later papers from this author involve considerable engineering difficulty in constructing an even faster tachistoscope and are limited by 1930s-1940s technology.
This research led the US Navy in WW2 to adopt tachistoscopic training methods for aircraft recognition, replacing the WEFT paradigm (which had approximately a 0% success rate) with a 1-frame-at-75-fps paradigm, which led to 95% of cadets reaching 80% accuracy on recognition, and 100% of cadets reaching 62.5% accuracy, after just 50 sessions.
Yes, humans can see 2000 fps. Yes, humans can see well beyond 2000 fps in later work from this researcher.
https://doi.org/10.1080/00223980.1945.9917254
Yes, humans can detect flicker well above 1000 fps in daily life at the periphery of vision, where photoreceptors can fire in response to a single photon and our edge-detection circuits operate at a far higher frequency than our luminance and flicker-fusion circuits. Here's flicker being discriminated from steady light at an average of 2 kHz for 40-degree saccades, with an upper limit above 5 kHz during 20-degree saccades, which would be much more typical for eyes on a computer monitor.
There is no known upper limit to the flicker frequency that human vision can detect. As far as I know, all studies (such as the one I link) have always measured up to the reliable detection limit of their equipment, never up to a human limit.
Presumably the 3200 Hz is needed for a combination of reasons:
- Under ideal conditions, if you want less than 10% variation in the number of samples per frame at 240Hz, you may need ~2400Hz. This effect is visible even by human eyeballs — you can see multiple cursor images across your field of view, and uneven spacing is noticeable (see the measurement sketch below).
- The mouse itself may work less well at a lower sampling rate.
- The OS and input stack may be poorly designed and work better at higher rates.
In any case, the application and cursor implementation are unlikely to ask for a mouse location more than once per frame, so the user is not really using 3200 updates per second, but that’s irrelevant.
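If you want to check the samples-per-frame point on your own setup, here's a rough browser sketch. I'm assuming coalesced pointer events are a reasonable proxy for what the mouse actually delivered during each frame:

```ts
// Count raw pointer updates delivered per animation frame while the mouse is moving.
// getCoalescedEvents() exposes the samples the browser folded into one pointermove.
let samplesThisFrame = 0;
let lastMoveTime = 0;

window.addEventListener("pointermove", (e: PointerEvent) => {
  samplesThisFrame +=
    typeof e.getCoalescedEvents === "function" ? e.getCoalescedEvents().length : 1;
  lastMoveTime = performance.now();
});

function frame(now: DOMHighResTimeStamp): void {
  // Only report frames while the mouse has moved recently; idle frames aren't interesting.
  if (now - lastMoveTime < 100) {
    console.log(
      samplesThisFrame === 0
        ? "frame with no pointer update" // the "missed pointer update" case from the quote
        : `pointer samples this frame: ${samplesThisFrame}`
    );
  }
  samplesThisFrame = 0;
  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);
```

With a 1000 Hz mouse on a 240 Hz display you'd expect roughly 4 samples per frame, so the frame-to-frame spread in that count is what the "less than 10% variation" bullet is about.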
There are other differences between the tools; mine was designed around what I wanted to understand, so I'm biased toward it.
This delay wasn't present on the Logitech gaming mouse I previously used, probably thanks to a combination of a high polling rate (500 Hz) and a much longer idle delay. The battery life was also much shorter, only 250 hours in high-performance mode, but I just recharged a set of AA batteries every week, so it was never an issue.
I ended up returning the Marathon mouse.
Neat tool, though. I'm also very sensitive to latency.