I wonder who "everyone" was; I'm not aware of many high-profile projects adopting PCG as a default. As of 2025, several high-profile runtimes (including all the major browsers) use xorshift variants. [1]
Is there a list of users of PCG?
[1] See Adoption section in https://prng.di.unimi.it/
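For concreteness, the xorshift-family generator most major JS engines use for Math.random() is xorshift128+. A minimal sketch (the seed values here are placeholders, not anything an engine actually uses):

```c
#include <stdint.h>

/* xorshift128+ with Vigna's revised shift constants (23, 18, 5).
   The state must never be all zero. */
static uint64_t s[2] = {0x0123456789abcdefULL, 0xfedcba9876543210ULL}; /* placeholder seed */

uint64_t xorshift128plus(void) {
    uint64_t s1 = s[0];
    const uint64_t s0 = s[1];
    s[0] = s0;
    s1 ^= s1 << 23;
    s[1] = s1 ^ s0 ^ (s1 >> 18) ^ (s0 >> 5);
    return s[1] + s0;   /* the "+" scrambler that gives the family its name */
}
```

The addition at the end is the only non-linear-looking step; the rest is linear over GF(2), which is why the raw outputs are so cheap to compute.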
> O'Neill proposes testing PRNGs by applying statistical tests to their reduced-size variants and determining the minimum number of internal state bits required to pass.[7] TestU01's BigCrush examines enough data to detect a period of 2^35, so even an ideal generator requires 36 bits of state to pass it. Some very poor generators can pass if given a large enough state;[8] passing despite a small state is a measure of an algorithm's quality, and shows how large a safety margin exists between that lower limit and the state size used in practical applications. PCG-RXS-M-XS (with 32-bit output) passes BigCrush with 36 bits of state (the minimum possible), PCG-XSH-RR (pcg32() above) requires 39, and PCG-XSH-RS (pcg32_fast() above) requires 49 bits of state. For comparison, xorshift*, one of the best of the alternatives, requires 40 bits of state,[5]: 19 and Mersenne twister fails despite 19937 bits of state.[9]
[1] https://numpy.org/doc/stable/reference/random/bit_generators...
BTW, people have broken PCG already: https://hal.science/hal-02700791/file/main.pdf
It takes up to 20,000 CPU hours to recover the seed from 512 output bits with an unknown state, increment, and multiplier (the multiplier is usually a fixed constant).
Showing that reversal takes that many CPU hours shows how good the PRNG quality is.
All else being equal, I don't think it is possible for a trivially reversible generator to have better statistical properties than a generator whose output behaves more like a CSPRNG.
It can definitely be good enough and/or faster, though.
Why shouldn’t I just use eg sha512 on the previous hash and drop half the bits?
PRNGs are not meant to be cryptographically secure. If you don't want recoverability by all means use SHA512 or a proper CSPRNG.
But saying PRNGs are bad because there is recoverability is like saying salt is bad because it isn't sweet. PRNGs are not meant for non-recoverability and salt isn't meant to be sweet.
In my conception time is made out of events, and the events are I suppose all material, and all have probabilities. So maybe time follows inevitably from matter. But I think it exists in its own right as a phenomenon that isn't material. There are such things. Knowledge is another one.
He is responsible for multiply-with-carry, xorshift (the original version), KISS (a high-quality generator predating the Mersenne Twister), the Ziggurat algorithm, and the Diehard test suite.
Fun fact, one of the earliest methods for generating random numbers, the middle square method, actually still passes all modern statistical randomness test suites if you hook up a Weyl sequence to it: https://arxiv.org/abs/1704.00358
This, the middle-square Weyl sequence PRNG, is my favorite PRNG, because it's simple enough to implement from memory:
#include <stdint.h>

uint64_t x, weyl;
const uint64_t CONSTANT = 0xB5AD4ECEDA1CE2A9ULL; /* odd, no repeated/zero nibbles */

uint32_t msws(void) {
    x = x * x + (weyl += CONSTANT);  /* square, then add the advancing Weyl sequence */
    return x = (x >> 32) | (x << 32); /* swap halves; return truncates to 32 bits */
}
You just take a number, square it, advance the Weyl sequence and add it, and finally swap the lower and upper 32 bits, using the truncated result as the output. The CONSTANT is pretty much arbitrary; it just needs to be odd and not too regular. A good rule of thumb is to have no repeating or zero nibbles in each group of 4 bytes, e.g. 0xB5AD4ECEDA1CE2A9.
I've tested msws32: it passes TestU01's BigCrush and didn't fail in >=1 TB of PractRand (I stopped after that). A scaled-down msws16 fails PractRand after 2 GB, and an msws24() variant passes >=256 GB (I stopped after that).
It's certainly not as good as more state of the art PRNGs like PCG, xoshiro, romu, sfc64, tylo64, but it is very simple and has quite high quality output, much better than any similarly simple to construct PRNG I know of.
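For anyone who wants to play with the scaled-down variants: here's how an msws16 might look. The halved widths and the 32-bit Weyl constant are my own guesses at the scaling, not necessarily the exact variant tested above:

```c
#include <stdint.h>

/* Hypothetical msws16: 32-bit square state, 32-bit Weyl state, 16-bit output. */
static uint32_t x16, weyl16;
static const uint32_t STEP16 = 0xB5AD4ECFu; /* assumption: odd, irregular nibbles */

uint16_t msws16(void) {
    x16 = x16 * x16 + (weyl16 += STEP16);          /* square + Weyl, mod 2^32 */
    return (uint16_t)(x16 = (x16 >> 16) | (x16 << 16)); /* swap halves, truncate */
}
```

The structure is identical to the 64-bit version; only the widths and the Weyl constant change.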
The renaming, and having the constant be a variable, confused me when skimming for the parts I was looking for.
So, the state is 128 or 256 for the versions presented and 64 for msws16.
I don't remember if running PractRand in word mode changes the way it reports results, but either way, failing at 2 GB would mean it failed even before going through the whole Weyl sequence, although the period itself isn't necessarily reduced.
I'm not sure if the middle-square is acting as a decent non-linear scrambler on the poor adder state or if both combined manage to hold 30 bits worth of state. Swapping the adder with an lcg or lfsr on msws16 would provide an answer.
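A quick sketch of that experiment: a 16-bit-output middle-square generator with the Weyl adder swapped for a 32-bit LCG step. The widths and the LCG constants are my assumptions (these particular constants are the Numerical Recipes pair; any full-period 32-bit pair would do):

```c
#include <stdint.h>

static uint32_t sq, lcg = 1;

uint16_t msws16_lcg(void) {
    lcg = lcg * 1664525u + 1013904223u;   /* full-period 32-bit LCG replaces the Weyl adder */
    sq = sq * sq + lcg;                   /* middle-square step, unchanged otherwise */
    return (uint16_t)(sq = (sq >> 16) | (sq << 16));
}
```

If this variant fails PractRand at a noticeably different point than the Weyl-based msws16, that would say something about how much the adder's structure (rather than just its period) contributes.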
PractRand has the benefit that we can look at where and how failure happens in these reduced versions so I think the criticism ultimately stands regarding the paper.
Read it and gain a gnawing sense of unease at how "good" things might really be at present!
I don’t get that part.
I gotta think there are going to be some periodics in there that will be toggling the LSB. Like some hum from some device far away will be at the right tiny amplitude to toggle it in a predictable way. Also the ADC hardware could conceivably get stuck.
The whole system breaks because someone didn't set up their pulseaudio correctly?
and what if you need 1TB of random data? With 48kHz audio you would be waiting 5000 years haha. 1MB is still more than a day
I think you dropped the "k" in "kHz" in your calculations.
From a particle physics perspective, as an observer in the electromagnetic spectrum, we're always observing through a reference frame based on the speed of light in relation to the observed object. Because it's always in reference to a constant, c, anything perceived at random can theoretically be measured if you had the position of the observer at the time of observation, right?
Am I way off-base here?
I got a 27 yesterday.