From Wikipedia:
"Some of the mined uranium was found to have a lower concentration of uranium-235 than expected, as if it had already been in a nuclear reactor. When geologists investigated they also found products typical of a reactor. They concluded that the deposit had been in a reactor: a natural nuclear fission reactor, around 1.8 to 1.7 billion years BP – in the Paleoproterozoic Era during Precambrian times, during the Statherian period – and continued for a few hundred thousand years, probably averaging less than 100 kW of thermal power during that time. At that time the natural uranium had a concentration of about 3% 235U and could have reached criticality with natural water as neutron moderator allowed by the special geometry of the deposit."
wjnc•1h ago
I only know (or knew) high-school physics, and when I enter this into Claude I get an answer I'm unable to verify. Claude says 680 kWh is gained per 0.03 grams of U-235 lost to fission. I'm left wondering what the U-235 fizzed into (sorry, pun) and whether I should take that into account.
Edit: There we go with modernity. I went to Claude instead of Wikipedia, and Wikipedia at least has the answers. Thanks u/b800h. 100 kW of heat on average. I can start filling in the blanks now.
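The 680 kWh figure checks out with textbook numbers: each U-235 fission releases roughly 200 MeV (the mass lost becomes the kinetic energy of the fission products, which in turn becomes heat). A rough sketch, assuming 200 MeV per fission:

```python
AVOGADRO = 6.022e23    # atoms per mole
MEV_TO_J = 1.602e-13   # joules per MeV
E_FISSION_MEV = 200    # approximate energy released per U-235 fission

mass_g = 0.03
atoms = mass_g / 235.0 * AVOGADRO          # number of U-235 atoms in 0.03 g
energy_J = atoms * E_FISSION_MEV * MEV_TO_J
energy_kWh = energy_J / 3.6e6              # 1 kWh = 3.6e6 J
print(f"{energy_kWh:.0f} kWh")             # close to Claude's 680 kWh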