ChatGPT tells me that PA + "PA is consistent" is not quite enough. It has digested enough logic textbooks that I'm inclined to believe that claim.
> I think just PA+"PA is consistent" is enough?
It's not clear to me how. I believe PA + "PA is consistent" still admits a model in which Goodstein's theorem holds of every standard natural number, but which contains some nonstandard number N whose Goodstein sequence never terminates. I think that's exactly the kind of case ruled out by the stronger assumption of ω-consistency.
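For reference, the standard definition (my paraphrase; $\overline{n}$ denotes the numeral for $n$):

```latex
% A theory T in the language of arithmetic is ω-inconsistent when some
% formula φ(x) satisfies both of the following at once:
T \vdash \exists x\,\neg\varphi(x)
\qquad\text{and}\qquad
T \vdash \varphi(\overline{n})\ \text{for every standard } n ;
% ω-consistency is the absence of any such φ.
```

Taking φ(n) to be "the Goodstein sequence starting at n terminates," an ω-consistent theory can't prove that a counterexample exists while also proving, numeral by numeral, that no standard number is one -- which is the mismatch described above.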
The work linked here doesn't show that PA is inconsistent, however: it defines a new, weaker notion of what it would mean for PA to "prove its own consistency," and then shows that PA can do that weaker thing.
Interesting work for sure, but it won't mean anything to you unless you already know a lot of logic.
That is: real numbers describe real, concrete relations. For example, saying that Jones weighs 180.255 pounds asserts a real, physical relationship -- a ratio -- between Jones's weight and the standard pound. Because both weights exist physically, their ratio exists physically too. On this view, real numbers can be understood as ratios.
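In symbols, the measurement report is nothing more than the assertion of that ratio:

```latex
\frac{\mathrm{weight}(\text{Jones})}{\mathrm{weight}(\text{standard pound})} = 180.255
```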
In contrast, the common philosophical stance on numbers is that they are abstract concepts, detached from the actual physical process of measurement. Numbers are seen as external representations tied to real-world features through human conventions. This "representational" approach, influenced by the idea that numbers are abstract entities, became dominant in the 20th century.
But the 20th-century viewpoint is really just one interpretation (you could call it "Platonic"); and just as ratios can never be measured to infinite precision in the real world, absolutely nothing requires an uncomputable continuum of reals.
Physics least of all. In 20th- and 21st-century physics, things are discrete (quantized) and are very rarely measured to more than 50 significant digits. Infinite precision is never attainable, and precision to 2,000 significant digits is likewise impossible -- and not only because quantum mechanics rules out great precision at very small scales.

For example, imagine measuring the orbits of the planets and moons in the solar system. By the time you reach 50 significant digits, you need to account for the gravitational effects of the stars nearest the Sun; before you reach 100, you need to model the entire Milky Way; the further you push for precision, the exponentially larger your mathematical canvas must grow, and at arbitrarily high (but still finite) precision you'd have to model the whole observable universe -- which might itself be futile, since objects and phenomena outside observable space could affect your measurements, and so on. So although everything is in principle simulatable, and precision has a hard limit in a granular universe that can be described mathematically, measuring anything to arbitrarily high precision is beyond finite human effort.
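A back-of-the-envelope sketch of that growth. The masses, distances, and the point-mass tidal model below are all rough assumptions on my part, purely for illustration:

```python
# At what digit of orbital precision does a distant mass start to matter?
# The tidal (differential) acceleration across an orbit of radius r from a
# mass M at distance d is roughly 2*G*M*r/d^3; compare it to the Sun's own
# pull, G*M_sun/r^2. All figures are rough illustrative values.

import math

AU_PER_LY = 63241.0  # astronomical units per light-year (approximate)

def digits_until_it_matters(mass_solar: float, dist_ly: float,
                            r_au: float = 1.0) -> float:
    """Significant digits at which the perturber's tidal effect, relative to
    the Sun's pull on an orbit of radius r_au, becomes visible."""
    d_au = dist_ly * AU_PER_LY
    # (2*G*M*r/d^3) / (G*M_sun/r^2), with masses in solar masses:
    ratio = 2.0 * mass_solar * (r_au / d_au) ** 3
    return -math.log10(ratio)

# Alpha Centauri system: ~2 solar masses at ~4.37 light-years.
print(f"Alpha Centauri: ~{digits_until_it_matters(2.0, 4.37):.0f} digits")
# Whole galaxy, crudely lumped at the galactic center: ~1e11 solar masses.
print(f"Milky Way:      ~{digits_until_it_matters(1e11, 26000.0):.0f} digits")
```

Amusingly, with these crude numbers both effects enter around digit 16 -- even earlier than the 50 digits above -- so the qualitative point stands either way.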
btilly•17h ago
It covers both the limits of what can be proven from the Peano Axioms and how one would begin bootstrapping Lisp inside them. All of the bad jokes are in the second section.
Corrections and follow-up questions are welcome.
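For anyone wondering what "bootstrapping Lisp in the Peano Axioms" can even mean: the usual first move is to encode cons cells as natural numbers, so every Lisp datum becomes a single number that PA can talk about. A minimal sketch using the Cantor pairing function -- illustrative only, as I don't know which encoding the talk actually uses:

```python
# Encode cons cells as naturals via Cantor pairing, shifted by 1 so that
# the code 0 is free to mean nil. This is a sketch, not the talk's encoding.

import math

NIL = 0

def cons(a: int, b: int) -> int:
    """Cantor pairing of (a, b), shifted by 1 to reserve 0 for nil."""
    return (a + b) * (a + b + 1) // 2 + b + 1

def uncons(n: int) -> tuple[int, int]:
    """Invert cons: recover (car, cdr) from a nonzero code."""
    assert n != NIL, "nil has no car/cdr"
    n -= 1
    w = (math.isqrt(8 * n + 1) - 1) // 2   # diagonal index w = car + cdr
    b = n - w * (w + 1) // 2               # position on the diagonal = cdr
    return w - b, b

def car(n: int) -> int: return uncons(n)[0]
def cdr(n: int) -> int: return uncons(n)[1]

# The list (1 2 3) is just one natural number:
lst = cons(1, cons(2, cons(3, NIL)))
print(lst)            # 1539
print(car(lst))       # 1
print(car(cdr(lst)))  # 2
```

From there, symbols get number codes of their own and eval becomes an arithmetic function on codes -- at which point statements about Lisp programs are statements of arithmetic that PA can (sometimes) prove.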
anthk•49s ago
Ditto with some Forths.