One last rant point: you don't have "the manual" for math the way you have your programming language's man pages, so there is no single source of truth.
Everybody assumes...
Your rant, with the sides reversed, would be akin to this: "It's surprising how many different ways there are to describe the same thing. E.g., see all the notations for dictionaries (hash tables? associative arrays? maps?) or lists (vectors? arrays?).
You don't have "the manual" of programming languages. "
All of which is compounded by the desire to provide minimal "proofs from the book" and leave out the intuitions behind them.
Do you know the reason for that? The reason is that those problems are both open and easy to understand. For the rest of the open problems, you need expertise to even understand the problem statement.
If we are already venturing outside the scientific realm with philosophy, I'm sure fields like literature or politics are older. Especially since philosophy is just a subset of literature.
As far as anybody can tell, mathematics is way older than literature.
The oldest known accounting tokens are from around 7000 BCE and show a proper understanding of addition and multiplication.
The people who made the Ishango bone 25k years ago were probably aware of at least rudimentary addition.
The earliest writings are from the 3000s BCE, and are purely administrative. Literature, by definition, appeared later than writing.
ekjhgkejhgk•39m ago
You know the meme with the normal distribution where the far right and the far left reach the same conclusion for different reasons, and the ones in the middle have a completely different opinion?
So on the far right you have people like von Neumann who say "In mathematics we don't understand things". On the far left you have people like you who say "me no mats". Then in the middle you have people like me, who say "maths is interesting, let me do something I enjoy".
srean•11m ago
To date I have not met anyone who thought he summed the infinite geometric series term by term. That would take infinite time. Of course he used the closed-form expression for the sum of a geometric series.
The joke is that he missed the clever solution, which does not require setting up the series, recognising it's a geometric progression, and then using the closed form.
The clever solution just finds the time until the trains collide, then multiplies it by the bird's speed. No series needed.
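For concreteness, here is the arithmetic with made-up numbers (the comment gives none): say the trains start d = 100 miles apart, each moving at v = 50 mph, and the bird flies at u = 75 mph.

    % Series route: the bird and the oncoming train close at u + v, so the
    % first leg has length ud/(u+v); each subsequent gap shrinks by the
    % ratio r = (u-v)/(u+v), giving a geometric series with a closed form:
    \[
      \sum_{k=0}^{\infty} \frac{ud}{u+v}\left(\frac{u-v}{u+v}\right)^{k}
      = \frac{ud}{u+v}\cdot\frac{u+v}{2v} = \frac{ud}{2v}
    \]
    % Clever route: the trains close at 2v, so they collide at t = d/(2v),
    % and the bird simply flies at speed u for that long:
    \[
      u\cdot\frac{d}{2v} = 75 \cdot \frac{100}{100} = 75 \text{ miles}
    \]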
ekidd•12m ago
To use an example from functional programming, I could say:
- "A monad is basically a generalization of a parameterized container type that supports flatMap and newFromSingleValue."
- "A monad is a generalized list comprehension."
- Or, famously, "A monad is just a monoid in the category of endofunctors, what's the problem?"
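To make the first description concrete, here's a minimal Haskell sketch (my example, not from the comment): two unrelated "containers", lists and Maybe, sharing the same flatMap ((>>=)) and newFromSingleValue (pure) interface.

    -- List monad: (>>=) is concatMap, so each element can expand to
    -- zero, one, or many results.
    halveEvens :: Int -> [Int]
    halveEvens n = if even n then [n `div` 2] else []

    -- Maybe monad: (>>=) chains steps that may fail, short-circuiting
    -- on the first Nothing.
    safeDiv :: Int -> Int -> Maybe Int
    safeDiv _ 0 = Nothing
    safeDiv x y = Just (x `div` y)

    main :: IO ()
    main = do
      print ([8, 7, 4] >>= halveEvens)      -- [4,2]
      print (pure 3 :: [Int])               -- [3], i.e. newFromSingleValue
      print (safeDiv 100 5 >>= safeDiv 60)  -- Just 3
      print (safeDiv 100 0 >>= safeDiv 60)  -- Nothing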
The basic idea, once you get it, is trivial. But the context, the familiarity, the basic examples, and the relationships to other ideas take a while to sink in. And once they do, you ask "That's it?"
So the process of understanding monads usually isn't some sudden flash of insight, because there's barely anything there. It's more a situation where you work with the idea long enough and you see it in a few contexts, and all the connections become familiar.
(I have a long-term project to understand one of the basic things in category theory, "adjoint functors." I can read the definition just fine. But I need to find more examples that relate to things I already care about, and I need to learn why that particular abstraction is a particularly useful one. Someday, I presume I'll look at it and think, "Oh, yeah. That thing. It's why interesting things X, Y and Z are all the same thing under the hood." Everything else in category theory has been useful up until this point, so maybe this will be useful, too?)
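(If it helps anyone else on the same project, here is one adjunction that already lives in everyday code. This is my suggestion, not anything from the comment above: pairing with an s is left adjoint to taking functions from an s, and the natural isomorphism between the two hom-sets is exactly curry/uncurry.)

    -- The product/exponential adjunction (-, s) -| (s -> -):
    -- maps (a, s) -> b correspond one-to-one with maps a -> (s -> b).
    leftToRight :: ((a, s) -> b) -> (a -> s -> b)
    leftToRight f a s = f (a, s)     -- this is curry, modulo argument order

    rightToLeft :: (a -> s -> b) -> ((a, s) -> b)
    rightToLeft g (a, s) = g a s     -- and this is uncurry

    main :: IO ()
    main = do
      let add (x, y) = x + y :: Int
      print (leftToRight add 3 4)                   -- 7
      print (rightToLeft (leftToRight add) (3, 4))  -- 7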