https://www.youtube.com/playlist?list=PL3C690048E1531DC7
walks you through step by step, builds the intuition and provides some historical context in the background.
despite "outdated" animation (15 years ago), still a great resource.
also available in 7 other languages.
Isn't that exactly what topological manifolds are for?
If, however, the (image of the) curve is a topological manifold (with the induced topology), then the (integer) dimension of the manifold will agree with the Minkowski dimension that's explained in the article (and also with the more commonly used Hausdorff dimension[0]).
I recall doing a 4th year "advanced mathematical techniques" course where they went over the basics of graph theory and the RSA algorithm. Discrete maths. This kind of thing is not the intensive calculus that people get told is math in high school, but a high schooler could do it if you showed them.
Rules you learn in maths are often based on much simpler, truer (I'll get back to this), things. I learned loads of "move this denominator up to this side, like an escalator!" and such and only when I was something like 13 I nearly shouted "EQUALS MEANS THE TWO SIDES ARE THE SAME THING" as I realised why so many of these god damn rules existed. It also explained suddenly why some things are OK to do and some aren't (notably taking roots).
The other key thing was finding out that these things are decisions. We can choose whatever rules we want and see what happens, and keep whatever is useful. Lots of the more interesting things I've seen later in life have been "well there's no number you can square and get a negative number. What if there was?" or "What if there was a way to raise to a real number, what would that mean?". There's no magical reason we can't divide by 0 but there's not a neat useful answer really, but you can totally define a setup where you can do that and have fun if you want.
Imaginary numbers and Argand diagrams are things my 6 year old can mess about with. He's still developing reasoning so certain things trip him up about seeing the process for solving a thing (fascinating to see where his comprehension is up to and the leaps that happen, I remember the first time seeing him pick up a chess piece, go to move it and before putting it anywhere say "no, if I do that you'll do X") but lots of things are very accessible. Powers are fun because you can make big things quickly, factorials too. Basic solving of equations I was only taught in secondary school yet I was taught "two bananas cost 30p and one banana and one apple costs 40p, how much is one apple" years before and that's just swapping fruit for letters.
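Spelled out, with b and a standing in for the banana and apple prices (my own labels, not how you'd present it to a six year old), that fruit problem really is just a two-line system:

```latex
% b = price of a banana, a = price of an apple (in pence)
\begin{align*}
  2b = 30 &\;\implies\; b = 15, \\
  b + a = 40 &\;\implies\; a = 40 - 15 = 25 .
\end{align*}
```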
I was always good at maths, but I think it's far more fun now I've understood more about what you can mess about with and how simple parts are at times.
Changes to how The Holy Curriculum is taught can occur over decades, but what is taught has changed hardly at all, despite the fact that frankly the curriculum was audibly creaking when I was doing it in the 1980s... at least, if you tried to hear the creaking. Even then it was obvious that what we were being taught had been frozen circa 1930 or 1940 and needed updating for a number of reasons. It's even more out of step with the world now, in every way, but here we are.
Personally, I'd say that geometry and trigonometry seemed the least useful out of the high school math sequence. Trigonometry seemed like "Algebra 3: this time with identities to memorize" and geometry seemed mostly like an homage to Euclid. In my day-to-day life, algebraic thinking comes up much more often than geometric thinking, although of course I know this is not universally true.
Geometry is the classic introduction to proof and rigor, but if I were the benevolent dictator of the math curriculum, I would try to accomplish that with basic combinatorics, graph theory, or even a tiny bit of abstract algebra (proving basic arithmetic facts from the real number axioms, maybe modular arithmetic).
I would cut out a lot of integration grinding. The concept of integration is extremely important, and I want the students introduced to the ideas of doing it symbolically, but the details of all of the manipulations are much less important than the concepts. I would retain only things that are of mathematical interest, like integration by parts (useful as an exercise in how much fluidity you have in doing math and a good check you understand the concept).
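(For reference, the identity I mean, with a standard one-line example:)

```latex
\[
  \int u \, dv \;=\; uv - \int v \, du ,
  \qquad \text{e.g.} \qquad
  \int x e^{x} \, dx \;=\; x e^{x} - \int e^{x} \, dx \;=\; (x - 1)\, e^{x} + C .
\]
```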
Symbolic differential equations I'd cut down a lot. I think there's a sense in which they are useful, but the utility is not revealed until a couple of semesters in. Even my college semester was frankly not that useful; you really need to dedicate yourself to them to get the value out of them. I'd put in some more work on numeric integration, and on working with computers to get it done. The concepts of differential equations are super, super, super important; the exact knowledge of how to solve this super precise form of this exact differential equation is not. More practical experience to teach intuition, less grinding on symbolic details. The net time spent on them if I were writing the curriculum might even go up, because I'd tip in some (very) basic chaos mathematics here instead of all that grinding.
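To give a flavour of the kind of computer-based numeric integration I mean, here's a minimal sketch using Euler's method on a toy equation (the equation, step count, and names are all just illustrative choices on my part):

```python
# Euler's method for dy/dt = -y, y(0) = 1 (exact answer: e^{-t}).
import math

def euler(f, y0, t0, t1, steps):
    """Integrate dy/dt = f(t, y) from t0 to t1 with a fixed number of Euler steps."""
    h = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y += h * f(t, y)
        t += h
    return y

approx = euler(lambda t, y: -y, 1.0, 0.0, 1.0, 1000)
print(approx, math.exp(-1.0))  # the two values agree to about three decimal places
```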
Matrices I'd have a hard think about. HN is at the epicenter of their utility, so it might be hard to see that for most people they're not all that useful in any sense. I might like to move them on to a very explicitly STEM-focused track.
If that sounds horrible, remember that my whole point here is that there's a lot of stuff that should be in the curriculum that currently isn't, or is just glossed over very briefly. For example, I'd like more work spent on basic financial math, both for personal finances and for things like running a few stochastic financial scenarios (integrate this into the stats curriculum, for instance). Which also brings up the idea of doing more simulations of somewhat larger statistical scenarios than you can solve in closed form, using Monte Carlo simulations; statistics and probability are very important, but for most people the subject spins off into the symbolic weeds when in fact most problems people will have in real life are not cleanly amenable to such things. (This might also help erase the implicit belief, which stats courses tend to instill, that everything is uniformly distributed.)
The curriculum as it is is also structured as an unmotivated series of solutions to problems the students don't have. I would try to give them the problems before the solutions. For example, if I'm building a stats curriculum around simulations that lead into symbolic math, rather than just handing them symbolic math from the beginning, imagine covering the Central Limit Theorem by giving out a couple of different simulation assignments that all "happen" to converge to it despite very different scenarios; then we can discuss and teach why the seemingly totally different scenarios produced such a similar outcome, rather than just handing it down from On High as a Solution to a problem none of them have.
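As a rough sketch of what those simulation assignments could look like (a toy example of my own, with arbitrary choices of distributions, sample size, and trial count):

```python
# Sample means from two very different distributions both come out looking bell-shaped.
import random
import statistics

def sample_means(draw, n=50, trials=2000):
    """Return `trials` means of samples of size `n` drawn with the function `draw`."""
    return [statistics.mean(draw() for _ in range(n)) for _ in range(trials)]

uniform_means = sample_means(lambda: random.uniform(0, 1))     # flat distribution
skewed_means = sample_means(lambda: random.expovariate(1.0))   # heavily skewed distribution

# Despite the very different inputs, histograms of both lists of means come out roughly
# normal, centred on the underlying expectations (0.5 and 1.0 respectively).
print(statistics.mean(uniform_means), statistics.stdev(uniform_means))
print(statistics.mean(skewed_means), statistics.stdev(skewed_means))
```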
You can't have things like that if you aren't willing to cut something else, because we are full up right now. There's a lot of room for people to get a more intuitive grasp of these things by working with them through practical numeric integration and Monte Carlo simulations. Maybe we'd do a brief section on enough 3D geometry to give a useful introduction to 3D modelling/CAD/CNC/3D printing/the whole complex of modern tools available that use some form of 3D modelling. There's so much useful stuff that's just aching to be let in that the culling of the old needs to be a bit aggressive after a century.
To be honest I'd not cover this dimensionality stuff, except perhaps as a special-interest side day or something. It's not bad to do a few of those, just to expose students to the diversity in math. I got graph theory like that in my own high school, and it can stay that way; I wouldn't amp it up any. Dimensionality does pivot nicely into fractals and fractals have notoriously pretty pictures associated with them, so maybe I'd sneak it in with that.
The limit, taken for each x, gives the mapping from x to a point in 2D.
The proof is by observing that the successive "jumps" are exponentially smaller, so for each x the sequence of approximating points is Cauchy and converges.
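Slightly more explicitly (with informal constants, writing f_n for the stage-n approximating curve, and assuming the usual construction where the stage-n curve stays inside cells of side 2^(-n)):

```latex
% For each fixed x, the stage-(n+1) curve stays inside the same side-2^{-n} cell as the
% stage-n curve, so the point f_n(x) can only move by at most the diameter of that cell:
\[
  |f_{n+1}(x) - f_{n}(x)| \;\le\; \sqrt{2}\cdot 2^{-n}
  \quad\Longrightarrow\quad
  (f_{n}(x))_{n} \text{ is Cauchy (uniformly in } x\text{), so the limit } f(x) \text{ exists.}
\]
```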
First, forget briefly about the Hilbert curve and just think about the unit square [0,1]^2.
Take any point (x,y) in the unit square; we can write x and y in binary, say x = 0.a1a2a3... and y = 0.b1b2b3... Then we can just define a new number z with binary representation 0.a1b1a2b2a3b3... And going the other way, given z in [0,1], we can take the odd-position binary digits of z to get x, and the even-position digits to get y.
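Here is a finite-precision sketch of that interleaving (truncating to a fixed number of binary digits; the 16-bit cutoff and the names are arbitrary choices of mine):

```python
def interleave(x, y, bits=16):
    """Map (x, y) in [0,1)^2 to z in [0,1) by alternating their binary digits."""
    z, scale = 0.0, 0.5
    for _ in range(bits):
        x *= 2; a, x = int(x), x - int(x)   # next binary digit of x
        y *= 2; b, y = int(y), y - int(y)   # next binary digit of y
        z += a * scale; scale /= 2
        z += b * scale; scale /= 2
    return z

def deinterleave(z, bits=16):
    """Recover (x, y) from z by splitting its digits back into odd/even positions."""
    x = y = 0.0
    xs, ys = 0.5, 0.5
    for _ in range(bits):
        z *= 2; a, z = int(z), z - int(z)   # odd-position digit -> x
        z *= 2; b, z = int(z), z - int(z)   # even-position digit -> y
        x += a * xs; xs /= 2
        y += b * ys; ys /= 2
    return x, y

print(deinterleave(interleave(0.3, 0.8)))  # ~(0.3, 0.8), up to the 16-bit truncation
```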
The problem with this specific mapping is that the function is not continuous. But if you are a bit more careful:
1. The first digit says "left half vs right half"
2. The second digit says "top half vs bottom half" (of the rectangle from 1)
3. The third digit says "left half vs right half" (of the square from 2)
etc.
and then if two numbers share the first n binary digits (i.e. your points are close on the real line) then the corresponding points in the plane will also be quite close together (they are inside the same square / rectangle with side length like 2^(-n/2) at step n).
The "reason" why the dimension is different is precisely because of the "n/2": for every n digits of precision you have in the number z, you only get n/2 digits of precision for each of (x, y).
This is a bit imprecise because of issues with non-unique binary representation but (at least for me) it captures the spirit of why this should work!
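For what it's worth, the same "n/2" observation gives the box-counting (Minkowski) dimension directly, at the same informal level:

```latex
% An interval of z-values of length 2^{-n} maps into a box of side about 2^{-n/2},
% so the 2^{n} such intervals covering [0,1] cover the image with about 2^{n} boxes of
% side \varepsilon = 2^{-n/2}, i.e. N(\varepsilon) \approx \varepsilon^{-2}.  Hence
\[
  \dim_M \;=\; \lim_{\varepsilon \to 0} \frac{\log N(\varepsilon)}{\log(1/\varepsilon)}
        \;=\; \lim_{n \to \infty} \frac{\log 2^{n}}{\log 2^{n/2}} \;=\; 2 .
\]
```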
Thus this is a constructive proof, and you can use it to easily convert Hilbert indices and coordinates back and forth between each other. Again, reading the algorithm, I'm pretty sure that if you give the algorithm arbitrarily detailed numbers out to infinity, the obvious extension would "work" in that you can slap a "limit to infinity" on the algorithm in both cases and it would converge to any given point you like.
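For the finite case, here is a sketch of the widely used iterative index/coordinate conversion (this is the well-known algorithm from the Wikipedia "Hilbert curve" article, not necessarily the exact one used in the article under discussion):

```python
def _rot(n, x, y, rx, ry):
    """Rotate/flip a quadrant so the sub-curve has the right orientation."""
    if ry == 0:
        if rx == 1:
            x = n - 1 - x
            y = n - 1 - y
        x, y = y, x
    return x, y

def d2xy(n, d):
    """Index d along the curve -> (x, y) on an n-by-n grid (n a power of two)."""
    x = y = 0
    t, s = d, 1
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        x, y = _rot(s, x, y, rx, ry)
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

def xy2d(n, x, y):
    """(x, y) on an n-by-n grid -> index d along the curve."""
    d = 0
    s = n // 2
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        x, y = _rot(n, x, y, rx, ry)
        s //= 2
    return d

n = 2 ** 8                    # 256 x 256 grid; finer grids approximate the limit curve better
x, y = d2xy(n, 12345)
assert xy2d(n, x, y) == 12345  # the conversion round-trips exactly
print(x / n, y / n)            # approximate point in [0,1]^2 at parameter 12345 / n**2
```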
Topological dimension is indeed something one can define: e.g. the Koch snowflake [1] or the graph of the Weierstrass function [2] have topological dimension 1. Actually, the first is homeomorphic to the unit circle and the second is homeomorphic to the real line. It's great if you are doing topology and you only care about what things look like up to homeomorphism. But if you have a metric structure (and you care about it), it is not so useful.
Minkowski dimension is certainly easy to define, but it has some problems: sets which are "very small" (like the sequence `1/log(n)`) can have Minkowski dimension 1. The article has a minor technical oversight: the limit certainly does not need to exist. Minkowski defined it as the limit superior (actually, he defined it in terms of the decay rate of the size of the neighbourhood of the set, but this is equivalent). But one could analogously define a "lower" variant by taking the limit inferior instead.
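In symbols, one standard formulation (writing N_ε(A) for the number of ε-boxes needed to cover a bounded set A):

```latex
% Upper (limsup) and lower (liminf) Minkowski / box-counting dimensions.
\[
  \overline{\dim}_M(A) = \limsup_{\varepsilon \to 0} \frac{\log N_\varepsilon(A)}{\log(1/\varepsilon)},
  \qquad
  \underline{\dim}_M(A) = \liminf_{\varepsilon \to 0} \frac{\log N_\varepsilon(A)}{\log(1/\varepsilon)} .
\]
```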
Hausdorff dimension is not discussed in this article, but it is probably the most "robust" notion of dimension one can define. The Hausdorff dimension of any sequence is 0. But even then, lots of sets with Hausdorff dimension 1 can be very small, like the fat Cantor set, which has dimension 1 yet is nowhere dense and contains no interval [3]. So this 'dimension' does not necessarily line up with the intuition for "1-dimensional" in esoteric circumstances.
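For comparison, the usual definition via the s-dimensional Hausdorff measure (not given in the article; included here only for reference):

```latex
% s-dimensional Hausdorff measure and Hausdorff dimension of a set A.
\[
  \mathcal{H}^s(A) = \lim_{\delta \to 0}\,
    \inf\Bigl\{ \sum_i (\operatorname{diam} U_i)^s \;:\; A \subseteq \bigcup_i U_i,\ \operatorname{diam} U_i \le \delta \Bigr\},
  \qquad
  \dim_H(A) = \inf\{\, s \ge 0 : \mathcal{H}^s(A) = 0 \,\}.
\]
```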
But even Hausdorff / Minkowski dimension does not capture the essence of some matters. For example, one might be interested in when a certain space can be mapped into another space without too much distortion (let's say by a map which respects the metric, like a bi-Lipschitz map). It can easily happen that a set has small (finite) Hausdorff or Minkowski dimension, but it cannot be embedded in a non-distorting way in any finite-dimensional Euclidean space. This happens for instance with the real Heisenberg group [4]. If you are interested in this type of problem then you want something like Assouad dimension [5].
The moral of the story is: the correct notion of dimension depends critically on what you want to do with your notion of 'dimension'. For sets which are very nice (smooth manifolds) all "reasonable" notions of dimension will coincide with what you expect; but beyond this there is an infinite zoo of ways to define dimension which are all reasonable in various ways, but capture genuinely different notions of 'size'.
[1]: https://en.wikipedia.org/wiki/Koch_snowflake
[2]: https://en.wikipedia.org/wiki/Weierstrass_function
[3]: https://en.wikipedia.org/wiki/Smith%E2%80%93Volterra%E2%80%9...
You can track a point to create a line, you can shift the line to make a plane, you can move the plane to create a space, you can change the space to create time, you can observe time to create bliss, you can reflect on bliss to create thought, you can use thought to create an idea and you can use that idea to make a thingy.
You can create as many dimensions as you like, all perpendicular to each other - one way or another. But I promise you this: even though the line is needed for the thingy, the line is blissfully unaware of this fact.
I think that's oversimplifying an important point. If you build a Hilbert curve in a 1x1 square, the vertices of the curve always have rational coordinates. So all points on its line segments must always have at least one rational coordinate. There's no way it can cross through every point in a square region of R^2.
A better way to say it might be "the gaps are reduced towards zero and the curve will pass arbitrarily close by every point in its envelope". That still explains why its Minkowski dimension must be 2.