They start by discussing the difference between teachers who teach a subject ... and teachers who will discuss changing the foundations of a subject and the implications of big changes. Which is, of course, required for critical thinking.
But that's a BIG step up in skill from what normal teachers bring to the table. At that point you should be so versed in the subject that you can discuss how the subject is constructed, and why (e.g. the connection between calculus and war). Which, at minimum, requires knowledge of the subject itself; its history, including its failed history (which paths were not taken or were abandoned, like, say, alternatives to the axiom of choice, and why they were abandoned); and current research directions (like what the arguments are for and against various kinds of large cardinals. Hell, what large cardinals even are, the continuum hypothesis, ...).
I have a master's in math, and in 27 years I've had 4 teachers who had anything approaching that level of knowledge. I remember each of them vividly. And I agree with the article: with such teachers you learn 10x what you learn with "normal" teachers. But they are rare even in pure math university departments (which have also gotten worse by choosing cheaper over better candidates). Frankly, if you have that level of knowledge, you leave the teaching profession unless you're insane, because you can do so much better elsewhere.
In other words: AI can be a pretty sizeable improvement on the average teacher, and this paper makes the traditional argument against AI. The argument goes "AI doesn't (yet) beat the very best humans at X, so it is totally unusable for X" — when AI easily beats average humans. If anything, this is an argument to have those very best teachers switch to teaching AIs, and get rid of the average ones.
And of course, there's the undertone in the article that teaching children provides a measure of social control over society. Which, of course, is also already a problem. Every subject has extremely controversial parts, like the first applications of calculus (calculating ballistic trajectories — in other words, killing people from as large a distance as possible. THAT is why calculus was developed, and that is what it does very, very well). And if it's that controversial for math ... well, in social science papers, European scholars started arguing for a holocaust (removing "bad genes" by terminating incurable patients) at the beginning of the 20th century, when Hitler was a baby crying on his mother's lap. In fact, the invention/discovery and popularization of autism by Hans Asperger had the singular purpose of "purifying the genes of the great German people," not helping patients suffering from it. His words, not mine. In other words, really discussing a subject requires coldly and matter-of-factly discussing incredibly bad political ideologies, including when such ideologies are held by scientists and teachers (and pointing out just how bad they can get, how much damage they can do, and how science enables such ideologies to do incredible damage).
Society is a construct, and we teach children to be a part of it. If society is shit, we teach our children shit. That's it.
Can AI retrieve the knowledge for that? I guess that's possible.
Can AI make it meaningful and actually transfer that knowledge?
Human contact is very much wired into our DNA as part of how we learn something and truly make it our own. Trying to replace that with some form of text will rob learning of an essential part, with consequences we can't properly measure. We can only observe what happens to humans as they have less contact: increased loneliness and a lack of role models, both of which would likely worsen.
Honestly: where in the job market are such knowledge and skills actually appreciated?
My life experience says that these skills are more appreciated in academic teaching than nearly anywhere else in industry, but if you know better, I'd be interested to hear your perspective.