The educational system is currently being offered something fundamentally unclear in relation to AI: a system that makes mistakes. Imagine an honest dialogue between a minister of education and the chief developer of an artificial intelligence company.
Yes, the system can sometimes say things that are not true. It can sometimes flatter the user inappropriately, reinforcing incorrect assumptions. How capable it is of making reasonable judgments, we do not know; in principle, this is not clearly defined. What will happen tomorrow is unknown, although everyone expects very large progress. And what happened yesterday is: "we made two revolutionary steps, one after another, over the last two years."
“Over the last couple of years, almost everything that specialists in artificial intelligence were taught has become significantly outdated.”
You see: the educational system has nothing to rely on. Nothing is stable enough to serve as a foundation.
Nevertheless, the situation directly and very strongly concerns both the educational system and the qualification system. Everyone clearly understands that some large-scale reform is needed. But what exactly should be reformed? A system that even today is "unknown what," "unknown how," "unknown to what extent"... yet it is clear that everything is changing and becoming obsolete very quickly.
On the one hand, it seems reasonable simply to wait, perhaps one year, perhaps two, perhaps five, until the situation stops changing so rapidly. On the other hand, it is clear that during those years young people will drift significantly away from the educational system toward artificial intelligence, or possibly toward something else entirely.
At the same time, almost all teachers face the threat of a personal career catastrophe. A very important part of their work is rapidly losing value before their eyes. This directly threatens severe demotivation and large-scale institutional breakdown.
In my view, it is impossible to take any reasonable large-scale reform actions in a situation that itself is unclear, undefined, and rapidly changing. And of course, a similar situation has already emerged among other professionals — programmers, doctors, consultants, patent specialists, digital artists, video creators, and others.
This is happening even though programmers used to consider themselves among the most intellectually advanced, educators were accustomed to acting as the most confident and authoritative, and patent professionals were the ones who knew the most about inventions. Now all of that is collapsing!
Today, however, a customer may suddenly write several pages of source code. A student may understand a topic much more deeply than a teacher. An inventor entering the patent system may already have studied dozens of nearby inventions in their field. And similar situations are appearing across many professions.
In my opinion, all relevant authorities that are unable to make adequate, rational decisions (setting aside the various alarmist concerns about future risks) must recognize and accept this as a temporary, UNCERTAIN state of emergency and act accordingly, so as to avoid committing to foolish decisions in any one direction, decisions that would in any case prove inadequate.
It is difficult to come up with procedures that are good for both the caterpillar and the butterfly.
A much larger article that expands on and details this post, with additional explanations of different aspects of the topic, can be found on Zenodo by searching for the author "Kokhan, Serhii G."