After this was pointed out, pandemic "planning" in the UK simply went from weekly to monthly planning, following the same broken model...
It still turned out to be wildly wrong, over-predicting every single metric it was tasked with simulating by orders of magnitude.
Not to mention it couldn't load configs correctly, run correctly on the national academic supercomputer, or gracefully present any results/findings.
This was signed off _blindly_ by the cluster admins, academics, policy advisors, and international "experts". And once this had been demonstrated, there was significant push-back for over a week, insisting there must be a problem with the test methodology (which was simply running the model and *checking* the output multiple times). Ask me how I know there wasn't.
The whole field of pandemic modelling has, I'm sure, come on in leaps and bounds in recent years, but it's a shocking, sad truth that most, if not all, UG computing students with a 1st could have done a better job than these experts at the top of their field.
groundzeros2015•21m ago
I would not draw any of the conclusions the author did. This 3-variable growth-and-decay model (something like the sketch below) has nothing to do with what happened.
COVID was, to me, a good example of a "nerd trap": all these simulators, mapping tools, and exponential graphs got built. Having access to data can give one an illusion of understanding, or at least a distraction from what matters.
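For context, here is a minimal sketch of what a 3-variable growth-and-decay model typically looks like, assuming it resembles a standard SIR compartmental model (an assumption on my part; the author's actual model may differ):

```python
# Minimal SIR sketch (assumption: the "3-variable growth and decay model"
# is taken here to resemble a standard SIR compartmental model).
def sir(s, i, r, beta=0.3, gamma=0.1, dt=1.0, days=160):
    """Forward-Euler integration of dS/dt = -beta*S*I,
    dI/dt = beta*S*I - gamma*I, dR/dt = gamma*I (fractions sum to 1)."""
    history = []
    for _ in range(days):
        new_infections = beta * s * i * dt
        new_recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

# Example: 0.1% initially infected; I grows while beta*S > gamma,
# then decays as the susceptible pool is depleted.
trajectory = sir(s=0.999, i=0.001, r=0.0)
peak_infected = max(i for _, i, _ in trajectory)
print(f"peak infected fraction: {peak_infected:.3f}")
```

Simple enough to fit in a comment, which is rather the point: it captures exponential growth and decay, but none of the behavioural, policy, or network effects that actually drove the outcome.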
firesteelrain•12m ago
There is one positive thing despite RTO becoming a thing: remote work became more recognized, even if it's trending more hybrid.