Also, maybe this approach could yield stats on whether a given import was actually needed or not?
As it is, top-level imports IMHO should be reserved for modules that are genuinely needed at startup; everything else should be a local import. Getting everyone convinced of that is the main issue, though, as it really goes against how most Python modules are usually written (but the startup time saved in the apps I work on definitely makes it worth it).
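A minimal sketch of that function-local style (the fetch_page helper and the requests dependency are just illustrative, not from any particular codebase):

def fetch_page(url):
    # requests is imported on first call, not at application startup
    import requests
    return requests.get(url).text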
I believe the lazy import approach was pioneered in Mercurial, where it cut startup times by 3x.
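For reference, the standard library ships a hook for this; a sketch following the importlib.util.LazyLoader recipe from the importlib docs (the lazy_import helper name is mine):

import importlib.util
import sys

def lazy_import(name):
    # resolve the module spec, but defer executing the module body
    spec = importlib.util.find_spec(name)
    loader = importlib.util.LazyLoader(spec.loader)
    spec.loader = loader
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    loader.exec_module(module)  # with LazyLoader, execution happens on first attribute access
    return module

json = lazy_import("json")         # nothing heavy has actually run yet
print(json.dumps({"lazy": True}))  # the module body executes here, on first use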
It's dirty, but speeds things up vs putting all imports at the top.
import argparse
parser = argparse.ArgumentParser()
parser.parse_args()  # --help and argument errors exit here, before the slow import
import requests      # heavy import deferred until the args are known to be valid
is an annoying bodge that a programmer should not have to think about, as a random example.

For most teams I would be pretty skeptical of an internal Python fork, but the Python devs at HRT really know their stuff.
But yes, like you, I had a great experience.
Oh no. Look, I'm not saying you're holding it wrong; it's perfectly valid to host your modules on what is presumably NFS, and to have modules with side effects, but what if you didn't?
I've been down this road with NFS (and SMB, if it matters), and pain is the only thing that awaits you. It seems like they're feeling it. Storing what is spiritually executable code on shared storage was a never-ending source of bugs and mysterious performance issues.
I'd say if you see
from typing import Final
[...]
__all__: Final = ("a", "b", "c")
It's probably 99% safe to pull that from a quick pass over the AST (and you can cache the result for the later import if you want to be fancy).

Of course, should one be doing a star import in a proper codebase anyway?
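A minimal sketch of that quick AST pass (the extract_all name and the restriction to a literal tuple/list of strings are my own assumptions, not from the comment above):

import ast

def extract_all(source):
    # return the names in __all__ if it's a simple literal, else None
    tree = ast.parse(source)
    for node in tree.body:
        # handle both `__all__ = (...)` and `__all__: Final = (...)`
        if isinstance(node, ast.Assign):
            targets, value = node.targets, node.value
        elif isinstance(node, ast.AnnAssign) and node.value is not None:
            targets, value = [node.target], node.value
        else:
            continue
        if any(isinstance(t, ast.Name) and t.id == "__all__" for t in targets):
            if isinstance(value, (ast.Tuple, ast.List)) and all(
                isinstance(e, ast.Constant) and isinstance(e.value, str)
                for e in value.elts
            ):
                return [e.value for e in value.elts]
            return None  # __all__ exists but isn't a plain literal; fall back to a real import
    return None

Caching that result keyed on the file's mtime would cover the "be fancy" part.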