Edit: I guess the idea is that this is automatically discovering non-Python system dependencies and attempting to include them as well? Either way, the developers should probably get in touch with the people behind https://pypackaging-native.github.io/, which has been trying to identify and solve problems with using the standard Python ecosystem tools in the "PyData ecosystem". (This effort has led to proposals such as https://peps.python.org/pep-0725/.)
> Manylinux requires tools called auditwheel for Linux, delocate for macOS, and delvewheel for Windows, which do something like ldd to list the shared libraries.
From the auditwheel README: https://github.com/pypa/auditwheel :
> auditwheel show: shows external shared libraries that the wheel depends on (beyond the libraries included in the manylinux policies), and checks the extension modules for the use of versioned symbols that exceed the manylinux ABI.
> auditwheel repair: copies these external shared libraries into the wheel itself, and automatically modifies the appropriate RPATH entries such that these libraries will be picked up at runtime. This accomplishes a similar result as if the libraries had been statically linked without requiring changes to the build system. Packagers are advised that bundling, like static linking, may implicate copyright concerns.
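For a rough sense of the ldd-style analysis these tools begin with, here is a minimal Python sketch (Linux-only; the wheel filename below is hypothetical) that unpacks a wheel and lists the shared libraries each extension module links against:

    # Minimal sketch of an ldd-style dependency listing for a wheel's
    # compiled extension modules. Not auditwheel's actual implementation.
    import subprocess
    import tempfile
    import zipfile
    from pathlib import Path

    def external_libs(wheel_path: str) -> dict[str, list[str]]:
        deps = {}
        with tempfile.TemporaryDirectory() as tmp:
            with zipfile.ZipFile(wheel_path) as whl:
                whl.extractall(tmp)
            for so in Path(tmp).rglob("*.so*"):
                out = subprocess.run(["ldd", str(so)], capture_output=True, text=True)
                # ldd lines look like "libfoo.so.1 => /usr/lib/libfoo.so.1 (0x...)"
                deps[so.name] = [
                    line.split()[0] for line in out.stdout.splitlines() if "=>" in line
                ]
        return deps

    print(external_libs("example-1.0-cp312-cp312-linux_x86_64.whl"))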
PyInstaller docs: https://pyinstaller.org/en/stable/ :
> PyInstaller bundles a Python application and all its dependencies into a single package. The user can run the packaged app without installing a Python interpreter or any modules. PyInstaller supports Python 3.8 and newer, and correctly bundles many major Python packages such as numpy, matplotlib, PyQt, wxPython, and others.
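PyInstaller also documents a programmatic entry point that mirrors the CLI; a minimal example ("my_script.py" is a hypothetical script name):

    # Run PyInstaller from Python code, per its documented entry point.
    import PyInstaller.__main__

    PyInstaller.__main__.run([
        "my_script.py",
        "--onefile",    # bundle everything into a single executable
        "--windowed",   # no console window (for GUI apps)
    ])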
conda/constructor is a tool for creating installers from conda packages: https://github.com/conda/constructor
Grayskull creates conda-forge recipes from PyPI and other packages: https://github.com/conda/grayskull
conda-forge builds for Windows, macOS, and Linux on amd64 and arm64; and emscripten-forge builds conda packages for WebAssembly (WASM).
SBOM tools attempt to discover package metadata, which should include a manifest with per-file checksums. Can dependency auto-discovery discover package metadata relevant to software supply chain security?
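As a sketch of the kind of metadata in question, a per-file checksum manifest (akin to a wheel's RECORD file or an SBOM file list) can be computed with nothing more than hashlib; the directory path here is hypothetical:

    # Build a per-file SHA-256 manifest over a directory tree.
    import hashlib
    from pathlib import Path

    def manifest(root: str) -> dict[str, str]:
        return {
            str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
            for p in sorted(Path(root).rglob("*"))
            if p.is_file()
        }

    for path, digest in manifest("dist/example").items():
        print(f"{digest}  {path}")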
dvc is a workflow tool layered on git that supports Experiments: https://dvc.org/doc/start/experiments/experiment-tracking :
> Experiment: A versioned iteration of ML model development. DVC tracks experiments as Git commits that DVC can find but that don't clutter your Git history or branches. Experiments may include code, metrics, parameters, plots, and data and model artifacts.
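DVC's experiment-tracking docs center on the companion dvclive logger; a hedged minimal sketch (the parameter and metric names are made up):

    # Log params and metrics to a DVC experiment via dvclive (pip install dvclive).
    from dvclive import Live

    with Live() as live:
        live.log_param("learning_rate", 0.01)
        for epoch in range(3):
            live.log_metric("accuracy", 0.5 + 0.1 * epoch)
            live.next_step()  # marks the end of one step/epoch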
A sufficient packaging format must have per-file checksums and signatures. https://SLSA.dev/ says that TUF, Sigstore.dev, and/or OCI containers with signatures each suffice.
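To make "per-file checksums and signatures" concrete, here is a toy sign/verify step over a checksum manifest, using an Ed25519 keypair from the `cryptography` package; this is a conceptual sketch, not the API of TUF or Sigstore:

    # Sign a checksum manifest and verify the signature (conceptual only).
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    manifest_bytes = b"0f343b09...  dist/example/app.py\n"  # per-file digests
    signature = private_key.sign(manifest_bytes)

    public_key.verify(signature, manifest_bytes)  # raises InvalidSignature on tamper
    print("signature ok")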
From https://news.ycombinator.com/item?id=37808036 :
> conda-forge maintainer docs > Switching BLAS implementation: https://conda-forge.org/docs/maintainer/knowledge_base.html#...
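Relatedly, a quick way to check which BLAS/LAPACK a given NumPy build is actually linked against:

    # Print the BLAS/LAPACK configuration this NumPy build uses.
    import numpy as np

    np.show_config()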
rattler-build supports CPU microarchitecture levels and CUDA versions. Thus conda-forge packages may be more performant on modern CPUs and GPUs than the average PyPI package: https://news.ycombinator.com/item?id=41306658
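conda detects the host CPU with the archspec package (surfaced as the __archspec virtual package), which is what lets CPU-level-specific builds be selected; a small sketch, assuming archspec is installed:

    # Inspect the host CPU microarchitecture as conda/archspec sees it.
    import archspec.cpu

    host = archspec.cpu.host()
    print(host.name)          # e.g. "skylake"
    print(host.generic.name)  # best generic level, e.g. "x86_64_v3"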
gnat•6mo ago
Vagrant and Docker behind the scenes. Very cool, and a welcome step up from a tarball.