E.g. for Rust: a crate is published to crates.io -> this triggers an automatic docs build on docs.rs -> Dash clients can then pull docsets through a proxy that builds them from the static HTML bundles generated for docs.rs.
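To make the proxy step concrete, here is a rough sketch of what the docset-building leg could look like for a single crate, assuming the proxy mirrors the rendered HTML rather than tapping docs.rs's internal storage; the crate name, version, and use of the `dashing` generator are illustrative assumptions, not part of any existing service.

```shell
# Hypothetical proxy-side sketch: mirror the rendered docs.rs pages for one
# crate/version, then convert the HTML into a Dash docset. The crate/version
# and the dashing workflow here are illustrative assumptions.
CRATE=serde
VERSION=1.0.210
wget --recursive --no-parent --convert-links \
  "https://docs.rs/${CRATE}/${VERSION}/${CRATE}/"
cd "docs.rs/${CRATE}/${VERSION}" || exit 1
dashing create   # writes a dashing.json template (CSS selectors -> index entries)
dashing build    # emits a .docset from the mirrored HTML
```

A real proxy would presumably also cache the result keyed by crate@version so repeated Dash pulls don't rebuild the same docset.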
The docs should probably be pinned to the version of the tool you have installed. Aside from that, pinning to a specific container hash (not tag) allows you to audit it and trust that malicious instructions haven’t been introduced into the docs.
```shell
# Pin to a specific image digest (not a tag) for production security
# Get the current digest from: https://hub.docker.com/r/keminghe/py-dep-man-companion/tags
docker pull keminghe/py-dep-man-companion@sha256:2c896dc617e8cd3b1a1956580322b0f0c80d5b6dfd09743d90859d2ef2b71ec6  # 2025-07-22 release example

# Or use latest for development
docker pull keminghe/py-dep-man-companion:latest
```
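For the other half of that suggestion (matching the bundled docs to the tool version you actually have installed), one could imagine per-version image tags; the tag scheme below is purely hypothetical and only works if the maintainer publishes such tags.

```shell
# Hypothetical: pull an image whose bundled docs match the locally installed
# uv version. The "uv-<version>" tag scheme is an assumption about how the
# image could be tagged, not something the project currently documents.
UV_VERSION=$(uv --version | awk '{print $2}')
docker pull "keminghe/py-dep-man-companion:uv-${UV_VERSION}"
```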
"Stop getting out-of-date Python package manager commands from your AI. Cross-reference latest official pip, poetry, uv, and conda docs with auto-updates."
What I think would be great is either you hosting a central server that's permanently available to the public and somehow convincing major AI service providers to query it for that narrow scope of tasks, or doing something similar for the open-source models available on Hugging Face.
Where this actually shines is with local LLMs (Ollama, etc.): smaller models, no API costs, fully offline, and the AI gets fresh docs without waiting months for model retraining cycles. Your point about convincing major providers to integrate something like Dash (https://kapeli.com/dash) would definitely be the ideal solution though.
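As a concrete local workflow, one could pull a fresh doc excerpt from the container and hand it to a local model via Ollama; the container's `search` command below is an assumed interface, not documented behavior, so substitute whatever lookup the image actually exposes.

```shell
# Hypothetical local-LLM flow: fetch a current doc excerpt from the container
# and ground a local model's answer in it. `search` is an assumed entrypoint.
EXCERPT=$(docker run --rm keminghe/py-dep-man-companion:latest \
  search "poetry add --group dev")
ollama run llama3.2 "Using only these docs, give the exact command: ${EXCERPT}"
```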
I definitely hear you on the broader ecosystem approach. Anything you've been working on in the same space?