Hey HN, we built ai-bom because we kept finding undocumented AI stuff in production. Devs ship LLM calls, agent frameworks, and MCP servers without anyone reviewing them - shadow IT, but for AI.
We also built an n8n community node (npm install n8n-nodes-trusera) that lets you scan all your n8n workflows for AI components directly inside n8n. As far as we know this is the first tool that does this - n8n is huge for AI automation but completely invisible to security tools.
Existing SBOM tools (Trivy, Syft, Grype) don't catch any of this. They scan packages and deps but miss things like a LangChain agent calling GPT-4 with a hardcoded API key, or an n8n workflow running 12 AI nodes nobody knew about.
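To make that concrete, here's the kind of code we're after (purely illustrative - made-up model and key, not our actual detection logic):

    # A dependency scanner sees "langchain-openai" as just another package.
    # What it misses: an agent wired to a hosted LLM with a credential inline.
    from langchain_openai import ChatOpenAI

    llm = ChatOpenAI(
        model="gpt-4o",
        api_key="sk-proj-XXXXXXXXXXXX",  # hardcoded key that never went through review
    )
    print(llm.invoke("summarize this customer ticket").content)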
We wrote 13 scanners that detect LLM providers, agent frameworks (LangChain, CrewAI, AutoGen), model files, MCP configs, n8n AI nodes, hardcoded credentials, Docker AI containers, and more. The output is a CycloneDX SBOM (rough sample below), plus SARIF for GitHub Code Scanning.
pipx install ai-bom
ai-bom scan .
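If you haven't used CycloneDX before, detected AI components end up as entries roughly like this (shape follows the CycloneDX 1.5 spec; the values and the property name are made up for illustration, not our exact schema):

    {
      "bomFormat": "CycloneDX",
      "specVersion": "1.5",
      "components": [
        {
          "type": "machine-learning-model",
          "name": "gpt-4o",
          "supplier": { "name": "OpenAI" },
          "properties": [
            { "name": "aibom:detected-by", "value": "llm-provider-scanner" }
          ]
        }
      ]
    }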
Takes about 45 seconds on a typical repo.
Part of the motivation was EU AI Act Article 53 (Aug 2025) requiring orgs to keep an AI component inventory. But the real use case is security teams just trying to figure out what AI is running in their infra.
Curious to hear:
- What AI patterns are we missing?
- How do you track AI usage in your org today?
- Anyone dealing with EU AI Act compliance yet?