I built drift-guard to prevent this. It's a pure CLI tool (zero token overhead, no MCP server) that snapshots your design tokens and DOM structure, then checks for "drift" at the code level.
How it works:

1. init scans your CSS/HTML and snapshots all design tokens (colors, fonts, spacing, shadows, radius, layout, effects), plus a structural fingerprint of your DOM.
2. rules generates rule files for five AI tools (.cursorrules, CLAUDE.md, AGENTS.md, copilot-instructions.md, .clinerules).
3. check compares the current state against the snapshot and exits with code 1 if drift exceeds the threshold.
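In practice the three steps above are three commands. This is a sketch of a typical session; the subcommand names and the exit-code contract come from the description above, but exact flags and output are assumptions:

```shell
# 1. Snapshot the baseline (run once, on a known-good design)
drift-guard init

# 2. Generate rule files for the supported AI tools
#    (.cursorrules, CLAUDE.md, AGENTS.md, ...)
drift-guard rules

# 3. After an AI editing session, compare against the baseline.
#    Exits 0 when within the threshold, 1 when drift exceeds it,
#    so it composes with shell conditionals and CI pipelines:
if ! drift-guard check; then
  echo "drift detected -- review before committing"
fi
```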
Key design decisions:

- Zero token overhead: pure CLI, nothing injected into the model's context.
- Static analysis: uses css-tree and cheerio (no headless browser; runs in under a second).
- Stale snapshot warning: warns if the baseline is older than 7 days.
- Structure + style: monitors both CSS tokens and DOM hierarchy.
- Pre-commit hook: blocks drifted commits before they land.
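The pre-commit integration boils down to a few lines. Here is a hand-written sketch of such a hook; drift-guard may install its own, and the exact invocation is an assumption:

```shell
#!/bin/sh
# .git/hooks/pre-commit -- assumes drift-guard is on PATH
# (npm i -g @stayicon/drift-guard) and a baseline snapshot exists.
# Git aborts the commit when this script exits non-zero.
if ! drift-guard check; then
  echo "Blocked: design drift exceeds the baseline threshold." >&2
  echo "Review the changes, or re-run 'drift-guard init' to accept them." >&2
  exit 1
fi
```

Make the hook executable with chmod +x .git/hooks/pre-commit.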
GitHub: https://github.com/Hwani-Net/drift-guard
npm: npm i -g @stayicon/drift-guard
I'd love feedback! Is "Design Drift" something you've experienced with AI coding tools?