Since launching LLMS Central (https://llmscentral.com) a few months ago, we're now tracking hundreds of AI bot visits daily across our network. The data is fascinating.
### What We're Seeing
*Daily Bot Traffic (Across Our Network):*
- 300-500+ AI bot visits per day
- GPTBot (ChatGPT) dominates at ~60% of traffic
- Claude, Perplexity, and Google's AI bots make up most of the rest
- Peak crawling hours: 2-4 AM UTC (training runs?)
*Real Patterns Emerging:*
- Technical documentation gets 5x more AI bot traffic than average content
- Blog posts with code examples are crawled 3x more frequently
- Sites with llms.txt files see 40% more organized crawling
- Most sites have zero visibility into AI bot activity
*Surprising Findings:*
1. AI bots are WAY more active than most people realize
2. They're not just training - they're actively crawling for real-time answers
3. Different bots have different content preferences (Claude likes long-form, Perplexity loves news)
4. Traditional analytics completely miss this traffic
### Technical Details
*Stack:*
- Next.js 15 (App Router)
- Firebase Firestore for analytics
- 2KB tracking script (loads async, so it stays off the critical rendering path; embed sketch below)
- Real-time user-agent detection + IP verification
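For context on what the embed looks like, here's a minimal sketch of dropping an async snippet into a Next.js App Router layout. The script URL and site ID are placeholders, not the actual production snippet:

```tsx
// app/layout.tsx - hypothetical async embed; the src and data-site-id are placeholders
import type { ReactNode } from "react";
import Script from "next/script";

export default function RootLayout({ children }: { children: ReactNode }) {
  return (
    <html lang="en">
      <body>
        {children}
        {/* lazyOnload defers the script to browser idle time, so it never blocks rendering */}
        <Script
          src="https://llmscentral.com/tracker.js" // placeholder URL
          data-site-id="YOUR_SITE_ID"              // placeholder ID
          strategy="lazyOnload"
        />
      </body>
    </html>
  );
}
```

`lazyOnload` keeps the snippet fully off the critical path; `afterInteractive` would fire it sooner if catching very short visits matters more to you.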
*Bot Detection:*
- User-agent parsing (GPTBot, Claude-Web, etc.)
- IP range verification (OpenAI, Anthropic, Google; simplified sketch below)
- Behavioral analysis (crawl patterns)
- 99%+ accuracy
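To make that concrete, here's a simplified sketch of the UA-plus-IP check in TypeScript. The bot patterns and the CIDR range are illustrative only - each vendor publishes its own ranges and they change over time - and this leaves out the behavioral-analysis layer entirely:

```ts
// Simplified sketch of AI-bot detection: user-agent pattern + IPv4 range check.
// Bot patterns and the CIDR range below are illustrative, not the real ranges.

type BotRule = { name: string; uaPattern: RegExp; ranges: string[] };

const AI_BOTS: BotRule[] = [
  { name: "GPTBot", uaPattern: /GPTBot/i, ranges: ["20.171.0.0/16"] }, // example range only
  { name: "ClaudeBot", uaPattern: /ClaudeBot|Claude-Web/i, ranges: [] },
  { name: "PerplexityBot", uaPattern: /PerplexityBot/i, ranges: [] },
];

// Convert a dotted IPv4 string to an unsigned 32-bit integer.
function ipv4ToInt(ip: string): number {
  return ip.split(".").reduce((acc, octet) => ((acc << 8) | parseInt(octet, 10)) >>> 0, 0);
}

// True if `ip` falls inside the IPv4 CIDR block `cidr`, e.g. "20.171.0.0/16".
function inCidr(ip: string, cidr: string): boolean {
  const [base, bitsStr] = cidr.split("/");
  const bits = parseInt(bitsStr, 10);
  const mask = bits === 0 ? 0 : (~0 << (32 - bits)) >>> 0;
  return (ipv4ToInt(ip) & mask) === (ipv4ToInt(base) & mask);
}

export function detectAiBot(userAgent: string, ip: string) {
  for (const bot of AI_BOTS) {
    if (!bot.uaPattern.test(userAgent)) continue;
    // Where published ranges are known, require the IP to match so a
    // spoofed user-agent alone isn't counted as a verified bot hit.
    const verified = bot.ranges.length === 0 || bot.ranges.some((r) => inCidr(ip, r));
    return { name: bot.name, verified };
  }
  return null; // not a recognized AI crawler
}
```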
*Privacy:*
- No PII collected
- GDPR compliant
- Users control data retention
- Open source tracking script (coming soon)
### Why I Built This
I noticed my technical blog posts were getting cited by ChatGPT, but Google Analytics showed nothing. Turns out AI bots don't show up in traditional analytics because they're not "users" - they're crawlers.
After manually parsing server logs for weeks (a rough version of that pass is sketched below), I realized:
1. This should be automated
2. There should be a standard for AI bot permissions (like robots.txt)
3. Sites need visibility into which AI systems are using their content
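If you want to do that manual pass on your own logs before reaching for any tool, something like this is all it takes. The log path and bot list are assumptions - adjust both for your server:

```ts
// count-ai-bots.ts - rough count of AI-crawler hits in an access log.
// The log path and bot names are assumptions; tweak them for your setup.
import { readFileSync } from "node:fs";

const BOT_NAMES = ["GPTBot", "ClaudeBot", "Claude-Web", "PerplexityBot"];
const log = readFileSync("/var/log/nginx/access.log", "utf8");

const counts: Record<string, number> = {};
for (const line of log.split("\n")) {
  const bot = BOT_NAMES.find((name) => line.includes(name));
  if (bot) counts[bot] = (counts[bot] ?? 0) + 1;
}

console.table(counts); // one row per detected bot with its hit count
```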
So I built LLMS Central - both a tracking platform AND a centralized repository for llms.txt files (the proposed standard for AI bot permissions).
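For anyone who hasn't seen one: the current proposal (llmstxt.org) is a plain Markdown file served at /llms.txt. A minimal illustrative example, with placeholder names and links:

```
# Example Site

> One-line summary of what the site covers, aimed at AI consumers.

## Docs

- [Getting started](https://example.com/docs/start.md): Install and first steps
- [API reference](https://example.com/docs/api.md): Endpoints and authentication

## Optional

- [Changelog](https://example.com/changelog.md): Release history, safe to skip
```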
### Features
1. *Real-time bot tracking* - See which AI crawlers visit your site
2. *Page-level analytics* - Know which pages AI bots prefer
3. *AEO scoring* - Measure Answer Engine Optimization (like SEO, but for AI)
4. *Multi-engine preview* - See how ChatGPT vs Claude would cite your content
5. *llms.txt generator* - Like robots.txt, but for AI (proposed standard)
### Try It
*Preview tool (no signup):* https://llmscentral.com/aeo-preview
*Full tracking (free tier):* https://llmscentral.com/dashboard
### The Data Keeps Growing
What started as a personal project is now tracking hundreds of domains. Every day we see:
- New AI bots appearing (just detected Meta's AI crawler last week)
- Crawling patterns evolving (bots are getting smarter about what they crawl)
- Sites realizing they have zero visibility into AI usage of their content
The most common reaction: "I had no idea ChatGPT was crawling my site this much."
### Questions
1. Should there be a standard for AI bot permissions (like robots.txt)? We're pushing llms.txt, but curious about alternatives.
2. How should sites monetize AI training data? Or should they?
3. Is "Answer Engine Optimization" (AEO) the future of SEO?
4. What data would YOU want to see about AI bot traffic?
Would love HN's feedback on the technical approach, privacy considerations, and what data would be most valuable to track.