kennywinker•33m ago
Is there a way to opt my websites out of AI data collection?
embedding-shape•30m ago
Add HTTP Basic Auth in front of your website, then share the credentials with people who are allowed to view your website. Make sure you don't hand out the credentials to employees of OpenAI, Anthropic, xAI, or Microsoft.
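A minimal sketch of what "Basic Auth in front of the site" looks like with nginx as a reverse proxy. The server name, upstream address, and file paths are illustrative; the password file would be created separately with a tool like `htpasswd`:

```nginx
server {
    listen 443 ssl;
    server_name example.com;  # illustrative

    # Require credentials for every request before it
    # reaches the backend.
    auth_basic           "Private";
    auth_basic_user_file /etc/nginx/.htpasswd;

    location / {
        proxy_pass http://127.0.0.1:8080;  # assumed backend
    }
}
```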
wolttam•29m ago
Any measure you put in place can and will be ignored by the actors who never planned to respect your wishes in the first place.
That's just how the web works, though.
cortesoft•12m ago
This is true for measures that rely on the actor respecting your wishes, but it doesn't apply to measures that force compliance.
ghostlyy•28m ago
Partial answer: the major labs (Anthropic, OpenAI) do respect robots.txt for their named crawlers, so blocking ClaudeBot/GPTBot in robots.txt works for those specific bots. What you can't easily opt out of is indirect ingestion via Common Crawl, scraped datasets, and unnamed crawlers. agents.txt doesn't change that picture.
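A quick sketch of the robots.txt approach, using Python's standard-library parser to check how such a file is interpreted (the rules and URLs are illustrative):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking the two named AI crawlers
# mentioned above while leaving the site open to everyone else.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

print(parser.can_fetch("GPTBot", "https://example.com/page"))    # False
print(parser.can_fetch("ClaudeBot", "https://example.com/page")) # False
print(parser.can_fetch("SomeBrowser", "https://example.com/"))   # True
```

Note this only describes how a compliant crawler reads the file; as the parent comments point out, nothing enforces it.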
The Allow-Training vs Allow-RAG split in the default is the useful part of the file. They're different operations with different costs to the site owner. Training is a one-time bulk ingest. RAG is a runtime fetch per query. A site owner might reasonably allow one and not the other.
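The split might look something like this in an agents.txt file. agents.txt is a proposal, so the exact syntax is whatever the spec defines; the directive names here are just illustrating the training-vs-RAG distinction described above:

```
# Illustrative agents.txt sketch (not an authoritative syntax).
# Permit runtime retrieval for answering queries, but not
# one-time bulk ingestion for model training.
User-Agent: *
Allow-Training: no
Allow-RAG: yes
```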
sschueller•5m ago
Well, Claude still thinks it shouldn't read AGENTS.md [1], so they probably don't care about agents.txt on a web server either...
[1] https://github.com/anthropics/claude-code/issues/6235