Like you, I would have preferred that the UI for the choice didn’t make opt-in the default. But this is one of the rare times a US company isn’t simply assuming or circumventing consent from existing users in countries without EU-style privacy laws who miss the advance notification. So thank you, Anthropic, for that form of respect.
“Previous chats with no additional activity will not be used for model training.”
So I take it they weren’t being used before. You can switch the setting off and keep it that way.
When you see this kind of thing, it makes you wonder what else they'll try in order to get around your opt-out.