sandspar•7mo ago
What is the NYT's stance here? Is it pure spite? I guess their lawyers told them this is the winning move, and perhaps it is. But it just seems so blatantly wrong.
If you look at Reddit's r/ChatGPT, you'll quickly notice that the median use of ChatGPT is for therapy.
Is the NYT really ok with combing through people's therapy logs?
jaimex2•7mo ago
They don't care. This is purely for a business upper hand.
OpenAI should probably encrypt the chats and lock itself out going forward. Collect whatever metrics they need on the fly before locking.
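A minimal sketch of what "lock itself out" could look like, assuming a per-user secret the server never stores and collecting aggregate metrics before encrypting (the names and the key derivation here are illustrative, not OpenAI's actual architecture):

    import os, json, hashlib
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def derive_key(user_secret: bytes, conversation_id: str) -> bytes:
        # Per-conversation key from a secret only the user holds;
        # a real design would use a proper KDF, this is simplified.
        return hashlib.sha256(user_secret + conversation_id.encode()).digest()

    def store_locked(transcript: dict, user_secret: bytes, conversation_id: str) -> dict:
        # 1. Collect whatever aggregate metrics are needed while the plaintext is still in hand.
        metrics = {
            "message_count": len(transcript["messages"]),
            "total_chars": sum(len(m["text"]) for m in transcript["messages"]),
        }
        # 2. Encrypt with the user-derived key, then discard the plaintext.
        key = derive_key(user_secret, conversation_id)
        nonce = os.urandom(12)
        ciphertext = AESGCM(key).encrypt(nonce, json.dumps(transcript).encode(), None)
        # 3. Persist only ciphertext + metrics; without user_secret, the server
        #    (and anyone subpoenaing it) cannot recover the conversation.
        return {"nonce": nonce, "ciphertext": ciphertext, "metrics": metrics}

The catch, as the reply below points out, is the incentive problem: once chats are locked this way, OpenAI can't mine them later either.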
heavyset_go•7mo ago
OpenAI would never lock themselves out of free training data.
goatlover•7mo ago
Is there an expectation of privacy using ChatGPT? Do users think nobody is ever going to be looking at their logs?
conception•7mo ago
If you are a paying member and are not sharing prompts, yes?
noman-land•7mo ago
Stop having this expectation. It's factually incorrect.
edgineer•7mo ago
When my expectation of privacy is violated, I'll have learned a little more about such violations, but I won't drop my expectation not to be violated.
edgineer•7mo ago
What I mean is there are two meanings to "expectation of privacy": the Bayesian prior, and the legal stance. I have an expectation of privacy in my home but I still close the shades.
tashoecraft•7mo ago
I find it interesting that you're blaming the NYT for this and not OpenAI for keeping these logs in the first place. If OpenAI didn't keep logs, there would be nothing to search, and a more harmful actor couldn't do something far more nefarious with them. If the argument is that the logs may contain confidential information and therefore shouldn't be accessed, the same argument says the logs shouldn't be kept at all.
nrds•7mo ago
As explained in literally the first paragraph of TFA, the court ordered OpenAI to start keeping these logs. They didn't do it by choice.
ycombinatrix•7mo ago
Did you consider reading the article before writing out this comment? Please do next time.
vintermann•7mo ago
Now is the time to go have a chat with ChatGPT about how much NYT sucks. Maybe it can help come up with insulting things to call their lawyers too.
bn-l•7mo ago
> Instead, only a small sample of the data will likely be accessed, based on keywords that OpenAI and news plaintiffs agree on. That data will remain on OpenAI's servers, where it will be anonymized, and it will likely never be directly produced to plaintiffs.
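In other words, something like keyword filtering plus pseudonymization before anyone reviews the sample. A rough sketch of that kind of pipeline (the field names and keyword list are placeholders, not the actual agreed protocol):

    import hashlib

    AGREED_KEYWORDS = {"recipe", "wirecutter"}  # placeholder terms, not the real list

    def matches_keywords(conversation: dict) -> bool:
        # Pull only conversations that hit the agreed-upon keywords.
        text = " ".join(m["text"].lower() for m in conversation["messages"])
        return any(kw in text for kw in AGREED_KEYWORDS)

    def anonymize(conversation: dict) -> dict:
        # Replace the user identifier with a one-way hash so sampled records
        # can still be grouped without exposing who wrote them.
        pseudonym = hashlib.sha256(conversation["user_id"].encode()).hexdigest()[:16]
        return {"user": pseudonym, "messages": conversation["messages"]}

    def sample_for_review(conversations: list[dict]) -> list[dict]:
        # The sampled, anonymized subset is what stays on the server for review.
        return [anonymize(c) for c in conversations if matches_keywords(c)]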