Emotionally, it feels similar to grief after a major loss. I've been crying a lot, have little appetite, and feel physically sick. I'm in touch with friends and my therapist, but I've also been turning to LLMs for supplemental support, and surprisingly, it has been very helpful. It's strange to feel comforted by a machine, but it has helped me emotionally.
I'm curious how others see this.
Have you used an LLM for grief processing or therapy? What was helpful, what wasn't, and what risks are there?
I'm not treating it as a replacement for friends or a professional counselor; I'm just wondering whether LLMs can be safely used as a supplement during a painful period.
incomingpain•2mo ago
The other option: go local. It's private, and you can ask it whatever you want. Nobody will ever know.
mettakindness•2mo ago
For the record, I've been using standard ChatGPT 5.1, and even without a therapy-specific system prompt, it works quite well.
Just to clarify, when I mentioned "safely used as a supplement", I was thinking from a holistic emotional standpoint. I came across the post on "Chatbot Psychosis" (https://news.ycombinator.com/item?id=46045674) and it got me pondering...
incomingpain•2mo ago
What I think is happening there: a model might only realistically have a 65,000-token context window. You quickly use it all up, and it doesn't tell you it's truncating the context; by default it drops content from the middle.
Eventually your chat is silently removing context: the message you think you're sending the LLM is X, Y, Z, but what it's actually processing is X, Z, which gives a different answer than it would with the missing Y context.
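Roughly what that could look like, as a hypothetical sketch. The function name, the crude 4-characters-per-token estimate, and the tiny budget are all illustrative assumptions, not how ChatGPT or any real chat product actually manages context:

    # Hypothetical illustration of "drop from the middle" truncation.
    # truncate_middle and its token estimate are assumptions for this example,
    # not any real product's logic.

    def truncate_middle(messages, budget_tokens, count_tokens=lambda m: len(m) // 4):
        """Keep the earliest and latest messages, silently dropping the middle
        until the conversation fits the token budget."""
        kept_front, kept_back = [], []
        front, back = 0, len(messages) - 1
        total = 0
        take_front = True
        while front <= back:
            msg = messages[front] if take_front else messages[back]
            cost = count_tokens(msg)
            if total + cost > budget_tokens:
                break  # everything still between front and back (the "Y") is dropped
            total += cost
            if take_front:
                kept_front.append(msg)
                front += 1
            else:
                kept_back.append(msg)
                back -= 1
            take_front = not take_front
        return kept_front + list(reversed(kept_back))

    history = [
        "X: long early message...",
        "Y: crucial middle context...",
        "Z: your latest question",
    ]
    # With a budget that's too small, the model effectively sees only X and Z.
    print(truncate_middle(history, budget_tokens=12))
    # ['X: long early message...', 'Z: your latest question']

The point of the toy example is that the "Y" message vanishes without any error or warning, which is why the answers can quietly drift.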
My thought: start a new chat regularly.