Ran into this, it’s just unbelievably sad.
“I never properly grieved until this point” - yeah buddy, it seems like you never started. Everybody grieves in their own way, but this doesn’t seem healthy.


That would require an AI to be able to correctly judge maladaptive thinking from healthy thinking.
No, LLMs can’t judge anything; that’s half the reason this mess exists. The key here is to give the LLM enough information about how you talk to yourself in your mind for it to generate responses that sound the way you do in your own head.
That’s also why you have a therapist vet the responses. I can’t stress that enough. It’s not something you’d just hand to anyone to run with.
Seems like you could just ditch the LLM and keep the therapist at that point.