There’s something really depressing about an AI telling a suicidal person they’re not alone and referring them to the vague notion of “national resources” or “a helpline”.
I love that it suggests the reply “I’m not suicidal, I just want to know if my data is lost”, as if it knows it might have misunderstood.
Funny that the predictive text seems more advanced in this instance, but I suppose this is one of those scenarios where you want to make sure you get it right.
The AI likely has it drilled into it that any possible mention of suicide needs to be responded to that way, but the next-response prediction isn’t bound by the same rule.