
Yeah, my issue is that I suspect an LLM-based app could be easily "jailbroken" (since these models tend to be highly agreeable due to their training) and turned into an enabler rather than a helper.

Even if some LLM therapists are good, the zero friction of "doctor shopping" will result in a great many patients picking the bad ones that make them feel better, rather than the good ones that make them do better.


