
I just kind of wish "hallucinations" didn't come wrapped in such confident language... actual people are generally forthcoming at the edge of their knowledge, or at least don't project that much confidence. I know LLMs are a bit different, but that's the best comparison I can come up with.

