
Indeed, we are able to pose counterfactuals in order to identify it as an illusion, even for novel cases. LLMs are a superb imitation of our combined knowledge, additionally curated by experts. They are a very useful tool, but they aren't thinking or reasoning in the sense that humans do.

