
Well, he's making the point that real internal experience can't be confirmed externally, however convincing the performance. But external behavior is the only way we know about the internal experience of anything, including things we typically assign "real" consciousness to (humans, dogs) and things we don't (amoebas, zygotes, LLMs).

To be clear, I'm not for a moment suggesting current AIs are remotely comparable to animals.


