
To perhaps stir the "what do words really mean" argument, "lying" would generally imply some sort of conscious intent to bend or break the truth. A language model is not consciously making decisions about what to say, it is statistically choosing words which probabilistically sound "good" together.


>A language model is not consciously making decisions about what to say

Well, that is being doubted -- and by some of the biggest names in the field.

Namely, what's doubted isn't that it's "statistically choosing words which probabilistically sound good together", but the assumption that doing so doesn't already make a consciousness (even a basic one) emerge.

>it is statistically choosing words which probabilistically sound "good" together.

The idea that when we speak (or lie) we aren't doing anything much more nuanced, just a higher-level equivalent of the same thing plus the emergent illusion of consciousness, is also thrown around.


"Well, that is being doubted -- and by some of the biggest names in the field."

An appeal to authority is still a fallacy. We don't even have a way of proving whether a person is experiencing consciousness; why would anyone expect we could agree on whether a machine is?


>An appeal to authority is still a fallacy

Which is neither here nor there. I wasn't making a formal argument; I was stating a fact. Take it or leave it.



