
One must always assume anything an LLM outputs is false until proven otherwise, not the opposite.


Ok, but that criterion is true for human-generated output as well.


If you truly never believe anything anyone has ever told you until you prove it yourself, I'm amazed you've somehow learned to type in English on a website with accounts.

We trust people all the time for many reasons: authority, past experience, logic (why would they lie about this, in context?), and so on, and we get by alright. Knowing it's another conscious, mortal human existing in society makes it much easier to judge when someone is likely being truthful.

Obviously there are all sorts of caveats to that. The main difference is that LLMs don't lie, but they don't tell the truth either; they just generate output with no meaning behind it. There's no way to ever put any trust in that, the same way I wouldn't trust my ice maker to make sure my dog gets enough water while I'm on vacation.


People assert their opinions as facts, they parrot bullshit they heard or read online without making any attempt to verify it, they willingly spread lies in support of their political or spiritual beliefs, they lie or exaggerate to make themselves look good or to make people they don't like look bad, or they just straight up lie for no reason. To say anything said by an LLM is a lie until proven otherwise, and then act like it's ridiculous to apply the same standard to humans, is nuts.


LLMs have no concept of facts or lies, or right or wrong.

And contrary to your rather morbid view of society, most WP editors generally try hard to get it right, and they support rather than undermine each other (don't let the more contentious topics distract you from the much larger pool of contributions).

More to the point, with humans you can demand that they provide a source, and at scale that iterative process should get to the right answer for a good percentage of content. That won't work for LLMs, because none of that has any meaning to them.


False. Please prove otherwise.



