
There are limits to such algorithms, as proven by Kurt Gödel.

https://en.wikipedia.org/wiki/G%C3%B6del%27s_incompleteness_...



True, and in the case of Solomonoff Induction, incompleteness manifests in the calculation of the Kolmogorov complexity used to order programs. But what incompleteness actually shows is that there is no single algorithm for truth; a collection of algorithms can compensate for each other's weaknesses. For example, while no single algorithm can solve the halting problem, different algorithms can cover cases for which the others fail to produce a definitive halting result.
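To make the "collection of algorithms" idea concrete, here is a toy sketch (my illustration, not from the thread): each partial decider answers only for programs it recognizes, and a portfolio reports the first definitive verdict. The checker names and the string-matching heuristics are invented for demonstration; real termination analyzers are far more sophisticated.

```python
# Toy illustration: no single total algorithm decides halting, but
# partial deciders can cover each other's blind spots. Each checker
# returns True ("halts"), False ("loops forever"), or None ("unknown").

def no_loops_checker(prog):
    # Trivially halts if the program text contains no loop constructs.
    if "while" not in prog and "for" not in prog:
        return True
    return None

def while_true_checker(prog):
    # A bare `while True:` with no break never halts.
    if "while True:" in prog and "break" not in prog:
        return False
    return None

def portfolio(prog, checkers=(no_loops_checker, while_true_checker)):
    # Run every partial decider; report the first definitive answer.
    for check in checkers:
        verdict = check(prog)
        if verdict is not None:
            return verdict
    return None  # all checkers inconclusive

print(portfolio("x = 1 + 1"))                 # True: no loops at all
print(portfolio("while True:\n    pass"))     # False: bare infinite loop
print(portfolio("while x > 0:\n    x -= 1"))  # None: beyond these checkers
```

Each checker alone is useless on most inputs, but the portfolio decides strictly more cases than any member; incompleteness only guarantees that some programs will always fall through to `None`.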

I'm not convinced you can't produce a pretty robust system that produces a pretty darn good approximation of truth, in the limit. Incompleteness also rears its head in type inference for programming languages, but the cases for which it fails are typically not programs of any interest, or not programs that would be understandable to humans. I think the relevance of incompleteness elsewhere is sometimes overblown in exactly this way.


If there exists some such set of algorithms that could get a "pretty darn good approximation of truth" I would be extremely happy.

Given the pushes for political truths in all of the LLMs I am uncertain if they would be implemented even if they existed.


You're really missing the point with LLMs and truth if you're appealing to Gödel's incompleteness theorem.


Why?


The limitations of “truth knowing” using an autoregressive transformer are much more pressing than anything implied by Gödel’s theorem. This is like appealing to a result from quantum physics to explain why a car with no wheels isn’t going to drive anywhere.

I hate when this theorem comes up in these sort of “gotcha” when discussing LLMs: “but there exist true statements without a proof! So LLMs can never be perfect! QED”. You can apply identical logic to humans. This adds nothing to the discussion.


Ah understood, yes that is a bit ridiculous.



