
What are the examples?

I've had several LLM search result summaries contain flat out mistakes and incorrect statements.



I’ll try to dig some up soon (I’m on my phone now). But of course the output contains errors sometimes. So do search engine results. The important thing for difficult questions is whether the right answer (or something pointing toward it) is available _at all_. Of course this assumes you can verify the answers somehow (usually easy with programming questions), but again, search engines have the same limitation.


> But of course the output contains errors sometimes. So do search engine results.

That's not true.

Search engine results are links and (non-AI-generated) summaries of existing resources on the web. No search engine returns links to resources it generated in response to the search query. Those resources can contain inaccurate information, yes, but the search engine itself does not return errors.

LLM output does not contain errors "sometimes". The output of an LLM is never truthful nor false in itself, in the same way that the next word your keyboard suggests for you to type on a mobile device is never truthful nor false. It's simply the next suggestion based on the context.

These two methods of accessing information very clearly do not have the same limitations. A search engine provides links to specific resources. An LLM generates some approximation of some average of some information.

It's up to intelligent, thinking people to decide whether an LLM or a search engine is currently the best way for them to parse through information in search of truth.


Obviously I meant that the content of the results can be inaccurate, and I assume you weren't actually confused about that.


Ok, the first example I found was when I was trying to find a way to write a Rust proc macro that recursively processes functions or modules and rewrites arithmetic expressions. The best way to do this, it turns out, is with `VisitMut` or `fold`. I cannot find any results discussing these approaches on Google, but ChatGPT (4) suggested it within the first couple refinements of a query.

Another recent example from my history: "can you implement Future twice for a rust struct, with different Output types"
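On that second question, the short answer is no: `Output` is an associated type on `Future`, so a struct gets at most one `Future` impl. A common workaround is newtype wrappers, sketched here with only the standard library and made-up type names:

```rust
use std::future::Future;
use std::pin::Pin;
use std::task::{Context, Poll};

// Hypothetical inner data; the wrappers below give it two different
// `Output` types, since `Data` itself could implement `Future` only once.
struct Data;

struct AsNumber(Data);
struct AsText(Data);

impl Future for AsNumber {
    type Output = u32;
    fn poll(self: Pin<&mut Self>, _cx: &mut Context<'_>) -> Poll<u32> {
        Poll::Ready(42) // immediately ready, purely for illustration
    }
}

impl Future for AsText {
    type Output = String;
    fn poll(self: Pin<&mut Self>, _cx: &mut Context<'_>) -> Poll<String> {
        Poll::Ready("forty-two".to_string())
    }
}
```

The caller picks the output type by choosing which wrapper to construct, e.g. `AsNumber(data).await` vs `AsText(data).await`.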


I found the following as the first Google result for “rust proc macro AST rewrite” (and I don’t know much about Rust): https://users.rust-lang.org/t/using-macros-to-modify-ast-to-...

And I found the following for “different future same rust struct” (first search attempt): https://stackoverflow.com/questions/65587187/how-to-instanti...

I’m not saying that LLMs can’t be useful for stuff like that, but they haven’t been that much of an improvement over Google search so far. And I always google what an LLM suggests in any case, to verify and to get a better feeling about the real-world state of the topic in question.


Okay? I didn't find any of those when I was looking originally. I certainly wouldn't claim that you can't find this information with Google, just that I wasn't able to.


There isn't an expectation or claim that search engines answer anything. They just find things or don't find things.


I've had several summaries that are just 80% duplications of pages found in the 4th to 5th position in the search results.

It seriously looks like Google is deranking actually useful and informative sites and then passing their content through an "LLM" to slightly reorganize it and pass it off as its own.

It's a copyright laundering machine put together by advertising companies so you never leave their properties. I genuinely think it's a criminal conspiracy at this point.



