Hacker News
dpweb | 9 days ago | on: On the Existence, Impact, and Origin of Hallucinat...
We don't understand the brain. We fully understand what LLMs are doing; humans built them. The idea that we don't understand what LLMs are doing is magical thinking, and magical thinking is good for clicks and fundraising.
allears | 9 days ago
We know how we built the machines, but their complexity produces emergent behavior that we don't completely understand.
emp17344 | 9 days ago
This isn’t settled science, by the way. There’s evidence that the “emergent” behaviors are just a mirage.