
We don't understand the brain. We fully understand what LLMs are doing; humans built them. The idea that we don't understand what LLMs are doing is magical thinking, and magical thinking is good for clicks and fundraising.




We know how we built the machines, but their complexity produces emergent behavior that we don't completely understand.

This isn’t settled science, by the way. There’s evidence that the “emergent” behaviors are largely a mirage created by the choice of evaluation metric rather than a sudden change in the models themselves.
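
A minimal sketch of that argument (my own illustration, not from the thread; the sequence length and the scaling curve are made-up toy numbers): if per-token accuracy improves smoothly with scale, an all-or-nothing metric like exact match over a whole answer can still look like a sharp jump.

    SEQ_LEN = 32  # hypothetical answer length in tokens; chosen for illustration

    def per_token_accuracy(params: float) -> float:
        """Toy, made-up power law: per-token accuracy rises smoothly with parameter count."""
        return 1.0 - 0.5 * (params / 1e8) ** -0.35

    print(f"{'params':>10}  {'per-token acc':>13}  {'exact match':>11}")
    for params in [1e8, 1e9, 1e10, 1e11, 1e12]:
        p = per_token_accuracy(params)
        # Exact match requires every token to be right: a discontinuous metric.
        exact = p ** SEQ_LEN
        print(f"{params:10.0e}  {p:13.3f}  {exact:11.3f}")

The per-token column climbs smoothly (roughly 0.50, 0.78, 0.90, 0.96, 0.98), while the exact-match column sits near zero and then appears to "emerge" past 1e10 parameters, even though nothing discontinuous happened to the model.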


