
I don't see why anyone would consider the state of AI today to be AGI. It's basically a glorified generator stuck onto a query engine.

Today's models are not able to think independently, nor are they conscious or able to modify themselves to pick up new information on the fly. They can't form memories beyond the half-baked workaround of stuffing things into the context window, which just makes the model generate text related to that content, basically imitating a story.
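To illustrate what I mean by that kind of "memory", here's a toy sketch (llm_complete and ask are made-up names, not any real API): the model itself never changes, you just keep pasting past exchanges back into the next prompt.

    # Toy sketch: "memory" is just text prepended to the next prompt.
    # llm_complete stands in for whatever model API you'd actually call.
    def llm_complete(prompt: str) -> str:
        return "<model output for: ..." + prompt[-40:] + ">"

    memory: list[str] = []

    def ask(question: str) -> str:
        prompt = "\n".join(memory + [question])
        answer = llm_complete(prompt)
        # "remembering" = stuffing the exchange back into the next prompt;
        # the weights never change
        memory.append(f"Q: {question}\nA: {answer}")
        return answer

That's all the "memory" is: a longer prompt.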

They're powerful when paired with a human operator, i.e. they "do" as told, but that is not "AGI" in my book.



> nor are they...able to mutate themselves to gain new information on the fly

See "Self-Adapting Language Models" from a group out of MIT recently which really gets at exactly that.

https://jyopari.github.io/posts/seal
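Rough sketch of the loop as I read it (everything below is a hypothetical stand-in, not the paper's code): the model writes its own finetuning data ("self-edits"), its weights get updated on that data, and the downstream score of the updated model is used as a reward for generating better self-edits.

    from dataclasses import dataclass, field

    @dataclass
    class ToyModel:
        # stand-in for real weights; the paper uses an actual LLM with
        # lightweight finetuning, this only shows the control flow
        notes: list[str] = field(default_factory=list)

        def generate_self_edit(self, context: str) -> str:
            # the model restates new context as training data for itself
            return f"fact: {context}"

    def finetune(model: ToyModel, self_edit: str) -> ToyModel:
        # stand-in for a gradient update on the self-generated data
        return ToyModel(notes=model.notes + [self_edit])

    def evaluate(model: ToyModel, question: str) -> float:
        # reward: did the update help on a downstream question?
        return 1.0 if any(question in note for note in model.notes) else 0.0

    def self_adapt_step(model: ToyModel, context: str, question: str) -> ToyModel:
        self_edit = model.generate_self_edit(context)
        updated = finetune(model, self_edit)
        reward = evaluate(updated, question)
        # SEAL uses this reward to reinforce the policy that generates
        # self-edits; here we just keep the update when it helped
        return updated if reward > 0 else model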


Check out the article. He’s not crazy. It comes down to clear definitions. We can talk about AGI for ages, but without a clear meaning, it’s just opinion.


For a long time the Turing test was the bar for AGI.

Then models blew past it, and what I think is really happening now is that we don't have the grip on "what is intelligence" that we thought we had. Our sample size for intelligence is essentially one, so it might take a while to get that grip back.


The commercial models are not designed to win the imitation game (which is what Alan Turing actually called it). In fact they are very likely to lose every time.


The current models don't really pass the Turing test. They pass some weird variations on it.



