What exactly are people expecting? All transformer models, of which ChatGPT is just one big fancy example, are pattern matchers trained on a large corpus of text, trying to predict the next token that completes the pattern. There's no reasoning, no understanding. It's just a big fancy parrot. Now ask your parrot to do some math. Polly want an AI cracker? We clearly haven't cracked the code on AGI yet, and transformer models probably won't get us there.
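For what it's worth, the "complete the pattern" loop being described is just autoregressive generation: predict a next token, append it, repeat. Here's a minimal sketch using a toy bigram table in place of a real transformer (the corpus and function names are made up for illustration; a real model learns far richer statistics over billions of tokens):

```python
# Toy stand-in for a transformer: a bigram table counting which token
# follows which in a tiny "corpus". The "model" here is just counts.
corpus = "polly wants a cracker polly wants a cracker".split()
counts = {}
for prev, nxt in zip(corpus, corpus[1:]):
    counts.setdefault(prev, []).append(nxt)

def next_token(prev):
    # Pick the most common continuation -- i.e., "complete the pattern".
    options = counts.get(prev, corpus)
    return max(options, key=options.count)

# Autoregressive generation: feed each output back in as the next input.
tokens = ["polly"]
for _ in range(3):
    tokens.append(next_token(tokens[-1]))
print(" ".join(tokens))  # -> polly wants a cracker
```

The point is that nothing in this loop "does math" or "understands" anything; it only ever asks "what usually comes next?" Whether stacking enormous amounts of this gets you reasoning is exactly the question being argued.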
What would it look like for it to not be "just an X", where X is the computational unit at the base level? If you look at a low enough level, any system will be made up of some basic units that manipulate signals in various ways. The brain is just neurons integrating signals and firing action potentials. But that doesn't make the system "just neurons firing action potentials".