That's literally what the language model is. It might correctly generate a solution to a "novel" question/problem that is sufficiently close to one with an existing, known answer. But then again it might not. And in software development, it's going to take someone who is knowledgeable to tell the difference.
I think software engineering is going to look very different in a few years, and likely be a smaller field with lower paying jobs. But it's not going away in the near (5-10 years) future.
The dude you replied to gave out the sort of bad take I come here for.
Anyone who genuinely thinks ChatGPT can meaningfully replace a software developer should be starting a tech business right now. You can replace basically your largest cost (staff wages, especially for expensive, pesky software developers) with a $20 account that will work overtime, will never get tired, and whose performance will never fluctuate.
No excuses for not getting insanely rich. Of course it's not happening, because it's bullshit.
I am genuinely impressed and even excited by ChatGPT. It's an amazing tool that would have been massively helpful back when I needed good NLP. It will certainly be massively useful in the future. Hell, it's already a great assistant right now.
But it's not General AI, and I do facepalm at people LARPing as if it were.
General AI will one day exist, and knowledge jobs will most likely be a thing of the past at that point. How far in the future that is is anyone's guess (my prediction is somewhere between 12 and 235 years from now). We're just not there yet.
There's a difference between being able to do logical reasoning, and being able to do everything a competent white collar worker can do. For one, there's a token limit and memory limit, which limit the scale of the problems current iterations of GPT can handle (and this limitation is not a limitation of logical reasoning ability). There's also (for GPT) no way to fine-tune or train the model to work better in a specific domain unless you're in bed with OpenAI/Microsoft.
I think as a society we don't really have precise words for describing the different levels of intelligence, except in an "I know it when I see it" way. I don't think I'm LARPing in any way; I'm probably even less excited about it than you are, given that you seem to be using it more often than I am. I'm just saying I think GPT does exhibit some logical reasoning abilities and not merely remembering statistical patterns.
> I'm just saying I think GPT does exhibit some logical reasoning abilities and not merely remembering statistical patterns.
I agree with most of your reply, except this bit.
I mean, generating a response through statistical language patterns is a sort of reasoning, and ChatGPT has been accurate enough to replace internet search for me quite often. But it also generates bullshit that an untrained eye would miss (because the bullshit it generated was statistically plausible).
When it gets things wrong it generates some comically wrong behavior. I had one case where it looped through variations of the same wrong response - precisely because it is unable to do any kind of logical reasoning on top of faulty or inaccurate data.
"One of the biggest differences that we saw from GPT-3.5 to GPT-4 was this emergent ability to reason better," Mira Murati, OpenAI's Chief Technology Officer, told ABC News."
I know neither how LLMs work nor how our brains work. And I don't know what parallels there might be between the two.
From my very, very limited knowledge of how properties can emerge from particular arrangements of constituent components (the S-R latch giving rise to state - i.e. memory - comes to mind; toy sketch below), I would not at this point write off the possibility that a very large / very deep / very intricate neural network trained on language prevalence in very, very large datasets could manifest properties that we would interpret as reasoning.
And I further wouldn't write off the possibility that we humans owe no small part of our reasoning ability to the language comprehension we begin to acquire in infancy.
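To make the latch point concrete, here's a toy sketch of the idea (my own illustration, names made up, not anything from the thread): two stateless NOR functions, cross-coupled, settle into an output that depends on what happened before.

  # Toy S-R latch: two stateless NOR gates, cross-coupled.
  # Illustrative sketch only; gate and signal names are made up.
  def nor(a, b):
      return int(not (a or b))

  def sr_latch(s, r, q, qn):
      # Relax the cross-coupled gates until the outputs settle.
      for _ in range(4):
          q_new, qn_new = nor(r, qn), nor(s, q)
          if (q_new, qn_new) == (q, qn):
              break
          q, qn = q_new, qn_new
      return q, qn

  q, qn = 0, 1
  q, qn = sr_latch(1, 0, q, qn)  # set   -> q becomes 1
  q, qn = sr_latch(0, 0, q, qn)  # hold  -> q stays 1, even though no single gate stores anything
  q, qn = sr_latch(0, 1, q, qn)  # reset -> q becomes 0

Neither gate has state on its own; the "memory" lives entirely in the wiring. That's the sense in which I wouldn't rule out qualitatively new behavior from stacking enough simple parts.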
Just because an OpenAI exec said it doesn't make it true. "Emergent reasoning" is a great marketing hype-term that contains no technical specification, like 'retina display'.
Any “emergent reasoning” produced by these LLMs is almost certainly coincidence (i.e. the long tail of the probability curve, e.g., like monkeys randomly banging out Shakespeare’s Othello).
A type of reasoning. It's still bad at mathematical reasoning and advanced programming, or at least at translating very complicated written instructions into working code without any human intervention. We also don't know how good it is at reasoning about the physical world, although I think Microsoft was doing some research on that. Then there's theory of mind and the reasoning that goes along with it. Then there's reasoning about the future, how one's actions will affect outcomes, and then reasoning about that subsequent future.
ChatGPT is impressive, but gets many things wrong. If you know what you are doing, it's an amazing programming assistant. It makes me noticeably more productive. It may lead someone who doesn't know what they are doing down weird rabbit holes that go nowhere, however.
One silly example: I was using a library I hadn't used before, and I asked how I could get certain attributes. It gave me an answer that wouldn't compile at all; the imports didn't exist.
Then when I mentioned that it didn't work, it gave me a slightly different answer, which also didn't work, and explained that the previous answer was valid for 3.x; in 1.x or 2.x the new answer was the correct one.
But here's the catch: there's no version 3.x. There's not even a 2.x. Its language model just statistically got to that conclusion.
Doesn't make it any less impressive to me. It gets things right often enough, or at least points me in a good direction. I effectively learned new things using it. But it can't replace a developer.
Using ChatGPT as if it were General AI is like eating a meal with a hammer and a screwdriver as utensils. You can probably do it, but nobody will have a good time.