> The two companies reportedly signed an agreement [in 2023] stating OpenAI has only achieved AGI when it develops AI systems that can generate at least $100 billion in profits.
They didn't have a better definition of AGI to draw from. The old Turing test proved not to be a particularly good one, so, lacking a definition, money was used as a proxy. That seems fair to me. Unless you've got a better definition of AGI that's solid enough to put in a high-dollar contract?
That's true, but the $100 billion requirement is the only hard qualification defined in earlier agreements. The rest of the condition was left to the "reasonable discretion" of the board of OpenAI. (https://archive.is/tMJoG)
It's kind of sad, but I've found myself becoming more and more this guy whenever someone "serious" brings up AI in conversation: https://www.instagram.com/p/DOELpzRDR-4/
I quit Google last year because I was just done with the incessant push for "AI" in everything (where "AI" exclusively means LLMs, of course). I still believe in the company as a whole; the work culture just took a hard right towards Kafkaville. Nowadays when my relatives say "AI will replace X" or whatever, I just nod along. People are incredibly naive and unbelievably ignorant, but that's about as new as eating wheat.
HN has a big problem with reading comprehension. First of all, the $100B figure is likely what Microsoft demanded on top of OpenAI's own definition of AGI, which is "highly autonomous systems that outperform humans at most economically valuable work" [0]. Secondly, that is no longer part of the revised agreement; it has been replaced with a review by a panel of experts.
This is the sickest implementation of Goodhart's Law I've ever seen.
>"When a measure becomes a target, it ceases to be a good measure"
What appalls me is that companies are doing this stuff in plain sight. In the 1920s before the crash, were companies this brazen or did they try to hide it better?
that's very different from OpenAI's previous definition (which was "highly autonomous systems that outperform humans at most economically valuable work") for at least one big reason:
This new definition likely only triggers if OpenAI's AI is substantially different from or better than other companies' AI, because in a world where two or more companies have similar AGI, all of them would have huge revenue, but the competition would mean their profit margins might not be as large. The only reason their profit would soar past $100B would be a lack of competition, right?
It doesn't seem to say $100B a year, so presumably a business selling spoons will also eventually achieve AGI. Also good to know that the US could achieve AGI at any time just by printing money until hyperinflation lets OpenAI hit its target.
Nice unlock to hyperinflate their way to $100B. I'd buy an AGI spoon but preferably before hyperinflation hits. I'd expect forks to outcompete the spoons though.
No. When you're thinking about questions like these, it's useful to remember that multiple (probably dozens of) professional A-grade lawyers have been paid considerable sums of actual money, by both sides, to think about possible loopholes and close them.
No. "Pro" subscriptions have nothing to do with AGI, my pet GPS tracker sells those.
We're talking about things that would make AGI recognizable as AGI, in the "I know it when I see it" sense.
So, things we think about when the word AGI comes up: an AI-driven commercial entity selling AI-designed services or products, an AI-driven portfolio manager trading AI-selected stocks, an AI-made movie doing well at the box office, an AI-made video game selling loads, AI-won tournament prizes at computationally difficult games the AI somehow autonomously chose to enter, etc.
Don't worry, it'll be relevant ads, just like Google. You're going to love it when code output favors proprietary libraries and databases, and getting things the way you want involves annoying levels of "clarification", making it harder and harder to use.
I kind of meant this as a joke as I typed it, but by the end I almost wanted to quit the tech industry altogether.
Just download a few SOTA (free) open-weights models well ahead of that moment and either run them from your living room now or store them on a (cheap) 2TB external hard drive until consumer compute makes running them at home affordable.
>This is an important detail because Microsoft loses access to OpenAI’s technology when the startup reaches AGI, a nebulous term that means different things to everyone.
https://techcrunch.com/2024/12/26/microsoft-and-openai-have-...