This new way of defining AGI seems like it helps Microsoft more than OpenAI. It makes me wonder what kind of deal Microsoft struck to get such a great setup.
From a technical standpoint, defining AGI as something that generates $100 billion in profit feels like a definition only Microsoft and OpenAI care about. Let me explain.
Imagine another company creates an AI that can do everything an average human can. This AI could run, jump, or play basketball at the level of an average person. It could perform experiments, write scientific papers, draft novels, win legal cases like an average lawyer, and give sound financial advice like an average accountant.
We’re talking about average human ability here. Once an AI surpasses that level across all tasks, it’s heading toward superintelligence, where it outperforms the best humans by a wide margin in every field.
But back to AGI. If a nonprofit made an AGI and shared it openly, it might not generate huge profits but could still help people worldwide. Could anyone say it’s not AGI just because it’s not making billions? Of course not.
So while Microsoft and OpenAI might take years to reach $100 billion, the rest of the world could already have AGI under a more sensible definition.
@Ellis
That makes sense. It also lines up with OpenAI’s incentives: as a for-profit company, they’re obligated to maximize returns for their shareholders.
There’s also research showing that power can change how people think, making them more overconfident, less empathetic, and more prone to biased decisions. If OpenAI’s leaders gained that kind of influence quickly, they may be falling into the same patterns.