The internet did what it always does when it hears the words “AGI is now.” As expected, it collectively lost its mind.
From the moment Jensen Huang leaned into the mic on Lex Fridman’s latest podcast and said, “I think we’ve achieved AGI,” the clip was destined for virality. It had a loaded acronym, a billionaire CEO, and just enough uncertainty to make everyone do a double take.
Except, of course, this wasn’t the AGI most people imagine.
If you think about it, Artificial General Intelligence has lived less in labs and more in our collective imagination. Think HAL 9000 from 2001: A Space Odyssey or Samantha from Her. Basically, machines that don’t just complete tasks but understand intent, reason through problems, and maybe even feel.
In academic circles, AGI is typically defined as AI that matches or surpasses human intelligence across virtually all cognitive tasks. This level of intelligence has the ability to generalise knowledge and solve unfamiliar problems without retraining.
In fact, researchers like Geoffrey Hinton, Yann LeCun, and Yoshua Bengio have spent years debating what that actually means and, crucially, how far away we are from it. The consensus is that we aren’t there yet. Not even close.
Estimates range from a decade to several decades, with LeCun arguing AGI won’t arrive as a sudden breakthrough but as a gradual evolution of systems becoming incrementally more capable. Which is why Huang’s statement felt… off. Not wrong, just strategically framed.
On Lex Fridman’s podcast, NVIDIA CEO Jensen Huang wasn’t talking about sentient machines or human-level cognition. He was responding to a hyper-specific hypothetical posed by Fridman: could an AI system start, grow, and run a billion-dollar company?
To that, Huang said we’re already there. It’s a clever reframing, undoubtedly, because if you define AGI not as “human-like intelligence” but as “economic outcome generation,” then yes, today’s agentic AI systems look surprisingly capable. Spin up an autonomous agent, connect it to APIs, let it iterate on code, handle marketing, and run basic operations. By this definition, yes, you can plausibly build something that resembles a startup.
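Mechanically, there’s nothing exotic about the kind of agent being described here: it’s a loop that asks a model for the next action, calls a tool, and feeds the result back in. Here’s a toy sketch of that plan–act–observe loop. The `llm()` stub and the tool names are invented stand-ins so the example runs offline; a real agent would call a hosted model API and real services instead.

```python
def llm(prompt: str) -> str:
    """Canned 'planner' standing in for a real model API call.
    It reads the running history embedded in the prompt and
    either picks the next tool or declares the goal done."""
    if "Ad copy drafted" in prompt:
        return "DONE"
    return "write_marketing_copy"

# Hypothetical tools the agent can invoke; in practice these would be
# API wrappers (email, deploys, payments), not lambdas returning strings.
TOOLS = {
    "write_marketing_copy": lambda: "Ad copy drafted",
    "ship_code": lambda: "Feature deployed",
}

def run_agent(goal: str, max_steps: int = 5) -> list[str]:
    """Plan -> act -> observe until the planner says DONE or steps run out."""
    history: list[str] = []
    for _ in range(max_steps):
        action = llm(f"Goal: {goal}. History: {history}")
        if action == "DONE":
            break
        history.append(TOOLS[action]())  # act, then record the observation
    return history

print(run_agent("launch a startup"))  # → ['Ad copy drafted']
```

That loop really can chain API calls into something economically useful, which is the sliver of truth in Huang’s framing. What it doesn’t have is any of the generalisation that AGI definitions demand.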
But in the very next sentence, even Huang hedged his bets. He admitted the odds of thousands of AI agents building something like NVIDIA are effectively zero. Which tells you everything you need to know about AGI.
Make no mistake: Jensen Huang’s soundbite on AGI is less a scientific declaration and more a masterclass in narrative timing. So when he says “we’ve achieved AGI,” what he’s really doing is picking a definition that flatters the present. Yes, agentic AI is having a moment, but calling that AGI is like calling a Formula 1 car a spaceship.