NVIDIA CEO Jensen Huang expects different AI models to discuss, debate, and reason with one another the way humans do today. Speaking at the VAST Data Cosmos Launch this week, Huang said frontier models will teach smaller AI models, opening up the enterprise wave of AI.
“Today’s data is, of course, voluminous, but it’s also structured, unstructured, and it’s growing incredibly, and the more we use AI, the more data we collect, the more data we can use to train better AIs. That flywheel is incredible,” he said. “One of the most important trends is moving towards multimodality. We have a lot of our knowledge embedded in language, but when you augment it with images, video, and audio, the language becomes much more robust.”
“Instead of just one-shot models, we’re now building models that can do multi-step reasoning and, just like humans, reason through things,” he said. “The concept of two of us having our own intelligence, talking to each other, debating, fleshing out an idea – that’s no different in the future from two large language models discussing, debating, fleshing out ideas, and so they’re generating data for each other to learn from.”
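In code, that model-to-model exchange might look something like the minimal sketch below. It assumes two hypothetical chat-completion callables rather than any specific vendor's API; the point is that the running transcript itself becomes synthetic training data.

```python
# Hypothetical sketch of a two-model debate loop whose transcript
# doubles as training data. `complete_a` and `complete_b` stand in
# for any chat-capable LLM endpoint; they are assumptions, not a
# real API.

from typing import Callable, List, Tuple

def debate(
    complete_a: Callable[[str], str],  # model A: prompt -> reply
    complete_b: Callable[[str], str],  # model B: prompt -> reply
    topic: str,
    rounds: int = 3,
) -> List[Tuple[str, str]]:
    """Alternate turns between two models and record the transcript.

    Each (context, reply) pair can later be fed back into
    fine-tuning, so the two models generate data for each other.
    """
    transcript: List[Tuple[str, str]] = []
    context = f"Debate topic: {topic}"
    for _ in range(rounds):
        reply_a = complete_a(context)
        transcript.append((context, reply_a))
        context = f"{context}\nA: {reply_a}"

        reply_b = complete_b(context)
        transcript.append((context, reply_b))
        context = f"{context}\nB: {reply_b}"
    return transcript
```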
Huang was speaking with VAST Data CEO Renen Hallak about the trend of AI multimodality, in which images, video, and audio increasingly augment AI models.
“Once you can create these frontier models, it’s kind of like a teacher model,” Huang said. “So, these teacher models could teach smaller models. Open-sourcing these small language models, or distillation of models, has really opened up and activated the next part, the next wave of AI, which is enterprise.”
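The teacher-student pattern Huang refers to is knowledge distillation. A minimal PyTorch sketch of one such training step is shown below, assuming placeholder teacher and student networks; the temperature-scaled soft-target loss follows the standard formulation from Hinton et al. (2015).

```python
# Minimal knowledge-distillation loss: the small "student" model is
# trained to match the softened output distribution of a frozen,
# larger "teacher" model. Teacher/student architectures and data are
# placeholders.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    """KL divergence between teacher and student distributions,
    softened by a temperature."""
    t = temperature
    soft_teacher = F.softmax(teacher_logits / t, dim=-1)
    log_soft_student = F.log_softmax(student_logits / t, dim=-1)
    # Scale by t^2 so gradient magnitudes stay consistent across
    # temperatures (Hinton et al., 2015).
    return F.kl_div(log_soft_student, soft_teacher,
                    reduction="batchmean") * (t * t)

# Usage sketch: the teacher is frozen; only the student is updated.
#   with torch.no_grad():
#       teacher_logits = teacher(batch)
#   loss = distillation_loss(student(batch), teacher_logits)
#   loss.backward()
```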
“The enterprise computing platform is complicated because there’s security, sovereignty, data gravity, and access control. Just because it’s inside a company doesn’t mean all of the data is accessible to everybody within the company. Data is proprietary, and it’s very precious to them – their gold mine. And they would like to take that asset, that incredible domain-specific, company-specific data, and transform it into digital intelligence.”
“It takes AI to figure out what data you should train with… enterprises, researchers, and engineers are currently moving from human-written code to an agentic AI-driven workflow,” Huang added.
“I could totally imagine the next step being you telling an AI: this is my basic mission, and this is what a good result would look like. This is all of the data that you can access based on your access control on the VAST AI data platform. Based on those givens, that AI goes to one of our repositories, to our company database, and it says, ‘okay, based on the work, the mission that I have, I think I need these three team members, those two team members.'”
“And so, this AI figures out its own assembly of team members, its own compute graph, and it orchestrates among the members, and the way the AIs talk to each other is going to be kind of like the way humans talk to each other.”
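A minimal sketch of that orchestration pattern appears below. All of the names here (Agent, Orchestrator, the skill tags) are illustrative, and the "compute graph" is reduced to a simple pipeline; the key idea is that team assembly is filtered by the caller's access control, as Huang describes.

```python
# Hypothetical orchestrator: given a mission and the caller's access
# rights, pick specialist agents and run them as a simple pipeline.

from dataclasses import dataclass
from typing import Callable, List, Set

@dataclass
class Agent:
    name: str
    skills: Set[str]           # what this agent can do
    required_access: Set[str]  # datasets it needs to touch
    run: Callable[[str], str]  # task in, result out

@dataclass
class Orchestrator:
    roster: List[Agent]

    def assemble_team(self, needed_skills: Set[str],
                      caller_access: Set[str]) -> List[Agent]:
        """Pick agents covering the needed skills, but only those
        whose data requirements fall within the caller's access."""
        return [
            agent for agent in self.roster
            if agent.skills & needed_skills
            and agent.required_access <= caller_access
        ]

    def execute(self, mission: str, needed_skills: Set[str],
                caller_access: Set[str]) -> str:
        # A linear "compute graph": each agent refines the previous
        # agent's result in turn.
        result = mission
        for agent in self.assemble_team(needed_skills, caller_access):
            result = agent.run(result)
        return result
```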