Training AI Models Is a Bigger Prize Than Building Them, Oracle’s Ellison Says

Ellison pitches Oracle as the bridge between private data and giant ‘electronic brains,’ with a 50,000-GPU supercluster on the way.


    Oracle Corp. cofounder Larry Ellison said the most valuable opportunity in artificial intelligence is training and applying models at scale, not creating them, as the company courts customers with plans for massive GPU clusters on its cloud. 

    “The much larger opportunity isn’t the creation of the models themselves, it is the training of the models,” Ellison said in a keynote at Oracle AI World in Las Vegas on Tuesday, 14 October. “What will change the world is when we start using these remarkable electronic brains to solve humanity’s most difficult and enduring problems.” 

    Ellison said that “by helping us solve problems we are unable to on our own, there will be a better pedigree of scientists, engineers, teachers, chefs, bricklayers, and surgeons.” 

    “We’ve never built a tool like this,” he said.

    The Oracle chief said that while some fear AI will replace humans in their endeavours, “I don’t think that’s true.”

    Oracle’s Edge

    With a market capitalization of $865.56 billion, Oracle offers business-oriented products and services, including Oracle Database, a relational database management system (RDBMS). The Ellison-cofounded company sells database software, cloud computing services, and hardware.

    The software giant is a major participant in building data centers for AI training.

    Presently, models such as OpenAI’s ChatGPT, Google’s Gemini, and Anthropic’s Claude are trained on publicly available data from the internet. 

    “For these models to reach their peak value, you need to not train them just on publicly available data, but you need to make privately-owned data available to those models as well,” he said.

    This is where Oracle is betting big. “Most of the world’s high-value data is already in an Oracle database,” he said, adding that Oracle Database makes the data available to AI models for reasoning.

    He said people like xAI’s Elon Musk, Meta’s Mark Zuckerberg, and OpenAI’s Sam Altman are betting fortunes on building and training AI models.

    “We trained the very first version of Grok for Elon … Almost all of these AI models are in the Oracle Cloud,” Ellison said, claiming that Oracle is involved in training more multimodal models than any other competitor.

    Oracle is expanding its partnership with Advanced Micro Devices to stand up an AI “supercluster” using 50,000 Instinct MI450 GPUs, slated to be publicly available on Oracle Cloud Infrastructure in the third quarter of 2026, with expansion planned in 2027 and beyond. 

    Oracle said it will be the first hyperscaler to offer an MI450 supercluster to the public. 

    Ellison pitched Oracle’s data footprint and database stack as a bridge between enterprise data and AI systems, arguing that models trained only on public internet data miss business-critical context. 

    The remarks come as cloud providers race to secure accelerators and customers for large-scale AI workloads. 

    Industry reports highlighted that the Oracle-AMD buildout targets surging demand for training and inference capacity and will use AMD’s Helios rack design alongside Epyc CPUs and Pensando networking.
