Oracle Just Told Every AI Company Their Models Are Worthless
On today’s episode of “What Crazy AI News Can Shake Up the Week”: our boy Larry Ellison just stood in front of a room full of investors and essentially said, “All your models? Same thing. Different logos.”
And honestly? He’s not wrong. Quite frankly, I was wondering when someone was going to get around to calling this out (although I do believe Claude is in a league of its own).
ChatGPT, Gemini, Grok, Claude… they’re all eating from the same buffet. Wikipedia, Reddit, news articles, the entire public internet scraped and repackaged. Ellison called them commodities. That’s Oracle’s cofounder telling billion-dollar AI companies they built the same product four times. Cold.
So What’s the Play?
Ellison says the real gold is private data. The medical records, the banking data, the supply chain secrets locked inside Fortune 500 systems. And wouldn’t you know it, most of that data already lives inside Oracle databases. Convenient, right?
Oracle just dropped something called AI Database 26ai that lets AI models reason over a company’s private data using RAG (Retrieval-Augmented Generation). The AI doesn’t train on your data. It just searches it in real time. Your stuff never leaves the vault. And yes, I’m thinking what you’re thinking: none of this sounds kosher.
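To make the “searches it, doesn’t train on it” distinction concrete, here’s a minimal sketch of the RAG pattern in plain Python. This is illustrative only, not Oracle’s actual 26ai API: the toy bag-of-words “embedding,” the record names, and the function names are all my assumptions. The point it shows is that only the few records retrieved at query time get injected into the model’s prompt; the rest of the corpus stays put.

```python
# Minimal RAG sketch (hypothetical, NOT Oracle's 26ai API).
# The model never trains on the private corpus; relevant records
# are retrieved at query time and injected into the prompt.
from collections import Counter
import math

def embed(text):
    """Toy bag-of-words 'embedding' (real systems use vector models)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse token-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, corpus, k=2):
    """Rank private records by similarity; only the top-k leave storage."""
    q = embed(query)
    ranked = sorted(corpus, key=lambda doc: cosine(q, embed(doc)), reverse=True)
    return ranked[:k]

def build_prompt(query, corpus):
    """The LLM sees only the retrieved snippets, never the whole vault."""
    context = "\n".join(retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}"

# Hypothetical private records, standing in for a bank's loan book.
records = [
    "Loan 1042: customer in good standing, no missed payments",
    "Loan 2213: three missed payments in the last quarter",
    "HR memo: office picnic scheduled for June",
]

print(build_prompt("which loans have missed payments", records))
```

The privacy claim, such as it is, lives in `retrieve`: the model only ever sees the top-k snippets, not the database. Whether that counts as “never leaves the vault” depends entirely on who runs the retrieval and where the prompt goes next.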
A bank asks AI to analyze every loan without exposing customer records. A hospital runs diagnostics off full medical histories without catching a HIPAA violation. A defense contractor lets AI work across classified ops without data moving an inch.
On paper? Beautiful.
But Let’s Be Real
Are we seriously supposed to believe that people are going to be cool with large language models skimming through their most sensitive data? Even with guardrails? Even with the “it never leaves the vault” pitch?
Because we’ve seen this movie before. “Your data is safe.” “We don’t store anything.” “Trust us.” And then some breach hits the news and everybody acts shocked.

Look, I’m not saying the tech doesn’t work. RAG is legit. But there’s a Grand Canyon-sized gap between “technically secure” and “people actually trust it.” And right now, most folks don’t even trust these models to write a decent email, let alone rifle through their medical records.
The Numbers Are Wild Though
I’ll give Ellison this: Oracle is backing the talk with receipts. I checked it out with Claude, haha… $523 billion in contracted revenue still on the books. $300 billion of that from OpenAI alone. Cloud revenue hit $8 billion in one quarter. OCI grew 66%. GPU revenue up 177%.
Ellison called it the largest and fastest-growing market in history. The numbers don’t lie, even if the narrative feels a little too perfect (and wrong on many levels).
The Bottom Line
Private data is this year’s flavor of the month in AI. Last year it was model size. Before that it was training data. Before that it was GPUs. Every quarter there’s a new “this changes everything” moment.
Maybe Ellison’s right and whoever controls the database controls the future of AI. Or maybe we’re watching another hype cycle dress itself up as a revolution. All we have to do is wait for tomorrow’s crazy news.
Either way, I’d keep my hand on my wallet.
