🧠 Neural Dispatch: The AI maths is broken, and a bubble that many pretend not to see is deflating
The biggest AI developments, decoded. November 26, 2025.
Hello!
The numbers simply don't add up. And that is enough to warrant a temporary departure from the standard Neural Dispatch format this week. On November 20, Nvidia reported numbers for the third quarter of fiscal 2026. Record revenue of $57.0 billion, up 22% from Q2 and up 62% from a year ago, the tech giant says. "Blackwell sales are off the charts, and cloud GPUs are sold out. Compute demand keeps accelerating and compounding across training and inference, each growing exponentially. We've entered the virtuous cycle of AI. The AI ecosystem is scaling fast, with more new foundation model makers, more AI startups, across more industries, and in more countries. AI is going everywhere, doing everything, all at once," says Jensen Huang, founder and CEO of NVIDIA. If you looked at just this summary (as most would, given limited attention spans), you'd be impressed.
But it was Wall Street that read between the lines, and the Nasdaq as well as the S&P 500 slid… and then kept sliding. A snapshot: the Dow was down 0.8%, the S&P 500 down 1.6%, and the Nasdaq down 2.2% on the day. There was much more to Nvidia's earnings, which I'll attempt to summarise to save you time (and re-emphasise that to call this AI conversation a "bubble" wouldn't exactly be out of place).
Nvidia has almost $19.8 billion worth of unsold chips (primarily GPUs) sitting in warehouses, and that number is up significantly in three months: the inventory figure was around $10 billion then. There is a reason I quoted Huang in full in the previous paragraph. The official line is that demand is through the roof and supply is limited. But you know it as well as anyone: shortage claims cannot be true if there is so much inventory sitting around. It's common sense. The only question to ask is whether customers are simply not buying, or are buying without actually having the money to pay. Nvidia, for its part, explains this as a build-up for future demand.
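The scale of that inventory build-up is easy to quantify with the two figures above (the round $10 billion starting point is the newsletter's approximation, so treat the result as rough):

```python
# Quarter-over-quarter inventory growth, using the figures cited above:
# roughly $10 billion three months ago versus $19.8 billion now.
# The $10B figure is an approximation, so the result is indicative only.
prev_bn, curr_bn = 10.0, 19.8  # billions of dollars
growth_pct = (curr_bn - prev_bn) / prev_bn * 100
print(f"Inventory up ~{growth_pct:.0f}% in a single quarter")
```

In other words, unsold stock nearly doubled in three months, which is what makes the "supply is limited" framing so hard to square.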
Nvidia has also reported $33.4 billion in unpaid bills, or accounts receivable as the terminology goes, and that number is up 89% in a year. This means customers who bought chips haven't paid for them yet. The average payment window is now 53 days, instead of 46 days a year prior. That extra week represents a few more billion dollars, and when it may arrive is anyone's guess. A more concerning scenario for the chipmaker emerges if many of Nvidia's customers start to behave like under-capitalised AI startups struggling to pay for hardware, in case venture funding dries up or the AI bubble deflates before these companies achieve profitability.
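That 53-day payment window can be roughly reproduced from the reported numbers with the standard days-sales-outstanding formula (a back-of-the-envelope sketch: the 91-day quarter length and the use of period-end rather than average receivables are my simplifying assumptions):

```python
# Back-of-the-envelope days sales outstanding (DSO) check, using the
# figures cited above: $33.4B accounts receivable, $57.0B quarterly revenue.
# Assumptions: a 91-day quarter and period-end (not average) receivables.

def days_sales_outstanding(receivables_bn: float, revenue_bn: float,
                           days_in_period: int = 91) -> float:
    """Average number of days customers take to pay their bills."""
    return receivables_bn / revenue_bn * days_in_period

dso = days_sales_outstanding(33.4, 57.0)
print(f"Estimated DSO: {dso:.1f} days")  # ~53 days, matching the reported window
```

The estimate lands almost exactly on the 53 days in the filing, which suggests the number is simply receivables growing faster than revenue.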
Over the past month, we've seen absolutely brash attempts at a circular economy to keep the AI bubble pressurised somehow. The latest chapter also featured Nvidia, which is participating with its own $2.5 billion "investment" in Elon Musk's xAI, which wants to raise $20 billion. Reportedly, this would involve an interesting financing structure: a special purpose vehicle (SPV) that will purchase the Nvidia hardware, which xAI will then lease for a five-year term. This isn't the first circular AI deal, and it won't be the last.
Last time on Neural Dispatch: Microsoft's AI chaos, Perplexity's moment, and Firefox's cool approach
The bottom line is that the same pile of dollars is being circulated between different AI companies, each holding the other's hand in the hope that the bubble isn't discovered, while the money gets counted as revenue at each corporate stop this wad of cash makes on its journey.
American investor and hedge fund manager Michael Burry made a rather blunt post on X after Nvidia's earnings release. He wrote, "The idea of a useful life of depreciation being longer because chips from more than 3-4 years ago are fully booked confuses physical utilisation with value creation. Just because something is used doesn't mean it is profitable." He pointed out that airlines keep old planes around and in service, which come in handy during the festive-period rush, but they are only marginally profitable. Nvidia's CFO had pushed back on the GPU accounting (which I've explained above), saying in a statement that the useful life of Nvidia's GPUs is a significant total-cost-of-ownership advantage over rivals, and pointing to A100 GPUs shipped six years ago still being utilised at full capacity by customers. But it isn't that simple. The A100s consume as much as 3x more power per unit of compute (measured in FLOPS, or floating-point operations per second) than the H100s that followed, and the A100 is itself approximately 25x less power efficient than Blackwell-generation chips. A debate is raging: should depreciation be 3 years, 5 years, or 7 years? Compulsion more than choice?
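The stakes of that 3-vs-5-vs-7-year question are easy to see with simple straight-line depreciation (the $10 billion purchase below is a hypothetical round number for illustration, not a figure from Nvidia's or any customer's filings):

```python
# Straight-line annual depreciation expense for a hypothetical GPU purchase.
# The $10B cost is illustrative; the 3/5/7-year lives are the ones under debate.
cost_bn = 10.0  # hypothetical GPU fleet cost, in billions of dollars

for useful_life_years in (3, 5, 7):
    annual_expense_bn = cost_bn / useful_life_years
    print(f"{useful_life_years}-year life: ${annual_expense_bn:.2f}B expense per year")
```

Stretching the assumed life from 3 years to 7 cuts the annual expense from about $3.33 billion to about $1.43 billion, which is exactly why a longer useful life flatters reported profits, and why Burry calls the choice into question.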
Do check out my other newsletter, Wired Wisdom: Gemini 3 is here, EA's F1 realignment, and Windows 11's agentic disaster waiting to happen
THINKING
"We're doing a 500 megawatts, gigawatts… It's going to cost eight bazillion trillion dollars."
- Elon Musk, at the U.S.-Saudi Investment Forum 2025
With Elon Musk, it is difficult to know whether he was genuinely confused or it was just an artificially induced fog. But this was Musk introducing xAI's planned 500 MW AI data centre partnership with Saudi Arabia. And of course, it is powered by Nvidia, which is why CEO Jensen Huang was almost sweating when he said "stop it" as Musk stumbled between megawatt and gigawatt. Not a casual occasion to stumble, for the man many believe is the saviour of humanity (a mission that will, of course, also be powered by Nvidia, but I digress).
The Context: That is the whole AI bubble, condensed into one beautifully unhinged exchange. They thought no one would notice in the cloud of big numbers and excitement. A CEO who is raising tens of billions for compute doesn't know (or pretends not to know) the difference between megawatts and gigawatts, a slip of three orders of magnitude, since a gigawatt is 1,000 megawatts. Understandably, the CEO of the world's most valuable semiconductor company visibly panics, because the quiet part, that no one really knows where this is going or how much it will cost, has just been said out loud at an event filled with sovereign wealth funds. Has "fake it till you make it" morphed into "build it till the grid collapses and hope the ROI eventually materialises" for the AI era?
A Reality Check: We find ourselves at a moment where AI companies are committing trillions of dollars to data centres without a clear business model beyond "AGI will pay us back at some point." Power costs worldwide are going up, as is the demand for water. Something simply has to give, at some point. AI companies and startups are being funded with billions to be ready to buy GPUs that don't exist yet, to train models nobody knows how to monetise, for customers who aren't sure why they need them.
Musk's quote isn't a joke; it's as close as we'll ever get to an accidental confession from the AI bros. The AI boom today is powered by physics, marketing, and spreadsheets that print whatever number keeps the funding round alive. Nobody has any idea what true power requirements will be (that will, after all, depend on usage, and no one knows that either), how many chips are actually needed, or what the returns look like. And yet everyone keeps buying compute because everyone else is buying compute. This is how bubbles form: through collective delusion wrapped in technical jargon. And you can't blame Jensen for sweating, because this is a market built on curating expectations, and the worst possible thing is someone admitting they… don't know what they're talking about.
Neural Dispatch is your weekly guide to the rapidly evolving landscape of artificial intelligence. Each edition delivers curated insights on breakthrough technologies, practical applications, and strategic implications shaping our digital future.