According to CRN, Broadcom’s fiscal year 2025 was a blockbuster, with total revenue growing 24% to a record $64 billion. AI was the undisputed star, with AI semiconductor revenue exploding 65% year-over-year to $20 billion. CEO Hock Tan revealed that the previously unnamed customer who signed a $10 billion custom chip deal is AI giant Anthropic, and that Anthropic just placed another $11 billion order this past quarter. The company’s overall AI-related backlog for its custom XPU chips and networking gear is now a staggering $73 billion, which it expects to deliver over the next 18 months. For the current fiscal first quarter, Broadcom forecasts AI revenue to double year-over-year to $8.2 billion.
The AI Juggernaut Is Fully Fueled
Here’s the thing about these numbers: they aren’t hopeful projections. They’re already in the bank. That $73 billion AI backlog is booked business, and it represents nearly half of Broadcom’s total consolidated backlog of $162 billion. Tan even told analysts that the number will probably grow as more bookings come in. This isn’t just about selling chips to Nvidia’s competitors; it’s about being the foundational infrastructure provider for the entire AI data center build-out. Their custom AI accelerators (XPUs), networking switches like the Tomahawk 6, and optical components are all seeing “record” orders. They’re not just riding the wave; they’re selling the picks and shovels at an unprecedented scale.
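If you want to sanity-check the arithmetic, here is a minimal back-of-the-envelope sketch in Python that uses only the figures cited above. The prior-year baselines it prints are implied by the stated growth rates rather than numbers Broadcom reported directly, so treat them as rough approximations.

```python
# Back-of-the-envelope check on the reported Broadcom figures.
# All inputs are the numbers cited in the article (billions of USD);
# the "implied" baselines are derived from the stated growth rates.

total_fy2025 = 64.0      # total revenue, up 24% year over year
ai_fy2025 = 20.0         # AI semiconductor revenue, up 65% year over year
q1_ai_guide = 8.2        # fiscal Q1 AI revenue guidance, said to double year over year
ai_backlog = 73.0        # AI-related backlog, deliverable over ~18 months
total_backlog = 162.0    # total consolidated backlog

implied_total_fy2024 = total_fy2025 / 1.24     # implied prior-year total revenue
implied_ai_fy2024 = ai_fy2025 / 1.65           # implied prior-year AI revenue
implied_q1_ai_prior = q1_ai_guide / 2          # implied year-ago quarterly AI revenue
ai_backlog_share = ai_backlog / total_backlog  # AI slice of the total backlog

print(f"Implied FY2024 total revenue: ${implied_total_fy2024:.1f}B")
print(f"Implied FY2024 AI revenue:    ${implied_ai_fy2024:.1f}B")
print(f"Implied year-ago Q1 AI rev:   ${implied_q1_ai_prior:.1f}B")
print(f"AI share of total backlog:    {ai_backlog_share:.0%}")
```

The backlog share works out to roughly 45 percent, which is where the “nearly half” framing comes from.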
VMware’s Steady Growth And The Flat Reality Everywhere Else
While AI is going parabolic, the rest of Broadcom’s story is a tale of two businesses. The VMware acquisition is clearly paying off, with infrastructure software revenue up 26% for the year to $27 billion, driven by strong adoption of VMware Cloud Foundation. That’s solid, dependable growth. But then look at the non-AI semiconductor business—which includes things like broadband, wireless, and enterprise chips—and it’s basically flat, up a mere 2% in the quarter. Tan expects it to stay flat. This stark divide shows how completely the tech spending narrative has shifted. Enterprise and consumer electronics are in a rut, but if you’re selling anything for an AI data center, the checkbook is open. The whole growth story is a bet on that trend continuing unabated.
Shooting Down The Custom Chip Hype
One of the most interesting parts of the call was Tan directly addressing the hype that big AI players like Microsoft and others might design their own chips en masse. His message was basically: don’t believe it. He argued that developing a competitive custom accelerator is a “multi-year journey” and that LLM companies have to ask themselves where to best allocate resources. Why pour billions into silicon R&D when you have to compete against the relentless evolution of merchant GPUs from Nvidia and custom specialists like, well, Broadcom? He called the idea of customers widely tooling up to design their own silicon an “overblown hypothesis.” It’s a confident stance that defends their entire custom AI business model. He’s betting that even the biggest tech giants will find it more efficient to outsource this complex, fast-moving hardware problem.
What It All Means
So, what’s the takeaway? Broadcom has successfully leveraged its massive scale to become an AI infrastructure powerhouse. The Anthropic deal reveal underscores that they’re a trusted partner for the biggest private AI labs. And that $73 billion backlog provides insane visibility and de-risks their growth story for at least the next year and a half. But it also creates a dependency on that spending continuing. The question isn’t about 2026—that’s already set. The question is what happens after that $73 billion backlog is fulfilled. Does the AI spending frenzy keep going? Does the flat non-AI business ever recover? For now, though, Broadcom isn’t just participating in the AI boom. It’s building the very backbone of it, with tens of billions in orders already booked for doing so.
