A Chip Company Worth Watching Goes Public
Cerebras Systems is not a dark horse anymore — it’s a company with $510 million in revenue, a $20 billion deal with OpenAI, and an IPO filing that makes a pretty convincing case that the AI chip space has room for more than one serious player.
As someone who spends most of their time building bots and thinking about the infrastructure that makes them run, I’ve been watching Cerebras for a while. Not because of hype, but because the hardware question underneath every AI project is real. When you’re building anything that calls a model at scale, you start caring very quickly about what’s actually doing the compute — and who controls it.
The Numbers That Matter
Let’s start with what Cerebras actually reported. For 2025, the Sunnyvale-based company posted $510 million in revenue with a net income of $87.9 million. Strip out certain one-time items and that net income figure climbs to $237.8 million. That’s not a startup burning cash and hoping for the best. That’s a company that has figured out how to sell chips and make money doing it.
The IPO aims to raise roughly $2 billion. For context, Cerebras filed confidentially about six months before making the paperwork public — a move that suggests the company was watching market conditions carefully before committing. That kind of patience is actually a good sign. It means the people running this company are thinking strategically, not just chasing a valuation window.
The OpenAI Deal Changes the Conversation
The detail that really caught my attention was the OpenAI contract. Cerebras sealed a $20 billion deal to supply OpenAI with servers powered by its chips over three years. The specifics are significant: the contract calls for Cerebras to make 250 megawatts of compute capacity available each year between 2026 and 2028, with OpenAI holding the option to purchase an additional 1.25 gigawatts.
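As a quick sanity check on the scale involved, the reported figures can be totaled. A minimal sketch, assuming the 250 MW yearly tranches are cumulative (the filing summary as reported leaves this ambiguous):

```python
# Back-of-the-envelope arithmetic on the reported Cerebras/OpenAI contract.
# Assumption: the 250 MW made available each year from 2026-2028 stacks,
# which the reporting does not confirm; treat these totals as illustrative.

BASE_MW_PER_YEAR = 250   # reported committed capacity per year
YEARS = 3                # 2026 through 2028
OPTION_GW = 1.25         # additional capacity OpenAI may optionally purchase

base_total_mw = BASE_MW_PER_YEAR * YEARS      # 750 MW under the assumption above
option_mw = OPTION_GW * 1000                  # 1250 MW optional
max_total_gw = (base_total_mw + option_mw) / 1000

print(f"Committed: {base_total_mw} MW; with option exercised: {max_total_gw} GW")
# → Committed: 750 MW; with option exercised: 2.0 GW
```

Even under conservative readings, the option alone — 1.25 gigawatts — is on the order of a large power plant's output, which is why this reads as an infrastructure commitment rather than a pilot.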
That’s not a pilot program. That’s a long-term infrastructure commitment from one of the most resource-hungry AI companies on the planet. When OpenAI signs a multi-year, multi-billion dollar hardware deal with you, it signals something important — that your chips are good enough to run serious workloads, and that a major buyer is willing to bet their compute roadmap on you.
For bot builders and developers, this matters more than it might seem. The chips powering inference at scale directly affect latency, cost, and availability. More competition in the chip space — real competition, backed by real revenue — is good for everyone building on top of these models.
What This Means for the AI Infrastructure Space
Nvidia has dominated AI compute for years, and that dominance is real. But Cerebras has been building a different kind of chip — the Wafer Scale Engine, which takes a fundamentally different architectural approach by building one giant processor out of an entire silicon wafer instead of dicing it into many smaller dies. The result is a processor with dramatically more on-chip memory and bandwidth than traditional GPU designs.
Whether that architecture wins long-term is a separate question. What the IPO filing tells us is that it’s winning enough contracts right now to build a serious business. And a serious business going public means more scrutiny, more transparency, and more pressure to keep delivering.
A Builder’s Take
From where I sit — writing code, wiring up APIs, thinking about how bots actually run in production — the Cerebras IPO is a signal worth paying attention to. Not because you need to buy the stock, but because it confirms that the AI hardware market is maturing fast.
We’re moving out of the phase where one company controls the entire compute stack and into something more competitive. That means more options for cloud providers, more pressure on pricing, and eventually more choices for developers who care about where their inference is actually running.
Cerebras going public with $510 million in revenue and a locked-in deal with OpenAI isn’t just a financial story. It’s a sign that the infrastructure layer of AI is getting more interesting — and more contested — by the month. For anyone building in this space, that’s worth keeping an eye on.
Related Articles
- Bot Development Tips: Lessons Learned From 12 Bots
- Huawei’s FP4 Flex: Why Bot Builders Should Care About Atlas 350
- Anthropic’s $30 Billion Run Rate Means Your Bot Infrastructure Just Got More Expensive
- Runway Puts Its Money Where Its Models Are With $10M Fund for AI Video Creators