Cerebras Launches $2B IPO Roadshow on Nasdaq

Cerebras Systems has kicked off a $2 billion IPO roadshow targeting a Nasdaq listing under ticker CBRS, anchored by a $10 billion compute contract with OpenAI.

Cerebras Systems has started its IPO roadshow, targeting a Nasdaq listing this month under the ticker CBRS. The company is seeking to raise $2 billion at a valuation between $22 billion and $25 billion - roughly triple the $8.1 billion it commanded just six months ago. Morgan Stanley is leading the underwriting.

TL;DR

  • $2 billion raise targeted at a $22-25 billion valuation; Nasdaq listing expected April 2026 under ticker CBRS
  • Anchored by a $10 billion compute contract with OpenAI covering 750 megawatts through 2028 - the largest AI infrastructure deal ever awarded to a non-Nvidia supplier
  • CFIUS concern over UAE investor G42 blocked the original October 2025 IPO attempt; now cleared after G42 was removed from the cap table
  • Latest Series H (February 2026) raised $1 billion at a $23 billion valuation, led by Tiger Global

How Cerebras Got to the Roadshow

The story of this IPO is really a story about two things: a chip that doesn't look like other chips, and a contract that changed the company's financials overnight.

The Wafer Scale Engine 3 - WSE-3 - occupies an entire 300mm silicon wafer. Conventional processors are cut from wafers into individual dies; Cerebras skips that step entirely. The result is a chip with 900,000 AI-optimized cores and 44 gigabytes of on-chip SRAM, compared with roughly 0.05 gigabytes for an Nvidia H100. Cerebras says the WSE-3 delivers performance equivalent to about 62 H100 GPUs. The critical advantage isn't raw compute - it's latency: with everything on one die, there is no inter-chip communication to wait on.

That architecture is well suited for inference at scale, which is exactly what OpenAI needed.

Metric                 Cerebras WSE-3     Nvidia H100
Die size               46,225 mm²         826 mm²
AI cores               900,000            528 tensor + 16,896 CUDA
On-chip memory         44 GB SRAM         ~0.05 GB
Memory bandwidth       21 PB/s            ~0.003 PB/s
Peak AI perf (FP16)    125 PetaFLOPS      ~2 PetaFLOPS
Equivalent H100s       ~62                1
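
The headline ratios in the spec table can be reproduced from the raw figures as reported; a quick sketch (numbers taken directly from the table above, rounded as published):

```python
# Back-of-envelope checks on the reported WSE-3 vs. H100 specs.
wse3 = {"sram_gb": 44, "bw_pbs": 21, "fp16_pflops": 125}
h100 = {"sram_gb": 0.05, "bw_pbs": 0.003, "fp16_pflops": 2}

equiv_h100s = wse3["fp16_pflops"] / h100["fp16_pflops"]  # ~62x on peak FP16
sram_ratio = wse3["sram_gb"] / h100["sram_gb"]           # 880x on-chip SRAM
bw_ratio = wse3["bw_pbs"] / h100["bw_pbs"]               # 7,000x memory bandwidth

print(f"~{equiv_h100s:.0f} H100 equivalents, "
      f"{sram_ratio:.0f}x SRAM, {bw_ratio:.0f}x bandwidth")
```

The "equivalent H100s" figure is a peak-FP16 comparison only; real workload equivalence depends on utilization and memory behavior.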

On January 14, 2026, Cerebras announced a compute deal with OpenAI valued at over $10 billion - 750 megawatts of capacity running through 2028. OpenAI described the arrangement as a way to speed responses for tasks that currently require more processing time. Andrew Feldman, Cerebras CEO, put it plainly: "broadband transformed the internet, real-time inference will transform AI."
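
A rough way to size the deal, assuming the headline figures of $10 billion and 750 megawatts spread evenly over a roughly three-year term (the actual contract structure has not been disclosed):

```python
# Implied unit economics of the reported OpenAI contract.
# Assumes even spread of the headline $10B over 750 MW and ~3 years;
# actual terms are not public.
total_value_usd = 10e9
capacity_mw = 750
years = 3  # roughly early 2026 through 2028

per_mw = total_value_usd / capacity_mw  # ≈ $13.3M per megawatt over the term
per_mw_year = per_mw / years            # ≈ $4.4M per megawatt-year

print(f"${per_mw / 1e6:.1f}M per MW, ${per_mw_year / 1e6:.1f}M per MW-year")
```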

That deal closed the narrative gap that had made earlier investors nervous. Before it, Cerebras was a compelling technology story with a customer concentration problem.

The G42 Problem - and Its Resolution

The first IPO attempt, withdrawn on October 3, 2025, collapsed over the company's ties to G42, a UAE-based technology conglomerate with links to Abu Dhabi sovereign wealth and, historically, to Chinese technology firms. At the time, G42 accounted for 83 to 87 percent of Cerebras revenue in the first half of 2024. The Committee on Foreign Investment in the United States flagged the relationship as a potential channel for advanced AI chips to reach sanctioned parties.

Cerebras resolved the issue by restructuring G42's stake to non-voting shares and removing the investor from the updated S-1 filing entirely. CFIUS cleared the transaction, and the company revived its public listing plans in December 2025, followed quickly by the $1 billion Series H in February 2026.

Who Benefits

The obvious winners are the early investors sitting on large markups. Benchmark Capital has backed Cerebras since the early rounds. AMD invested strategically - the company has an interest in seeing Nvidia alternatives succeed. Tiger Global led the Series H at the $23 billion valuation and will be looking for the IPO to validate that number or exceed it. Fidelity, Coatue, and Altimeter Capital also participated in recent rounds.

Image: the Cerebras WSE-3, a full wafer-scale processor held by a cleanroom-suited engineer. The WSE-3 occupies an entire 300mm silicon wafer - the full disc visible here - rather than being diced into individual chips like conventional processors. Source: spectrum.ieee.org

For the broader AI chip market, a successful Cerebras IPO matters beyond the company itself. Rebellions raised $400 million ahead of its own public listing, and a string of chip startups closed billion-dollar funding rounds in rapid succession last year. Cerebras going out successfully would set a pricing benchmark and signal that public markets are willing to value AI infrastructure companies at frontier multiples, not just the big hyperscalers.

The OpenAI relationship is the most durable structural advantage. A three-year compute contract worth over $10 billion isn't a pilot. It's a revenue anchor. That single deal shifts the narrative from "impressive technology, unclear business model" to "high-growth infrastructure supplier with a named anchor customer."

As Daniel covered previously, OpenAI has been actively shopping its inference workloads - Cerebras was among the vendors tested, and it evidently impressed. The $10 billion contract is what came out of that evaluation.

Who Pays

Public market investors will be buying in at roughly 85 times Cerebras' annualized revenue run rate as of mid-2024, based on about $272 million in revenue growing at 245 percent year-over-year. That's an aggressive multiple even by AI standards.
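
The ~85x figure follows directly from the numbers cited above; a quick check (using $23 billion as a rough midpoint of the $22-25 billion target range):

```python
# Reproducing the ~85x revenue multiple from the reported figures.
valuation_usd = 23e9   # near the midpoint of the $22-25B target range
run_rate_usd = 272e6   # annualized revenue run rate as of mid-2024

multiple = valuation_usd / run_rate_usd
print(f"~{multiple:.0f}x annualized revenue run rate")  # ~85x
```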

Image: Cerebras CEO Andrew Feldman. Feldman co-founded Cerebras after selling his previous company, SeaMicro, to AMD in 2012. He has described the WSE-3's inference speed advantage as structurally important for next-generation AI applications. Source: datascience.uchicago.edu

The customer concentration issue has improved but hasn't disappeared. G42 is gone from the cap table, but OpenAI now represents the dominant revenue relationship. If that contract renews on worse terms in 2028, or if OpenAI decides to build its own inference infrastructure - which it has shown every sign of wanting to do - Cerebras faces a cliff. Investors buying at $23 billion are basically pricing in diversification that hasn't happened yet.

The competitive picture is also more complex than the roadshow narrative suggests. Nvidia isn't standing still. The H200 and forthcoming Blackwell Ultra chips close some of the inference speed gap. AMD and Intel are both pushing hard on inference acceleration, and both have deeper relationships with existing enterprise customers. Groq, which has been running inference-focused chips at commercial scale for longer, is a direct competitor for the same OpenAI-type workload. Cerebras' edge is real - but it exists at the frontier of what these workloads require today, not necessarily tomorrow.

There's also the question of what $2 billion in proceeds actually buys. Manufacturing at wafer scale is expensive. TSMC fab time for 300mm full-wafer production doesn't come cheap, and yield management at that die size is an engineering problem that never fully goes away. Capital intensity will remain high.

"Broadband transformed the internet, real-time inference will transform AI." - Andrew Feldman, CEO, Cerebras Systems

Counter-Argument

The bear case is real but it may be overstated. The WSE-3's architecture solves a problem - inter-chip latency at inference scale - that becomes more acute as models get larger and reasoning chains get longer. The OpenAI deal isn't just revenue; it's a validation signal that the largest AI buyer in the world chose Cerebras for a three-year strategic commitment. At $2 billion raised, the company has time to diversify its customer base before the OpenAI contract comes up for renewal.

What the Market Is Missing

The IPO valuation debate misses the structural shift underneath it. This isn't a chip company going public - it's a bet on a specific theory about how AI workloads will be served over the next five years. If inference becomes the dominant workload over training (and the direction of the field points that way), then Cerebras' wafer-scale architecture has a persistent advantage that conventional GPU clusters can't easily replicate. The $23 billion number reflects that theory more than the current revenue.

Whether the public market agrees will be visible in the first day of trading.


Sources: Cerebras IPO overview - accessipos.com | Cerebras targets April IPO to raise $2B - aicerts.ai | Cerebras IPO analysis - techi.com | OpenAI signs $10B Cerebras compute deal - TechCrunch | Cerebras 2026 IPO filing analysis - IndexBox | Cerebras revives IPO after CFIUS clearance - Tech Startups | WSE-3 chip specs - Tom's Hardware

About the author

AI Industry & Policy Reporter

Daniel is a tech reporter who covers the business side of artificial intelligence - funding rounds, corporate strategy, regulatory battles, and the power dynamics between the labs racing to build frontier models.