
Pretty darn strong. Here's the company's quarterly results breakdown:
In the table, we can see greater than 100% growth from the December 2024 quarter to the December 2025 quarter. We also see gross margin improvement in recent quarters, though Cerebras did manage better gross margin results in early 2024, when it was far smaller.
Setting aside earlier quarters muddied by accounting wiggles, Cerebras's most recent two quarters yielded incredibly impressive growth (+31% and +26% sequentially, respectively), with net losses that are more than tolerable for a company as close to the cutting edge of AI compute as Cerebras appears to be.
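For a sense of how those sequential gains stack up, here's a back-of-the-envelope sketch. It uses only the two growth rates quoted above; the starting revenue figure is an arbitrary index, not a reported number:

```python
# Compound the two reported sequential growth rates.
# base_revenue is a placeholder index, not actual reported revenue.
base_revenue = 100.0

q1 = base_revenue * 1.31   # +31% sequential growth
q2 = q1 * 1.26             # +26% sequential growth on top of that

total_growth = q2 / base_revenue - 1
print(f"Two-quarter growth: {total_growth:.1%}")  # roughly +65%
```

Two quarters of 30-ish percent sequential growth compound to roughly 65% in half a year, which is the kind of curve IPO investors pay up for.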
Comparing calendar 2024 with 2025 yields the following metrics:
We can see from the numbers that both sides of Cerebras's business are doing well. People want to buy its chips, and the company is seeing quickly rising demand for use of its chips to handle inference. That's a double threat.
Wait, but what about customer concentration? Looking backwards, Cerebras has not done a brilliant job diversifying its customer base. Looking forward, it has.
Let me explain. If we read the S-1 filing regarding 2025 results, the picture is about as bleak as it was back in 2024:
A substantial portion of our revenue is driven by a limited number of customers. Group 42 Holding Ltd (together with its affiliates, "G42") accounted for 24.0% and 85.0% of our total revenue for the years ended December 31, 2025 and 2024, respectively, and in the year ended December 31, 2025, Mohamed bin Zayed University of Artificial Intelligence ("MBZUAI") accounted for 62.0% of our total revenue.
While I don't want to overstate my knowledge of the inner workings of the Emirati economy, it is worth mentioning that Peng Xiao is both Group CEO of G42 and a member of the MBZUAI board of trustees. Other people also hold roles at both enterprises. So when we consider Cerebras's 2024 and 2025 results, we see revenue that is incredibly concentrated not merely in the MENA region, or even the UAE, but in Abu Dhabi industry itself.
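To put that concentration in plain numbers, using only the percentages disclosed in the S-1 excerpt above:

```python
# Revenue concentration from the S-1 figures quoted above.
g42_2025 = 0.24      # G42 share of 2025 revenue
mbzuai_2025 = 0.62   # MBZUAI share of 2025 revenue
g42_2024 = 0.85      # G42 share of 2024 revenue

abu_dhabi_2025 = g42_2025 + mbzuai_2025
print(f"2025 Abu Dhabi-linked share: {abu_dhabi_2025:.0%}")  # 86%
print(f"2024 G42 share: {g42_2024:.0%}")                     # 85%
```

In other words, swapping one dominant Abu Dhabi-linked buyer for two barely moved the needle: 85% concentration in 2024 became 86% across the pair in 2025.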
Looking ahead, the picture changes rapidly. In December of 2025, Cerebras signed a massive deal with OpenAI. Announced publicly in January of this year, "OpenAI and Cerebras have signed a multi-year agreement to deploy 750 megawatts of Cerebras wafer-scale systems to serve OpenAI customers." Per OpenAI, the capacity will come online in tranches.
Cerebras also signed a deal in March with Amazon Web Services (AWS), which will see the cloud platform "become the first hyperscaler to deploy Cerebras systems in its data centers." The deal includes the creation of a "co-designed, disaggregated inference-serving solution that will integrate AWS Trainium3 chips with Cerebras CS-3 systems, connected via high-bandwidth networking, to partition inference workloads across Trainium3 and CS-3." Sounds great. If you want access to market demand, being present in AWS is a big deal. (Just ask OpenAI!)
The OpenAI deal has big bones. The AWS agreement could matter, too. Cerebras notes that it has $24.6 billion worth of remaining performance obligations (RPOs), with a "significant amount of the balance [being] attributable to the Company's obligations pursuant to a master relationship agreement with OpenAI."
Does this resolve the revenue concentration concerns? Partially! Deals with OpenAI and AWS certainly make Cerebras less reliant on its historically critical MENA customers. But the proof will come from its revenue diversifying in practice (results), and not merely in theory (forecasts).
When will we learn more? Cerebras likely waited to refile to go public until both its OpenAI and AWS deals were locked in. The company didn't want a repeat of its first run at the public markets.
Thanks to its IPO refiling timing, we can expect Cerebras to provide some information about its Q1 2026 results before it prices. That means newer, fresher information on the OpenAI deal's impact on its results, if any. We'll still be staring at the very first months of the arrangement, meaning we might not see much revenue from it at this juncture.
What's the real bet here? That purpose-built chips for handling AI inference become more popular over time. While the venerable GPU has a lot going for it, we're seeing major clouds build their own chips (Amazon, Google, Microsoft, Meta) for a reason. Yes, derisking away from a single supplier is a goal. But so too are chips that are more efficient at a specific AI task, not merely performant across all of them.
The underlying bet to that wager is that demand for AI compute continues to scale. As we discussed this morning, the compute crunch is showing few signs of loosening. How long the world will prove compute-constrained is up to your judgment, and your interest in snapping up Cerebras shares in its IPO will likely hinge on how bullish you are on future compute demand.
All told, Cerebras's bet on big fucking chips is coming good, and the company has a solid shot at real revenue diversification in the coming years. Precisely how to price Cerebras we can leave to the market. But I don't think it will take too long to get Cerebras public, and its backers liquid.