
There are literally dozens of startups gunning for a piece of the AI semiconductor pie that Nvidia has had almost entirely to itself. The vast majority of these companies will never ship a product or will be acquired by a bigger player, perhaps even Nvidia.
But if there is one company positioned to give Nvidia a run for its money, it is Cerebras Systems. The company's Wafer Scale Engine chip is the size of a pizza box and has 850,000 cores. By contrast, an Nvidia GPU has about 5,000 cores and is about two inches square.
The cores are called Sparse Linear Algebra Compute (SLAC) cores. They are designed specifically for AI work, optimized for the kind of math that dominates neural network computation, and they do nothing else. That makes them a one-trick pony, but it's a good trick.
Cerebras's claim to fame is parallel processing: 850,000 cores on one chip about 10 inches square, all connected by a high-speed on-chip interconnect. AI training depends heavily on parallelism, putting it squarely in Cerebras's wheelhouse.
Sure, you can build an Nvidia cluster with 850,000 cores, but you'd need a data center the size of a sports arena and a nuclear power plant to run it. Nvidia servers top out at eight GPUs. Beyond that, if a training job needs more cores, it has to reach another physical machine over the network, and that slows things down. Network links are fast, but they can't match an on-chip interconnect, so they become the bottleneck.
The result is tremendous speed. In 2024, Cerebras announced a benchmark in molecular dynamics simulation: according to data from third-party benchmark firm Artificial Analysis, a single Cerebras CS-2 system built around one WSE-2 chip ran 748x faster than Frontier, then the world's fastest supercomputer.
Mind you, that benchmark ran on a single CS-2 system in a single rack drawing 27 kilowatts of power. Frontier has some 37,000 GPUs and CPUs spread across rows of cabinets and consumes 21 megawatts. It's not even close.
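The efficiency gap is worth spelling out. A quick back-of-the-envelope calculation using only the figures cited above (not an independent benchmark):

```python
# Rough performance-per-watt comparison using the numbers from the article.
cs2_speedup = 748            # CS-2 vs. Frontier on the simulation benchmark
cs2_power_kw = 27            # single-rack CS-2 power draw, kilowatts
frontier_power_kw = 21_000   # Frontier's 21 MW, expressed in kilowatts

power_ratio = frontier_power_kw / cs2_power_kw   # Frontier draws ~778x more power
perf_per_watt = cs2_speedup * power_ratio        # combined work-per-watt advantage

print(f"Frontier draws ~{power_ratio:.0f}x more power")
print(f"CS-2 delivers roughly {perf_per_watt:,.0f}x more work per watt on this benchmark")
```

In other words, on this one benchmark the CS-2 was not just 748x faster; it was faster while drawing a tiny fraction of the power, a combined per-watt advantage in the hundreds of thousands.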
Now Cerebras is going public at a claimed $23 billion valuation, backed by a $20 billion Master Relationship Agreement with OpenAI for 750 MW of inference compute capacity. And unlike a lot of AI startups, Cerebras has revenue to show for its work: 76% year-over-year growth in 2025, to $510 million.
However, the devil is in the details. The vast majority of Cerebras's revenue comes from just four sources, starting with two related entities in the United Arab Emirates: the Mohamed bin Zayed University of Artificial Intelligence (MBZUAI), which accounted for 62.0% of 2025 revenue, and Group 42 (G42), which accounted for another 24.0%.
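Putting those percentages in dollar terms makes the concentration plain. Simple arithmetic on the figures above:

```python
# Revenue concentration, using the 2025 figures from the article ($ millions).
total_revenue_m = 510.0
mbzuai_share = 0.62   # Mohamed bin Zayed University of Artificial Intelligence
g42_share = 0.24      # Group 42

uae_share = mbzuai_share + g42_share          # 0.86 -> 86% from two related UAE entities
uae_revenue_m = uae_share * total_revenue_m   # ~$438.6M of $510M

print(f"UAE-linked share: {uae_share:.0%} (~${uae_revenue_m:.1f}M of ${total_revenue_m:.0f}M)")
```

Two related entities in one country account for roughly 86 cents of every revenue dollar.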
This makes Cerebras extremely vulnerable to instability in the Middle East. If the UAE were drawn into a conflict with Iran, that could damage this business, even though Cerebras's work there has nothing to do with energy production.
The other revenue sources are OpenAI, which signed the multi-year, $20 billion agreement with Cerebras, and Amazon Web Services, whose contribution was not disclosed. It goes without saying that no vendor wants to be beholden to just four customers, especially when two of them are in one of the most unstable parts of the world.
Its finances appear stable, but appearances can deceive. Cerebras reported $237.8 million in GAAP net income for 2025, a dramatic improvement over the $481.6 million loss in 2024. But it achieved this through a one-time, non-cash accounting gain related to one of its customers. Exclude that paper gain and the company actually posted a non-GAAP net loss of $75.7 million.
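The size of that paper gain can be inferred from the two numbers reported above. A quick sketch of the arithmetic:

```python
# Implied one-time gain, from the 2025 figures in the article ($ millions).
gaap_net_income_m = 237.8     # reported GAAP net income
non_gaap_result_m = -75.7     # non-GAAP net loss (one-time items excluded)

# The gap between the GAAP and non-GAAP figures is the one-time, non-cash gain.
implied_one_time_gain_m = gaap_net_income_m - non_gaap_result_m

print(f"Implied one-time gain: ~${implied_one_time_gain_m:.1f}M")
```

Roughly $313 million of that reported net income, then, came from a single non-cash item rather than from operations.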
The OpenAI deal is complicated. Under the Master Relationship Agreement (MRA) signed in 2025 and valued at over $20 billion, OpenAI is contractually obligated to purchase 750 MW of AI inference compute capacity, with an option to expand to 2 GW by 2030.
Cerebras has potent technology. What it doesn't have is a broad customer base or reliable income. An IPO in that state is a very risky proposition, and being a public company is a distraction Cerebras does not need when it should be focused on getting its customer count out of the single digits.