Cerebras IPO: Nvidia Competitor Goes Public
In recent years, the AI revolution has introduced new players and exciting technological solutions to the semiconductor industry. Among the most promising is Cerebras Systems, a California-based startup that recently announced its intention to go public.

But what makes this company so special, and why does it pose a potential challenge to Nvidia's dominance in the AI chip market?
Who is this new player on the market?
Cerebras Systems is an innovative AI chip manufacturer known for its "wafer-scale engine" (WSE) technology. (The term "engine" is used figuratively here: the WSE is the "driving force" behind the computation.) The company recently filed for an initial public offering (IPO) and plans to list on the Nasdaq under the ticker symbol "CBRS".
This move signifies entry into an extremely competitive market where Cerebras must contend not only with giants like Nvidia but also with cloud service providers developing their own AI chips.
How is Cerebras performing financially?
Cerebras Systems' financial performance shows impressive growth. Let's look at the numbers:
- In 2022, sales were $24.62 million.
- In 2023, this amount jumped to $78.74 million.
- In the first half of 2024 alone, revenue reached $136.4 million.
This means they achieved a 220% revenue increase from 2022 to 2023!
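For readers who want to verify that figure, the growth rate follows from a one-line calculation; the sketch below simply plugs in the revenue numbers quoted above.

```python
# Revenue figures quoted above, in millions of USD.
revenue_2022 = 24.62
revenue_2023 = 78.74

# Year-over-year growth: (new - old) / old, expressed as a percentage.
growth_pct = (revenue_2023 - revenue_2022) / revenue_2022 * 100
print(f"2022 -> 2023 revenue growth: ~{growth_pct:.0f}%")  # ~220%
```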
However, like many rapidly growing tech companies, Cerebras is still operating at a loss. The good news is that these losses are showing a decreasing trend:
- In the first half of 2023, the net loss was $77.8 million.
- In the same period of 2024, this decreased to $66.6 million.
The gross profit margin has also improved: it was 11.7% in 2022, rising to 33.5% in 2023.
An interesting fact is that a significant portion of Cerebras' revenue comes from a single customer. G42, a company based in the United Arab Emirates, accounted for 87% of revenue in the first half of 2024. This company placed a massive order worth $1.43 billion for Cerebras products through the end of 2025.
What makes Cerebras' technology special?
Cerebras' latest product, the WSE-3 (Wafer-Scale Engine 3), appears to be a truly revolutionary solution. But what does this mean exactly?
Imagine a traditional computer chip is like an apartment in a building. The Cerebras WSE-3, in comparison, is like converting the entire building into one single, enormous apartment!
A Wafer-Scale Engine (WSE) is a specialized, high-performance computer chip that utilizes the entire area of a semiconductor wafer, instead of being diced into multiple smaller chips. Traditional methods involve cutting a wafer into many smaller chips (dies) that operate separately. In contrast, the WSE functions as a single massive chip, providing significantly increased computational power and speed, particularly for artificial intelligence (AI), machine learning, and big data processing.
Some impressive stats about the WSE-3:
- It contains 900,000 AI-optimized cores, roughly 52 times as many as a leading GPU (Graphics Processing Unit).
- It packs 4 trillion (!) transistors onto a single chip measuring 46,225 mm², about 57 times the area of a leading GPU die.
What does this mean in practice? The Cerebras chip can deliver performance on a single device that previously required entire server farms. This not only saves space and energy but could potentially revolutionize the training and operation of large AI models.
| Metric | WSE-3 | Nvidia H100 | Cerebras Advantage |
|---|---|---|---|
| Chip Size | 46,225 mm² | 814 mm² | ~57x |
| Cores | 900,000 | 16,896 FP32 + 528 Tensor | ~52x (AI-optimized) |
| On-Chip Memory (SRAM) | 44 GB | ~50 MB (L2 cache) | ~880x |
| Memory Bandwidth | 21 PB/s | ~3 TB/s | ~7,000x |
| Fabric Bandwidth | 214 Pb/s | 900 GB/s (NVLink 4) | ~30,000x* |

*Fabric bandwidth spans the whole wafer, while NVLink 4 is a chip-to-chip link, so the two are not directly comparable; the ratio shown is simply the listed figures divided out (214 Pb/s ≈ 26.75 PB/s vs. 0.9 TB/s) and should be read as indicative only.
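The "Cerebras Advantage" column follows directly from the raw specs once the units are normalized. Below is a minimal back-of-the-envelope check in Python, using only the figures listed in the table (not official vendor comparisons); as noted above, the fabric-vs-NVLink ratio in particular mixes two different kinds of interconnect.

```python
# Raw specs taken from the comparison table above.
wse3_area_mm2   = 46_225
wse3_cores      = 900_000
wse3_sram_gb    = 44
wse3_mem_bw_tbs = 21_000          # 21 PB/s expressed in TB/s
wse3_fabric_tbs = 214_000 / 8     # 214 Pb/s -> ~26,750 TB/s

h100_area_mm2   = 814
h100_cores      = 16_896 + 528    # FP32 CUDA cores + Tensor cores
h100_sram_gb    = 0.05            # ~50 MB L2 cache
h100_mem_bw_tbs = 3               # ~3 TB/s HBM
h100_nvlink_tbs = 0.9             # 900 GB/s NVLink 4

print(f"Die area:          ~{wse3_area_mm2 / h100_area_mm2:.0f}x")     # ~57x
print(f"Cores:             ~{wse3_cores / h100_cores:.0f}x")           # ~52x
print(f"On-chip memory:    ~{wse3_sram_gb / h100_sram_gb:.0f}x")       # ~880x
print(f"Memory bandwidth:  ~{wse3_mem_bw_tbs / h100_mem_bw_tbs:.0f}x") # ~7,000x
print(f"Fabric vs. NVLink: ~{wse3_fabric_tbs / h100_nvlink_tbs:.0f}x") # ~29,722x
```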
How does the competition stack up in the AI chip market?
The AI chip market has become a real battleground in recent years. Although Nvidia remains the dominant player, it faces numerous challengers:
- AMD plans to sell $2 billion worth of AI chips in 2024.
- Intel, Google, and Qualcomm are also developing their own AI-optimized processors.
- Cloud service providers like Amazon, Microsoft, and Google are developing AI chips in-house to reduce dependence on external suppliers.
The market is poised for huge growth: estimates suggest it could reach a value of $311.58 billion by 2029, representing an annual growth rate of 20.4%.
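As a rough illustration of what that forecast implies, compounding backwards from the 2029 figure at the quoted 20.4% rate gives the approximate starting market size; the five-year horizon (2024 to 2029) is an assumption made here purely for the sake of the example.

```python
# Quoted forecast: ~$311.58 billion by 2029 at a 20.4% annual growth rate.
market_2029 = 311.58    # billions of USD
cagr = 0.204
years = 5               # assumed horizon: 2024 -> 2029

implied_2024 = market_2029 / (1 + cagr) ** years
print(f"Implied 2024 market size: ~${implied_2024:.0f}B")  # ~$123B
```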
The Asia-Pacific region is expected to be the fastest-growing region for AI chips, thanks to investments from countries like China, South Korea, and Japan.
Summary
Cerebras Systems' move to go public is an exciting development in the world of AI technology. With their innovative solution, they could pose a real challenge to Nvidia, although the company has yet to turn a profit. What is certain is that the AI chip market is undergoing revolutionary changes, and the coming years promise exciting developments in this field.