Highlights:

  • The company claims that the WSE-3 enables customers to train AI models more quickly than traditional chips allow.
  • Cerebras’ WSE-3 enhances the efficiency of AI clusters by minimizing data movement.

Cerebras Systems Inc., a startup specializing in wafer-sized chips for artificial intelligence applications, has filed to go public.

This move comes as no surprise. In July, Cerebras submitted a confidential draft of the filing to the Securities and Exchange Commission. Last week, sources indicated that the company aims to raise between USD 750 million and USD 1 billion in its initial public offering, with a valuation of up to USD 8 billion.

Cerebras’ flagship product, the WSE-3 AI chip, contains four trillion transistors arranged into nearly 1 million cores, along with 44 gigabytes of high-speed SRAM. Cerebras claims that the chip offers around 50 times more cores and 880 times more memory than the largest commercially available graphics card.
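A quick back-of-envelope check shows what those multipliers imply about the comparison baseline. This is a sketch based only on the figures quoted above; it assumes the cores and memory are being compared like-for-like, which the filing excerpt here does not spell out.

```python
# Figures quoted in the article (approximate).
wse3_cores = 900_000   # "nearly 1 million cores"
wse3_sram_gb = 44      # 44 GB of on-chip SRAM

# Baseline implied by the claimed multipliers.
baseline_cores = wse3_cores / 50              # "around 50 times more cores"
baseline_mem_mb = wse3_sram_gb * 1024 / 880   # "880 times more memory"

print(f"Implied GPU baseline: {baseline_cores:,.0f} cores, "
      f"~{baseline_mem_mb:.0f} MB on-chip memory")
```

The implied baseline of roughly 18,000 cores and ~50 MB of on-chip memory is in the general range of a current flagship data-center GPU.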

The company asserts that the WSE-3 can enable customers to train AI models more quickly than traditional chips allow. Additionally, it promises to deliver improvements in power efficiency.

Advanced AI models usually operate on multiple graphics processing units (GPUs). In training and inference tasks, these GPUs frequently need to exchange data with one another. The speed of data transfer between the individual chips directly influences the performance of the overall AI cluster: the quicker a piece of data reaches a GPU, the sooner processing can commence.

Data movement is also a key factor in the power consumption of computing clusters. Transferring information between GPUs requires a substantial amount of electricity, especially in large AI setups with thousands of chips.

Cerebras’ WSE-3 enhances the efficiency of AI clusters by minimizing data movement. Its high transistor count enables it to handle large AI models that would typically require distribution across multiple graphics cards. Running an entire neural network on a single WSE-3 eliminates the need to transfer data between GPUs located in different parts of a server rack.
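The trade-off described above can be sketched as a simple cost model: each training step pays a compute cost plus a communication cost proportional to the data exchanged between chips. All numbers below are illustrative assumptions for the sketch, not published Cerebras or GPU specifications.

```python
def step_time(compute_s: float, bytes_exchanged: float,
              interconnect_gbps: float) -> float:
    """Toy model: step time = compute time + time to move data between chips."""
    comm_s = bytes_exchanged / (interconnect_gbps * 1e9 / 8)  # Gb/s -> bytes/s
    return compute_s + comm_s

# Hypothetical multi-GPU setup: each step syncs 10 GB of activations/gradients
# over a 400 Gb/s interconnect, on top of 0.5 s of compute.
multi_gpu = step_time(compute_s=0.5, bytes_exchanged=10e9, interconnect_gbps=400)

# On a single wafer-scale chip, the inter-chip exchange term drops to zero
# because the whole model stays on one die.
single_wafer = step_time(compute_s=0.5, bytes_exchanged=0, interconnect_gbps=400)

print(f"multi-GPU step:   {multi_gpu:.2f} s")   # 0.70 s
print(f"single-chip step: {single_wafer:.2f} s")  # 0.50 s
```

The model is deliberately crude (real clusters overlap communication with compute), but it captures why cutting inter-chip traffic improves both step time and, since data movement draws power, energy efficiency.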

Cerebras packages the WSE-3 in a compute appliance called the CS-3, which is approximately the size of a mini fridge. The system integrates a single WSE-3 chip along with cooling equipment, power delivery modules, and other supporting components. Customers can scale up to 2,048 CS-3 units within a single AI cluster.

Cerebras has yet to achieve profitability but is posting significant revenue growth. Between 2022 and 2023, its sales more than tripled, reaching USD 78.7 million. This upward trend carried into 2024, with the company generating USD 136.4 million in just the first six months of the year.

However, these sales have been highly concentrated. The New York Times reports that 87% of Cerebras’ revenue for the first half of the year originated from the UAE-based AI company G42.

In its recent IPO filing, the chipmaker outlined its strategy to sustain revenue growth by acquiring more customers and improving its technology. This initiative will focus on creating “new products and form factors.” The filing also highlighted Cerebras’ recently introduced cloud service for inference workloads as another potential avenue for revenue growth.

The chipmaker intends to list its shares on the Nasdaq using the ticker symbol “CBRS,” with Citigroup and Barclays serving as the lead underwriters.