Cerebras Systems Inc., an emerging player in artificial intelligence (AI) computing, is moving to challenge Nvidia Corp. with a new chip that, the company claims, will outperform competitors at running AI models and generating rapid responses.
The Silicon Valley-based startup is providing this cutting-edge chip as part of comprehensive computing systems that can be purchased and operated by data center operators. Additionally, Cerebras offers a pay-as-you-go service where the computing power is managed by the company itself. With a confidential plan for an initial public offering (IPO) already in the works, Cerebras is positioning itself to capture a share of the rapidly expanding AI market. Currently, the largest tech companies are collectively investing tens of billions of dollars into AI infrastructure, with Nvidia emerging as the primary beneficiary due to the essential role of its graphics processing units (GPUs) in AI development.
However, Cerebras’ founder and CEO, Andrew Feldman, believes that his company’s technology will disrupt the industry by making AI systems significantly more responsive, likening this advancement to the shift from dial-up to high-speed internet. “Until now, we were in the dial-up era,” Feldman remarked during a product announcement event in San Francisco. “No amount of GPUs can be combined to achieve what we’re delivering.”
Cerebras’ approach is distinguished by its use of massive chips, each crafted from a single silicon wafer. In contrast, even the most powerful conventional processors are far smaller: many chips are typically cut from one wafer. This design gives Cerebras’ chips far more processing area than traditional processors, but it also requires custom-designed computers to house the oversized chips, since standard hardware cannot accommodate them.
One of the critical advantages of Cerebras’ technology lies in its integrated memory. Unlike GPUs and other processors, which must shuttle data to and from separate memory chips over external interfaces, Cerebras’ chips build memory directly into the processor itself, significantly reducing that data-transfer bottleneck and enhancing processing efficiency.
Despite these innovations, Cerebras faces the daunting task of competing against Nvidia, a company with a substantial lead in AI infrastructure. Other challengers, including Intel Corp., have struggled to gain traction in this market. For Cerebras to succeed, it must convincingly demonstrate to the computing industry that it can reliably produce and deploy its technology on a large scale.
Cerebras is also setting up its data centers to offer AI computing as a service and is working to sell its chips to major cloud providers like Microsoft Corp. and Amazon.com Inc. Although the startup has engaged with these tech giants, it has not yet secured them as customers.
When asked about the potential market share Cerebras could capture from Nvidia, Feldman confidently replied, “Enough to make them angry.”