Cerebras Raises $5.55 Billion in Year’s Biggest IPO

Ed Ludlow · Andrew Feldman · Bloomberg Technology · Thursday, May 14, 2026 · 6 min read

Cerebras chief executive Andrew Feldman used the AI chipmaker’s $5.55 billion IPO to argue that public investors are valuing the company as a fast-inference infrastructure supplier, not merely another semiconductor listing. In a Bloomberg Technology interview before trading began, Feldman said demand is concentrated around speed, claimed Cerebras is about 15 times faster than its nearest competitor, and pointed to large relationships with OpenAI and AWS as evidence of commercial traction, while acknowledging that the AWS agreement is still being finalized.

Cerebras is being valued as a fast-inference infrastructure supplier

Cerebras raised $5.55 billion in the year’s biggest IPO so far, pricing its shares at $185, above the top end of the marketing range. Before trading began, Bloomberg’s terminal indicated the shares could open at $350. Ed Ludlow put the implication to Cerebras directly: the market appeared to be “pricing in” the company as a major player in AI infrastructure. Andrew Feldman’s first response was brief: “Pretty good day, huh?”

Feldman described the listing as among the biggest tech IPOs in history and "the biggest semi IPO in history." He treated it less as a finish line than as a financing and validation event after "a decade of work," saying the company was ready to "get back to work" on the next phase.

That next phase, in Feldman’s account, is inference: serving AI model outputs quickly enough that users do not wait. He said demand for “fast inference” is “extraordinary” because AI has become useful and “everybody wants it to be fast.” Cerebras, he claimed, is “not by a little bit” faster than competitors but “by more than an order of magnitude,” putting the company at 15 times faster than the nearest competitor.

15×
Feldman’s claimed speed advantage over the nearest competitor

The offering’s own demand showed how aggressively investors wanted exposure. Feldman said the IPO was more than 25 times oversubscribed, with far more demand for stock than shares available at both the institutional and retail levels. Asked why Cerebras had not done more for retail investors, he said “nobody got what they wanted” and that hard allocation decisions had to be made. He did not describe a specific retail program or disclose how much stock went to retail buyers, but said Cerebras was “very comfortable” with the final allocation and believed it had acted “with integrity.”

25×+
Feldman’s description of IPO oversubscription

The commercial proof Feldman offered was concentrated in large AI and cloud relationships. In the previous four months, he said, Cerebras had announced a deal with OpenAI worth more than $20 billion for 750 megawatts of compute, as well as a major engagement with AWS under which Cerebras equipment would be deployed in AWS data centers.

He also said there were “dozens” of other relationships in a category that used to count as a large deal: $10 million to $50 million. Ludlow pressed the practical investor question: performance claims and “dollar per token” metrics can look strong “on paper,” but public-market investors still need to know when engagements become recognized revenue.

The AWS agreement has not yet become a completed master deal

The AWS agreement remains at the binding-term-sheet stage. Feldman said Cerebras had signed a binding term sheet with AWS, as disclosed in the company's S-1, and was still working through the master agreement.

With organizations of AWS’s size, Feldman said, “it takes a little time to dot all the I’s and cross the T’s.” He nonetheless described AWS as a potentially “enormous channel” and partner for bringing Cerebras technology to large and medium-sized enterprises globally.

Feldman’s channel argument rests on AWS’s enterprise reach. He described AWS as one of the preferred cloud providers for “just about every enterprise on earth,” and said embedding Cerebras’s solution into AWS’s offering, including as part of Bedrock, would be “a huge win” for the company.

The investment question is therefore not only whether AWS wants access to Cerebras systems, but how quickly a binding term sheet becomes a completed commercial agreement and revenue-producing deployment.

Vertical integration is central to the performance and margin case

Cerebras’s architecture is not being sold as a conventional chip-in-a-box story. Ed Ludlow described the company as “full stack” and “fully vertically integrated,” building the supercomputer “top to tail.” He contrasted that with Nvidia’s role in GPUs and trays, with companies such as Dell or Supermicro assembling systems around them. Ludlow cited Dell margins in the low teens, Nvidia margins in the mid-70s, and Cerebras margins around 40% to 41%, asking why owning the whole system should pay off over time and what margin outlook investors should expect.

Andrew Feldman pointed to two levers. First, scale should improve the cost structure. The company did about half a billion dollars in sales last year, he said, which meant it put about $250 million into the supply chain. “Obviously that’s not an efficient spot,” he said. As Cerebras grows, Feldman expects more leverage with suppliers and lower cost of goods.

Second, he said the company has an opportunity to increase prices because demand for fast inference is “overwhelming this minute.” He did not provide a target gross margin, but said that over the long run Cerebras would be “really proud” of where gross margins settle as the company reaches scale.

The same logic shaped his answer to a terminal client’s question: if Cerebras’s differentiation is custom wafer-scale silicon, why not sell the silicon rather than the whole server?

Feldman argued that the product cannot be separated so cleanly. He said that for the entire 70-year history of the compute industry, prior attempts to build a chip of Cerebras’s size had failed. For a general audience, he put the contrast simply: Cerebras’s chip is “the size of a dinner plate,” while traditional chips are “the size of a postage stamp.” Others, he said, had tried to copy the approach and failed.

The chip alone, in Feldman’s explanation, does not produce the claimed performance. Packaging, system design, power delivery, and I/O are all part of the result. A strong chip can lose performance if an original design manufacturer or system vendor fails to deliver enough power or I/O, he said. He connected that reasoning to Nvidia’s move to control I/O through NVLink: Nvidia, in his account, did not want others “to nibble away” at performance.

Cerebras’s answer is to control the system. Feldman compared the question to asking Porsche why it does not just sell engines. A 911, he said, is a beautiful car because of “the engine and everything else they put in it.”

You don’t just get 15 or 18 or 20 times faster than the competition because you built a good chip.

Andrew Feldman

The IPO proceeds are for capacity

Asked how Cerebras would use the $5.55 billion in IPO proceeds, Andrew Feldman gave the straightforward answer: increase capacity. Ludlow’s question used the rounded figure of $5.5 billion and asked how flexible Cerebras could now be in allocating capacity to new customers. Feldman said the company wants to bring on many new customers and can be “aggressive” because demand for its offering is high.

The public-market thesis Feldman described depends on Cerebras converting that demand into deployed capacity: fast inference as the need, large customers and cloud channels as distribution, vertical integration as the performance claim, and scale as the path to better margins. Ludlow’s pressure points were the execution risks around that thesis: completing the AWS master agreement, moving engagements into revenue, sustaining or improving margins, and allocating scarce capacity after an offering whose demand far exceeded supply.
