Cerebras Raises $5.55 Billion as AI Infrastructure Demand Lifts Tech Markets
Ed Ludlow, Caroline Hyde, Ryan Vlastelica, Jensen Huang, Tyler Kendall, Michelle Giuda, Börje Ekholm, Andrew Feldman, Tom Hale, Tasos Vossos, Carmen Arroyo
Bailey Lipschultz

Bloomberg Technology · Thursday, May 14, 2026

Cerebras raised $5.55 billion in the year's largest US IPO, while Cisco shares jumped on a raised forecast for hyperscaler orders, putting both a new AI compute listing and an incumbent networking supplier at the center of the market's AI infrastructure trade. Cerebras CEO Andrew Feldman argued that the company's wafer-scale systems, its OpenAI deal, and its AWS engagement show it can become a major compute supplier; Bloomberg reporters pressed the harder question of how much of today's AI infrastructure demand will turn into broad, durable revenue.

Cerebras used the IPO window to price itself as an AI infrastructure contender
Cerebras entered the public market with the largest US IPO of the year so far, pricing at $185 a share and raising $5.55 billion. Its shares were indicated around $350 shortly before trading, after earlier indications as high as $400. Ed Ludlow described the setup as “mayhem” and cautioned that indicated opening prices on the Bloomberg Terminal do not always match where trading begins.
Bloomberg’s Bailey Lipschultz said the order book showed “untapped demand”: Bloomberg had reported the deal was more than 20 times oversubscribed, with $10 billion of demand before launch. Buy-side conversations, he said, were already framing the question relative to Nvidia’s scale: if Nvidia is worth $5 trillion, some investors were asking whether Cerebras could trade closer to $100 billion.
The central investor question was whether Cerebras is merely the next semiconductor listing to benefit from AI enthusiasm, or whether it has credible evidence of becoming a major compute supplier. Ludlow described the company as a full-stack, vertically integrated supercomputer maker: unlike Nvidia, which sells chips and increasingly trays while server partners such as Dell or Super Micro assemble systems, Cerebras sells the broader package.
Andrew Feldman framed the IPO as the result of a decade of work and called it “the biggest semi IPO in history.” Pressed on whether the market was now pricing Cerebras as a major player, Feldman pointed first to a recently announced OpenAI deal “north of $20 billion” for 750 megawatts of compute, and then to what he called a major engagement with AWS, under which Cerebras equipment would be deployed in AWS data centers.
He also said there were “dozens” of other customers or prospective customers in what used to be considered a large-deal range of $10 million to $50 million. His core product claim was performance in fast inference: Cerebras, he said, is “15 times faster than the next nearest competitor.”
Ludlow pressed Feldman on the distinction between announcements and revenue. Feldman said the AWS relationship was governed by a binding term sheet described in the company’s S-1, and that Cerebras was working through the master agreement. He said large-enterprise agreements take time, but argued that AWS could become an “enormous channel” because its cloud offering reaches large and medium-sized enterprises around the world.
The company’s prior concentration risk also remained in the background. Lipschultz said that when Cerebras first tried to go public, “call it two years ago,” the concern was that G42 was effectively its only customer. He said the company had since broadened that base to include OpenAI and others, but the investor question remains how broad the demand can become if the current compute buildout is still in its early innings.
Feldman’s explanation of Cerebras’s technical strategy centered on why it sells systems rather than just silicon. Asked why the company could not simply sell its wafer-scale chip, he said every previous effort over the computer industry’s 70-year history had failed to build a chip of that size. For the audience, he described Cerebras’s chip as “the size of a dinner plate,” compared with traditional chips “the size of a postage stamp.”
The system-level argument was that performance comes from the chip, packaging, power, input/output, and system design together. Feldman said an outside system vendor or ODM could “nibble away” at performance by failing to deliver the right amount of power or I/O, and compared the approach to Porsche selling a complete 911 rather than just engines.
Ludlow also pushed on margins. He contrasted Dell’s low-teens margins and Nvidia’s mid-70s margins with Cerebras at roughly 40% to 41%, and asked why vertical integration would improve the long-term economics. Feldman said Cerebras did about half a billion dollars in sales last year and put roughly $250 million into the supply chain, which he characterized as an inefficient scale point. As the company grows, he said, it expects more leverage in the supply chain, lower cost of goods, and possible pricing power because demand for fast inference is “overwhelming.”
The IPO allocation drew scrutiny from Bloomberg Tech’s audience. Ludlow asked why the company had not done more for retail investors. Feldman said the deal was more than 25 times oversubscribed and that “nobody got what they wanted.” He said Cerebras was comfortable with the final allocation and believed it had acted with integrity.
The use of proceeds, in Feldman’s telling, is capacity. With $5.5 billion raised, he said the company would use the money to increase capacity and bring many new customers on board.
Cisco became the market’s clearest AI infrastructure stock of the day
Cisco’s results gave the market a more established AI infrastructure story to trade. Shares rose about 15%, on track for their biggest jump since 2011; at the open, a 17% gain had put Cisco on track for its best day since 2002. The number driving the reaction was $9 billion: Cisco’s expected orders from hyperscalers in calendar 2026, up from a previous $5 billion target. Ludlow said about $4 billion of that would translate into revenue this year.
Bloomberg equities reporter Ryan Vlastelica said analysts were broadly positive because Cisco now appears to have a place in the AI infrastructure “firmament.” The demand is not for GPUs directly, but for the physical networking layer of data centers: optical systems, cables, switches, routers, and related infrastructure. Vlastelica said investors have increasingly recognized this year that optical and networking companies are a major part of the AI data-center buildout.
Cisco’s position matters partly because it is a legacy company with a broad installed product suite. Vlastelica said the latest numbers gave analysts additional confirmation that Cisco is well positioned as AI data centers continue to expand. Ludlow distilled the point more plainly: networking is “your cables, your switches, your routers,” and those components go into real data centers.
There was also a cost-discipline thread. Ludlow noted Cisco is cutting about 4,000 roles and asked how the company had framed that amid enthusiasm for AI. Vlastelica said Cisco presented the cuts as part of a focus on AI, with investment priorities shifting toward that end market.
For Vlastelica, the historical resonance was unavoidable. Cisco was one of the defining companies of the dot-com infrastructure buildout, then took years to surpass its dot-com era highs. He called it “poetic justice” that AI is now helping bring Cisco back to record levels, making it part of the current technology buildout much as it was part of the internet buildout.
Klarna also traded sharply higher after reporting first-quarter net income of $1 million, compared with a $99 million loss a year earlier, and revenue up 44% to $1 billion. Ludlow attributed the revenue growth to higher interest income, debit-card signups, and partnership fees. But in the day’s market narrative, Cisco and Cerebras were the more consequential AI infrastructure signals: one an incumbent networking supplier being rerated, the other a new public-market test of demand for specialized AI compute.
The Trump-Xi meeting stabilized some language but did not resolve the Taiwan tension
The US-China discussion turned on a contrast between diplomatic signaling and material policy change. Nvidia CEO Jensen Huang appeared in Beijing and called the meeting between President Trump and Xi Jinping “one of the most important summits in human history.” Asked whether he had obtained specific customers for Nvidia’s H200 chips, Huang said the trip was not about that: he was there “to support the President and to represent the United States.”
The more sensitive issue was Taiwan. Bloomberg’s Tyler Kendall said Chinese state media described Taiwan as a “highly dangerous situation” that could lead to “clashes” between the US and China. Kendall characterized the overarching aim of the Beijing meetings as stabilizing US-China ties rather than overhauling them. He also noted a mismatch between readouts: Taiwan appeared prominently in the Chinese account but was not mentioned in the US readout, and Secretary of State Marco Rubio appeared to downplay the extent of the concern.
Michelle Giuda, CEO of the Krach Institute for Tech Diplomacy at Purdue and a former Assistant Secretary of State for Global Public Affairs in the first Trump administration, said there was nothing new in China’s position on Taiwan. Beijing has long treated Taiwan as central to US-China relations, she said, and Xi has been pressuring the US on arms sales to Taiwan for some time. She also said the US position had not changed: maintaining the status quo.
Ludlow brought the Taiwan issue back to semiconductors, noting the US dependence on Taiwan for semiconductor supply and specifically TSMC’s manufacturing footprint. Giuda said US-Taiwan ties remain strong and “run deep,” with semiconductors as a critical piece. She also pointed to TSMC’s buildout in the United States as part of the broader collaboration.
The meeting also produced claims around Iran. Ludlow cited a Fox News interview clip reviewed by Bloomberg in which Trump said Xi had offered help on Iran and pledged not to send weapons. Kendall said the US side described Xi as saying China would not supply Iran with weapons and would help bring Iran to the negotiating table, potentially affecting discussions around the Strait of Hormuz. But Kendall emphasized that the Chinese readout did not indicate China was willing to go further in helping ease the conflict, beyond saying the leaders had extensive dialogue about the Middle East.
Giuda called it unsurprising that Iran was discussed, given that it was “looming over” the 36-hour summit. She described any step by Xi to stop supporting the Iranian regime as positive from the US perspective, while placing the issue inside a larger strategic competition: the United States seeking what she called a “golden age of America,” and China pursuing “China rejuvenation.” The question, in her framing, was whether the summit helped the United States move incrementally faster toward its vision.
The presence of major US technology CEOs — including Elon Musk, Tim Cook, and Huang — was itself a signal. Kendall said Xi told executives that “China’s door to the outside world will only open wider,” and that Chinese state media reported the executives highly value the Chinese market. But he said there were not yet tangible steps showing that increased openness. Some members of the delegation already have substantial China exposure, while companies such as Micron have also faced Chinese restrictions over cybersecurity concerns.
Asked what it takes to assemble a delegation of this kind, Giuda said the preparation behind such summits is normally extensive, especially around agendas and readouts. But she called this specific delegation unprecedented. In her view, Trump’s nontraditional style and the speed of tech executives left more room for improvisation: “a lot of planning,” but also “a lot of room for ad libbing.”
Ericsson’s China argument was interdependence, not decoupling
At the Spark Global Leadership Summit in Napa Valley, Ericsson CEO Börje Ekholm described China as both a critical market and a benchmark competitor. Asked by Caroline Hyde about US-China relations and Ericsson’s China exposure, Ekholm traced the Chinese telecom sector from a state-prioritized industry with nearly 100 vendors three decades ago to a market consolidated around two formidable competitors, with Huawei the largest.
Huawei, he said, began as low-cost and relatively simple, but is now a “phenomenal competitor.” Ericsson benchmarks itself against Huawei and believes it must beat China on both technology and cost position. For the rest of the world to compete with China, Ekholm said, it must lead on technology.
Hyde pressed him on bans and national industrial policy, noting that Sweden had pushed out Chinese competition and that Ekholm had previously argued against that kind of approach. His answer was not that China should be treated as a normal market, but that telecom cannot be understood without scale and technological learning curves.
China’s domestic scale matters, he said, because telecom is a scale business. But the deeper point was product development: China’s economic development and use cases led data consumption to grow faster there than elsewhere, which meant Chinese carriers needed certain network technologies earlier. He cited massive MIMO — a radio technology for increasing network capacity — as an example. If a supplier is not present when those needs emerge, it does not develop the technology at the same pace.
Ekholm said Ericsson made a strategic decision years ago to define three “home markets” outside China: the US, India, and Japan. The US was large and a front-runner market; India was large and is now becoming a technology front-runner; Japan is large in telecom and an early adopter. Winning in those markets, he said, helps Ericsson combat China’s scale.
Supply chain flexibility was the other response to geopolitics. Ericsson built a US factory, commissioned around 2020, and Ekholm said a large portion of what Ericsson supplies to the US is manufactured in the US. That footprint, together with R&D, helped the company manage supply constraints, tariffs, and geopolitical exposure.
On AI, Ekholm’s thesis was that telecom becomes more important as AI moves from data centers into the physical world. He called it industrial AI or physical AI: inference at the edge of the network will be needed to meet latency requirements, and everything will have to be connected. Since everything cannot be connected by wires, he said, terrestrial cellular networks will become the backbone for scaling AI into the physical economy. For Ericsson, that means new kinds of traffic on the network.
Oura wants to move from wellness signals toward medically validated prediction
Oura CEO Tom Hale said the ring maker has used AI for years, but the most accurate predictions today remain short-term: a user may be getting sick in a couple of days, or a cycle may be coming in a couple of days. The next step, in his telling, is longer-range prediction that helps people understand how behavior can change future health outcomes and potentially the cost of healthcare.
Hale framed AI as a way to address scarcity in healthcare. There is “not enough care to be provided for everybody,” he said, and AI can lower cost, broaden access, and provide medical information. But he put boundaries around that claim: the information must be high-quality, scientifically validated, accurate, and good.
Oura currently operates as a general wellness product, a distinction Hyde raised in connection with possible FDA-regulated features such as blood pressure. Hale said Oura wants to be aligned with the FDA and on a path of scientific and medical validation. He said the company is working on multiple submissions, including blood pressure.
The company is running what Hale called a blood pressure profile study involving almost 300,000 people taking cuff measurements and comparing them with ring predictions. The purpose, he said, is to support the claim that Oura’s readings are accurate, validated, and clinically relevant. Until then, he said, the device will likely remain in wellness mode: it can provide insight, but not diagnosis.
Asked whether the lack of a confirmed FDA head affects Oura’s process, Hale said not really. He said the company is working with CDRH, the FDA division focused on medical devices, and described the collaboration with the agency’s rank-and-file staff as strong.
Hyde also asked about Apple, both as collaborator and competitor. Hale said nearly two-thirds of Oura ring wearers also have a second wearable, most often a wrist wearable and most often an Apple Watch, making the products highly complementary. He acknowledged Oura has hired significant Apple talent, especially in health and hardware, and said the company also has long-standing ties to Apple’s software ecosystem.
On the broader software-versus-hardware question, Hale argued that Oura's hardware base has become more attractive in the AI era. A year earlier, he said, people questioned why the company would take on hardware at all. Now, he said, investors see the hardware business as a source of resilience against the AI disruption hitting software. Hale's line was blunt: "you can't vibe code atoms." Hardware cannot simply be summoned into existence, which he said has made Oura's combination of hardware and software advantageous.
Supply-chain pressure, tariffs, and inflation were present but not decisive in Hale’s account. He said Oura had not seen inflation materially affect demand, though there had been some supply-chain inflation. The company had announced an intention to build a US factory, which he described as underway, and because Oura manufactures around the globe, it had been able to manage tariff issues effectively.
Big tech is moving the AI funding race into global debt markets
Alphabet’s AI spending is pushing it beyond the US bond market. After a $17 billion domestic sale, the Google parent is tapping the Japanese yen market for the first time. Bloomberg’s Tasos Vossos said the move reflects the scale of tech-sector borrowing: among US non-financial corporates this year, 40 cents of every dollar raised came from the tech sector.
Even the US credit market, the deepest and largest in the world, can become crowded when hyperscalers fund large AI capital-expenditure programs. Vossos said Alphabet and Amazon have therefore been diversifying into every major market they can access: euros, pound sterling, Swiss francs, and now yen.
The key difference from prior waves of large corporate issuance, he said, is repetition. In the past, companies might issue tens of billions of dollars in a single day to fund an acquisition; once the M&A was financed, the borrowing event passed. AI capex is different because spending is expected to run into the hundreds of billions of dollars over the next few years. Investors may want to participate in the debt deals, but they also know another large deal may be close behind, potentially weakening the previous one. Vossos described that pattern — massive deal after massive deal — as something not previously seen in the history of the credit market.
The same financing and hardware pressure showed up in shorter updates. Hon Hai, also known as Foxconn, reported a 19% jump in quarterly profit and said the US will gradually become its largest AI server production hub. Ludlow said the company has become a key AI hardware player by assembling servers that house Nvidia accelerators. Adtek, an optical connectivity specialist, was said by sources to be exploring a Hong Kong IPO that could raise at least $500 million at a $3 billion to $4 billion valuation.
Musk’s Grok push on Wall Street is still more trial than adoption
Bloomberg’s Carmen Arroyo reported that xAI is trying to expand Grok’s corporate client roster by leaning on Wall Street firms and investors with existing ties to Elon Musk’s companies, while SpaceX, now parent of xAI, prepares for an IPO. The firms testing Grok include Apollo, Morgan Stanley, and Valor Equity. Apollo helped finance xAI’s access to chips, Morgan Stanley has long worked with Musk, and Valor has backed Musk companies for years.
Arroyo said the strategy has been driven by John Sholkin, who had been chief revenue officer at xAI and is stepping back into an advisory role. The goal is to get Grok used alongside other AI models at corporate clients, but she said adoption is moving slowly because OpenAI and Anthropic are fighting for the same enterprise market.
Ludlow asked whether the firms were using Grok for on-desk work. Arroyo said xAI is trying to get there, but Grok has been falling behind on coding and financial implementation. In response, the company has been moving more staff internally to train Grok for financial modeling and to strengthen the sales effort. Her conclusion was that Grok is not yet ready for that Wall Street use case, even as xAI works to prepare it.