Compute Supply, Power, and Capital Are Defining the AI Buildout
Caroline Hyde
Rene Haas, Hannah Miller, Julia Fanzeres, John Serafini, Seth Fiegerman, Dina Bass, Niccolo de Masi, Sylvia Jablonski, Chris Britt
Bloomberg Technology · Thursday, May 7, 2026

Arm's warning on smartphone weakness sat alongside a stronger claim from chief executive Rene Haas: handset softness is concentrated in lower-end devices, while data-center demand is accelerating because agentic AI workloads need CPU orchestration. Bloomberg Technology's May 7 program used that contrast to trace a broader AI-infrastructure market in which demand is less in question than the ability to secure compute capacity, power, supply chains, and capital. Anthropic's lease of SpaceX compute and CoreWeave's financing questions pointed to the same constraint: available infrastructure, not appetite for AI, is becoming the limiting factor.

The AI-infrastructure story is the through line, not every earnings print
Bloomberg's market frame was broad: the Nasdaq 100 was trading at a record, on pace for a sixth straight weekly gain, while individual earnings reactions were uneven. The program also covered Warner Bros. Discovery, Block, DoorDash, Datadog, Chime, CoreWeave, IonQ, Arm, and HawkEye 360. The strongest common thread was narrower: AI demand is still lifting technology markets, but the constraints are shifting toward compute capacity, power, supply chains, capital, and whether companies can turn technical claims into commercial revenue.
Arm was the clearest example of that tension. Its shares were down about 8% intraday after the company warned of sluggishness in smartphones, even as its chief executive described data-center demand as accelerating.
Rene Haas said Arm had “definitely seen a slowdown” in smartphones, but argued the company is less exposed than others because its smartphone economics are weighted toward premium devices. The weakness, in his account, is concentrated in the lower end of the market, where royalty contribution to Arm is smaller. Premium devices tend to use Arm’s Version 9 architecture and carry richer royalty rates.
That distinction matters because Haas did not present the quarter as being defined by handset softness. He said quarterly revenue was around $1.5 billion, a figure he described as not long ago having been close to an annual revenue number for the company. On-screen data showed fourth-quarter total revenue of $1.49 billion, up 20% year over year; licensing revenue of $819 million, up 29%; and royalty revenue of $671 million, up 11%.
| Metric | Fourth-quarter result | Year-over-year change |
|---|---|---|
| Total revenue | $1.49 billion | +20% |
| Licensing revenue | $819 million | +29% |
| Royalty revenue | $671 million | +11% |
The main growth claim Haas made was about data centers. Arm’s data-center business doubled year over year, he said, and demand for the company’s new Arm AGI CPU has accelerated sharply. His explanation was not simply that AI requires more chips. He framed the CPU as increasingly central to the way agentic AI workloads run inside data centers.
“Agentic workloads,” as Haas described them, mean agents placing queries on data centers and needing answers back quickly. The work of agent management, orchestration, and scheduling, he said, is CPU work rather than GPU work. His claim was categorical: “Only a CPU” can manage those tasks, not an accelerated GPU. That is the basis for Arm’s view that CPU demand is rising alongside, not instead of, accelerator demand.
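The division of labor Haas describes can be sketched in a few lines. This is an illustrative toy, not Arm's or anyone's actual stack: `accelerate` is a stand-in for GPU-bound model inference, while the queueing, dispatch, and collection around it represent the CPU-side agent management and scheduling Haas says only a CPU handles.

```python
import queue
import threading

def accelerate(prompt):
    # Placeholder for the GPU-bound step: running model inference.
    return f"answer({prompt})"

def orchestrate(requests):
    """CPU-side agent management: queue requests, dispatch, collect answers."""
    results = {}
    work = queue.Queue()
    for rid, prompt in requests:
        work.put((rid, prompt))          # scheduling and bookkeeping: CPU work

    def worker():
        while True:
            try:
                rid, prompt = work.get_nowait()
            except queue.Empty:
                return
            results[rid] = accelerate(prompt)  # the offloaded accelerator step

    threads = [threading.Thread(target=worker) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

print(orchestrate([(1, "query-a"), (2, "query-b")]))
```

In this framing, the accelerator only ever sees individual inference calls; everything that makes the system "agentic" — routing queries in, matching answers back to agents — lives in the orchestration loop.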
Arm’s move into selling its own chip marks a shift from the company’s historical model. For 35 years, Haas said, Arm delivered product as IP: blueprints customers use to build chips based on Arm technology. Its data-center customers and partners already include Amazon’s Graviton, Google’s Axion, Microsoft’s Cobalt, and Nvidia’s Vera. With the Arm AGI CPU, Haas said demand now extends beyond IP licensing to the physical CPU product itself.
The scale changed quickly. Haas said Arm had visibility to about $1 billion of orders in its forecast when it discussed the launch, and that figure doubled over the prior five weeks to $2 billion. In his words, “Demand is certainly not a problem.”
The disclosed partners at launch included Meta, OpenAI, Cerebras, SK Telecom, Rebellions, SAP, and F5 Networks. Haas also emphasized the systems layer: Arm worked with Supermicro, Lenovo, and ASRock on racks customers can order. The AGI CPU, he said, can deliver twice the performance at the same power as a comparable x86 rack, with 36-kilowatt air-cooled racks as the reference design. On-screen notes also said the chip draws 300 watts, that Meta would be the first major consumer, and that Taiwan Semi would produce the chips.
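As a back-of-the-envelope check on those on-screen figures, the 36-kilowatt rack budget and the 300-watt per-chip draw imply an upper bound on CPU density. This assumes, unrealistically, that every watt goes to CPUs; memory, networking, and cooling would reduce the real count.

```python
# Upper-bound CPU count per rack from the reported figures.
rack_power_w = 36_000   # 36-kilowatt air-cooled reference rack
chip_power_w = 300      # reported per-chip draw

max_cpus_per_rack = rack_power_w // chip_power_w
print(max_cpus_per_rack)  # 120
```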
Supply is the constraint Arm is now managing. Haas said Arm had supply for the first $1 billion of orders and is working through the additional $1 billion of demand with TSMC and memory suppliers SK Hynix, Micron, and Samsung. But he rejected the idea that unmet demand would evaporate if capacity arrived late. This demand, he said, is “not perishable”; the need for compute does not disappear because a short-term window closes.
The long-range target attached to that thesis is much larger. Haas said Arm has discussed a $15 billion target by fiscal year 2031 (calendar 2030) and said the company is confident it is on track to reach that figure "in a very, very short time." He described the opportunity as transformational relative to Arm's current revenue base.
Haas also addressed his expanded role inside SoftBank, where he is taking on leadership of SoftBank International alongside his role as Arm CEO. He described the assignment as orchestration across a broader chip and data-center ecosystem, including SoftBank’s announced work on a 10-gigawatt data-center facility in Portsmouth, Ohio with the U.S. Department of Energy and SoftBank Energy, as well as portfolio companies such as Ampere and Graphcore. His view was that the role may look like “two jobs,” but that coordination across those assets could benefit both Arm and SoftBank’s broader semiconductor ambitions.
Anthropic is leasing compute from a rival because the capacity is available now
The same AI compute shortage that supports Arm’s data-center story is pushing frontier-model companies into unusual partnerships. Seth Fiegerman described Anthropic’s agreement with Elon Musk’s SpaceX as an “odd bedfellow” arrangement: Musk has criticized Anthropic, and Anthropic competes with xAI to build powerful AI models. Yet both sides have reasons to transact.
Anthropic needs compute immediately. Fiegerman said the company has repeatedly stated that demand has surged over the last few months. Unlike the multi-gigawatt facilities planned years into the future, SpaceX's Colossus 1 data center is already operational. That availability is the key differentiator.
On-screen notes said the partnership gives Anthropic access to SpaceX's data-center infrastructure, secures over 300 megawatts of compute power, and enables Anthropic to raise usage limits for its AI products. Fiegerman described the deal as leasing about 300 megawatts — not the scale of some five-gigawatt transactions being discussed elsewhere, but meaningful because it is available now.
SpaceX and xAI also gain revenue. Fiegerman said SpaceX is planning to go public imminently, while xAI’s Grok is not getting as much business as Musk’s venture might want. Leasing unused capacity to Anthropic supplements revenue even though Anthropic remains a competitor.
The relationship also contains a speculative next step. Caroline Hyde raised the possibility of orbital data centers, and Fiegerman said Anthropic is saying it will work with SpaceX, or “SpaceX AI,” on multi-gigawatt orbital data-center capacity. He was careful to add that this does not exist today, though Musk has said it is a priority.
The rivalry and personal politics remain part of the deal. Fiegerman said there is an “enemy of my enemy” element, given Musk’s conflict with OpenAI. He also said Musk visited senior staff at Anthropic the prior week and decided “they’re not evil,” after having previously suggested Anthropic was evil or “misanthropic.” Musk’s support, as Fiegerman described it, is conditional: he has indicated that if his view changes, he can cut or alter the deal.
Tech layoffs are rising, but the labor signal is still narrow
AI's effect on employment was treated less as a simple replacement story than as a capital-allocation story. On-screen data put planned tech-sector job cuts so far in 2026 at 85,411, up 33% from the same period in 2025 and a three-year high, even as overall private-sector layoff announcements declined. The data cited came from Challenger, Gray & Christmas.
Julia Fanzeres said Challenger's data show tech leading layoff announcements, with headlines from companies including Microsoft, Meta, and Snap. For the second month in a row, she said, artificial intelligence was the stated reason. But she drew a distinction between AI directly replacing workers and companies cutting jobs to fund AI spending. In the current pattern, she said, necessary AI spending is "taking over," and companies need to free up money to fund it.
Hyde sharpened that distinction: the anxiety may not be only that AI takes a worker’s job directly, but that the salary line is removed and redeployed into AI investment. At the same time, she noted that since Challenger began tracking layoffs attributed to AI in 2023, such cuts have amounted to only about 3.5% of all announced layoffs.
Fanzeres said the figure remains small in the broader macroeconomic picture. Initial jobless claims were still near decade lows despite a recent rebound, and continuing claims were near two-year lows. That suggests either that announced cuts have not translated significantly into the broader labor data, or that they are not large enough to move the macro needle.
The unresolved question is whether tech layoffs are a leading indicator or a narrower sector adjustment. Fanzeres said it is difficult to separate genuine AI-driven restructuring from “AI washing.” But she argued the label does not erase the economic fact: jobs are being cut. The broader question is whether the economy remains in a “low hire, low fire” environment. Even if companies are using AI language opportunistically, removed jobs still matter.
Quantum companies are being asked to move from science project to revenue
IonQ's stock was down nearly 6% intraday despite first-quarter revenue above expectations, a higher-than-expected second-quarter outlook, and raised full-year guidance. Niccolo de Masi framed the move against a much longer market run: IonQ was up roughly 70% from the same time the prior year. His focus, he said, is running the business and proving that quantum is at a strategic and financial inflection point.
De Masi said IonQ raised full-year guidance to more than double last year’s revenue at the high end, after tripling revenue the prior year. He described IonQ as the first seven-, eight-, and then nine-figure revenue quantum company in history, and said his goal is to be first to ten figures of revenue.
The technical claim centered on fault-tolerant quantum computing. Hyde noted IonQ's long-term goal of reaching 80,000 logical qubits and a nearer milestone of 256. De Masi said IonQ had launched what he called the world's first "shovel-ready blueprint" for fault-tolerant computing the prior week, named the "walking cat architecture." The 256-logical-qubit milestone, he said, marks the company's movement toward that architecture.
He described the design as modular and scalable, using “all-to-all communication” through Bell quantum teleportation within the chip. That modularity, he said, allows IonQ to build more powerful individual computers and data-center-sized arrays. The commercial strategy is broader than computing alone: de Masi said IonQ is building an ecosystem across computing, networking, sensing, and security. About a third of revenue is international, he said, and about a third of customers are taking more than one product.
Sylvia Jablonski made the corresponding investor argument. Referring to IonQ, she cited 755% revenue growth and said companies in disruptive themes are beginning to move from “science projects” to commercial businesses. That transition, in her view, is what retail investors needed to see.
The same framework applies across the AI supply chain. Jablonski said investor appetite is moving toward the “picks and shovels” of AI: memory, photonics, infrastructure, power, and related buildout. She mentioned Corning, Lumentum, and Coherent in photonics, as well as nuclear energy and other power sources required to support data movement and compute growth. Her view was that these picks-and-shovels businesses have a long runway because the buildout is still early.
She also described a more discriminating market for software. After the “ChatGPT moment,” Jablonski said, every company in the S&P 500 tied itself to AI, and many stocks benefited simply from using the term. The market has since concluded that “not everybody is AI.” Datadog, up about 30% intraday after raising guidance and showing strong growth, was presented as an example of software that appears to be earning renewed investor confidence because its AI positioning is tied to actual infrastructure and product use.
Geopolitics complicates that long-term trade without displacing it. Hyde asked how investors should think about the Strait of Hormuz and other geopolitical risks that can affect helium and chips. Jablonski said investors in disruptive themes need a buy-and-hold posture because they will not be able to predict geopolitics. But she also warned that if the market turns, higher-growth sectors may be hit.
HawkEye’s IPO puts defense technology into the public-market version of the AI trade
Defense technology was presented as adjacent to the AI infrastructure theme rather than separate from it. Jablonski said appetite is moving into “cousins of AI” beyond straight chips, including modern warfare, drones, unmanned systems, and government-funded AI for defense. Palantir was cited as an example of the kind of AI capability that could benefit from government spending.
HawkEye 360’s IPO gave that theme a public-market test. The satellite surveillance firm priced at the top end of its range, raised $416 million, and had shares indicated to open at $30 to $32 versus a $26 IPO price. The offering was described as 25 times oversubscribed.
| IPO or business fact | Detail |
|---|---|
| IPO proceeds raised | $416 million |
| IPO price | $26 |
| Indicated opening range | $30 to $32 |
| Oversubscription | 25 times |
| Fiscal 2025 revenue | $117.7 million, up from $67.6 million a year earlier |
| Current constellation | More than 30 satellites |
John Serafini positioned HawkEye as a durable “all-weather” company. The value proposition, he said, is providing signals intelligence to the warfighter during geopolitical volatility, while also operating in a way that creates value across environments and business conditions.
HawkEye operates more than 30 satellites. Serafini said that constellation gives the company a strong revisit rate over anywhere on Earth and low data latency, allowing it to deliver information to customers quickly. He described use cases in battlefield settings and the South China Sea, including detecting and tracking “dark vessels.” The core capability is geolocating, analyzing, processing, and delivering RF data to warfighters on a timely basis.
On-screen fast facts said IPO proceeds would go to debt and acquisition payments. Serafini said U.S. government work represents about 75% of the business today, while international business has at times been as much as 50% of HawkEye’s heritage capability. He also signaled more acquisition activity, saying the company combines organic growth of more than 70% in 2025 and adjusted EBITDA profitability of more than 20% with inorganic opportunities.
Asked why HawkEye did not stay private, Serafini argued that the IPO process provides validation for a company supplying intelligence to warfighters and adds resources to the balance sheet. In his framing, the public listing is not only a financing event but also a credentialing event for government and allied customers.
Chime says AI is already changing how its software is built
Chime made the program's most direct claim about AI changing how software is produced internally. Shares were down nearly 8% despite earnings that beat expectations and a raised full-year revenue outlook. Bloomberg Intelligence highlighted Chime's first-quarter outperformance, product traction in MyPay and Chime Card, and a 2026 EBITDA outlook up 9% at the midpoint.
Chris Britt said Chime added 700,000 new active members in the quarter, reaching 10.2 million, a company high. He cited third-party data from J.D. Power showing Chime opening more checking accounts than any bank in America, almost 50% more than the number two player. He said those dynamics translated into 25% year-over-year top-line growth, an 18% adjusted EBITDA margin, and GAAP profitability for the first time.
Chime’s business has seasonality. Britt said the first quarter typically outperforms because tax refunds drive outsized spending and deposits into Chime accounts, while the second quarter naturally falls off. He argued investors should focus on year-over-year results and said the company raised full-year revenue and adjusted EBITDA guidance because of its product portfolio.
Inside the company, Britt said Chime has embedded AI into its workflow through an internal “software factory” called Archimedes. Employees can create agents that work simultaneously to build products from conception to rollout. He said more than 80% of the code Chime shipped last quarter was done with AI, making it a major accelerant to launch velocity.
For customers, Chime has an AI copilot called “J.” Britt said the goal is to move banking away from backward-looking reports on spending toward proactive behavioral guidance: paying off high-interest debt first, opening savings accounts, investing regularly, and avoiding missed bills.
Hyde pressed Britt on employee anxiety in a week when fintech peers including PayPal and Coinbase were discussing AI-heavy operating models. Britt said Chime already operates efficiently, with more than $1.5 million of gross profit per employee, and described the company as right-sized. Employees, he said, are excited about AI because it allows technology companies in established industries to create better products faster.
Britt also said Chime is not seeing broad stress in its own consumer data. Member spending was up double digits year over year in categories including entertainment, streaming services such as Netflix, home delivery such as DoorDash and Instacart, and big-box purchases. Balances were higher, partly because of tax refunds but also beyond that. Chime has not seen an uptick in unemployment benefits, Britt said, and because it is the primary account for millions of members, he believes it would see labor-market pressure if it were hitting its customer base. His view was that everyday consumers remain close to full employment, while more of the “gloom and doom” is concentrated around white-collar work that he said is starting to be replaced by AI.
CoreWeave’s demand question is giving way to a financing question
CoreWeave entered its earnings report with shares up about 79% year to date, supported by AI demand for compute capacity. Dina Bass said investors want continued demand signals, which remain strong across the market. Large cloud providers including Google, AWS, and Microsoft had all indicated that AI capacity demand remains extremely strong. Anthropic’s agreement to access capacity at a SpaceX data center reinforced the same point: AI companies are looking for available capacity wherever they can find it. CoreWeave itself signed deals in April with Anthropic, Meta, and Jane Street.
The issue is not whether customers want capacity. Bass said the question is how CoreWeave pays to expand it. Meeting demand means acquiring, building, and developing more AI cloud capacity, and that is expensive. The process has been “a little bumpy” for CoreWeave and others in the industry.
Customer concentration is another investor concern CoreWeave has tried to address. Early on, Bass said, investors worried that a large percentage of revenue was tied to Microsoft. Later, reports that OpenAI might not sell as much as hoped affected CoreWeave and other capacity providers. CoreWeave responded by emphasizing that it has other customers. Deals with Meta, Anthropic, Jane Street, and others are part of that diversification message.
CoreWeave is also trying to sell beyond raw compute. Bass said part of its Nvidia relationship involves having Nvidia market other services CoreWeave can sell in addition to compute power. That matters because the company’s valuation depends not only on AI demand but on whether it can finance capacity growth, diversify customers, and move up the software-and-services stack.