Uranium Enrichment Is the Missing Link in AI’s Power Supply
In a Stanford CS153 Frontier Systems lecture, General Matter chief executive Scott Nolan argues that AI’s infrastructure constraint is moving upstream from chips and data centers to electricity. For high-uptime, low-carbon data-center power, Nolan says the long-term answer points toward nuclear, but the decisive U.S. bottleneck is not reactors themselves; it is uranium enrichment, a capability he says the country has largely lost and that General Matter was founded to rebuild.

The AI bottleneck moves upstream from chips to electricity
Frontier AI is often described through the model pipeline: data, compute, algorithms, pre-training, mid-training, post-training, agents, and then another pass around the loop. Anjney Midha argued that this pipeline is only the center of the factory, not the whole factory. Compute sits inside data centers, and data centers require electricity. If power is not available where the data center is ready to run, the compute bottleneck has merely moved upstream.
Midha framed the shift as a systems problem rather than a model-lab problem. In the course’s “frontier AI factory” schematic, the AI pipeline sat in the middle of a stylized campus, while a “mobile data center” supplied compute and “clean nuclear energy” supplied electricity. The drawing was deliberately idealized, Midha said; real data centers and power generation are not usually co-located. But the visual point was explicit: model capability depends on systems outside the model lab.
ChatGPT’s late-2022 breakout, in Midha’s account, made language models legible to everyday users, while the supporting supply chain was not ready for the resulting demand. Chips take time to tape out; data centers take time to stand up. In early 2023, he said, there was a major compute crunch and, for a short window, an energy crunch. The deeper concern was that a consumer application would eventually be followed by an enterprise application that made demand durable.
Midha pointed to Claude 4.6 as that later “Groundhog Day” moment. People returned from winter break, he said, and began using it at work in ways that made enterprises ask for more. The implication was not simply that model capability had advanced; it was that the rest of the infrastructure stack had to absorb a new class of usage.
Scott Nolan made the energy version of that argument more directly, starting with the people running or supplying frontier AI systems. Sam Altman was quoted on screen from Senate testimony: “Eventually the cost of intelligence, the cost of AI, will converge to the cost of energy. And...the abundance of it will be limited by the abundance of energy.” Jensen Huang, in a quoted exchange with Joe Rogan, called energy “the bottleneck.” Elon Musk was quoted saying, “The limiting factor for the deployment of AI is essentially electrical energy. We're very soon going to be producing more chips than we can turn on.”
As Nolan framed it, chips do not cease to matter. But as chips and models become cheaper, electricity remains the fundamental input consumed by inference and training. He connected that to a broader view, attributed to Balaji Srinivasan, that monetary costs should be denominated in joules. The frontier AI cost structure, on this view, increasingly collapses toward energy.
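As a toy illustration of that convergence (the numbers here are assumed, not from the lecture): the electricity consumed by a single accelerator puts a floor under compute pricing, and that floor falls only as fast as energy prices do.

```python
# Toy illustration (not from the lecture): the energy-cost floor of running
# one accelerator, with assumed numbers for power draw and electricity price.

def energy_cost_per_hour(power_watts: float, price_per_kwh: float) -> float:
    """Electricity cost of running a device for one hour."""
    return power_watts / 1000.0 * price_per_kwh

# Assumptions: a ~700 W accelerator and industrial power at $0.08/kWh.
gpu_watts = 700.0
price = 0.08  # USD per kWh

floor = energy_cost_per_hour(gpu_watts, price)
print(f"Energy floor: ${floor:.3f} per accelerator-hour")  # ~$0.056
print(f"Per year of continuous use: ${floor * 8760:,.0f}")  # ~$490
# As hardware and model prices fall, this energy term shrinks much more
# slowly, which is the sense in which AI cost "collapses toward energy."
```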
The argument has moved beyond infrastructure specialists. Nolan cited a Financial Times headline saying that Big Tech is spending billions on U.S. data centers for AI, but those plans face a problem: access to power. He treated that as a sign that the upstream constraint is now visible in mainstream business coverage, not just among AI operators.
The demand problem, in Nolan’s view, is not a routine infrastructure adjustment. An electricity-usage projection chart, attributed to the IEA via Kamiya and Coroama, showed multiple forecasts bending sharply upward after 2025 toward 2030. A second chart, attributed to Leopold Aschenbrenner’s “Situational Awareness,” compared American and Chinese electricity generation from 1985 through 2030. The U.S. line was relatively flat over recent decades; China’s rose steeply. AI demand appeared as a new wedge against that history.
Nolan’s interpretation was blunt: the United States has been close to a standstill on grid expansion, and AI requires something closer to a near-vertical ramp. The country cannot rely on the same pace of infrastructure buildout that prevailed over the last 20 years.
“We have to go from almost a complete standstill on grid expansion to nearly vertical,” he said.
The resulting chain began with the simplest statement on screen: “Electricity is the bottleneck to AI.” Nolan treated that as the first step in a longer chain, not the conclusion.
Stranded power bought time, but it no longer solves the scale problem
Before the latest surge in AI-related demand, stranded energy could satisfy a meaningful portion of compute-adjacent demand, according to Nolan. He defined stranded energy as power supply without nearby demand: a hydroelectric dam in a rural area, isolated geothermal, wind in West Texas, or other resources where generation exists but local consumption and transmission do not.
That market was first exploited by Bitcoin miners. Mining did not require large amounts of fiber connectivity or conventional data-center siting. A mining site could tolerate remote locations and limited connectivity. Nolan cited Crusoe as an example of a company that began with that logic and later moved toward larger AI infrastructure projects, including work associated with the Stargate project in West Texas.
Midha lingered on Bitcoin mining because he thought the cultural narrative around crypto obscures the infrastructure lessons. He argued that Bitcoin mining was a “dress rehearsal for AI.” Whatever one thinks about crypto’s promises, the industry forced builders to learn how to site compute near cheap or otherwise unusable power, build modular infrastructure, and exploit energy resources that conventional users could not absorb. He warned against dismissing infrastructure progress because it was financed by, associated with, or first deployed for crypto.
Nolan sharpened that distinction by describing stranded-electricity utilization as a “primitive” rather than merely a pivot. A primitive is a useful building block even if the first application changes. In Crusoe’s case, he said, stranded gas that would otherwise be flared could be run through a turbine to generate electricity. Even if the value of mined Bitcoin were discounted to zero, the company believed it was reducing emissions and building valuable infrastructure capability. From there, the same company could move into enterprise cloud deployments, larger projects, and eventually AI compute.
That pattern resembled, for Nolan, the way SpaceX reduced the commercialization of space to a more fundamental variable: launch capacity, denominated as dollars per kilogram to orbit. If launch became cheaper, more downstream activity could exist. For General Matter, he said, the analogous primitive is uranium enrichment.
The important limit, in Nolan’s account, is that the stranded-power phase is largely exhausted. He said most of the great stranded resources without nearby demand have already been claimed. Even where stranded capacity remains, the chunks are too small to satisfy today’s projected AI demand. The question has shifted from finding unused power to creating “massive net new power production.”
That shift changes the design problem. Data centers do not merely need cheap electricity; they need extremely high uptime. Nolan’s uptime slide listed data-center tiers from Tier 1 at 99.671% uptime to Tier 4 at 99.995%. Solar or wind could power data centers in principle, he said, but hitting those uptime levels with today’s grid-scale batteries would make costs very high. As a result, operators have moved toward natural gas turbines in recent years.
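To make those uptime figures concrete, here is a quick conversion (a sketch, not from the lecture) of the tier percentages into allowed downtime per year, which is the budget a battery-backed renewable system would have to meet:

```python
# Converting tier uptime percentages into allowed downtime per year.
# Only the Tier 1 and Tier 4 figures were quoted in the lecture.

HOURS_PER_YEAR = 8760

tiers = {"Tier 1": 99.671, "Tier 4": 99.995}

for tier, pct in tiers.items():
    downtime_h = (1 - pct / 100) * HOURS_PER_YEAR
    print(f"{tier}: {pct}% uptime -> {downtime_h:.1f} h/yr "
          f"(~{downtime_h * 60:.0f} minutes)")

# Tier 1 allows ~28.8 hours of downtime per year; Tier 4 allows ~0.4 hours
# (about 26 minutes). Riding through multi-day wind or solar lulls inside
# that budget is what makes grid-scale battery backup so expensive.
```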
But turbines have become their own bottleneck. Nolan said turbine lead times are now a few years and have increased drastically. Producers are not ramping production fast enough to keep up with demand. That pushes buyers toward sources that can provide baseload electricity without depending on scarce turbine supply.
The immediate future, in Nolan’s telling, is awkward. The next couple of years may be the hardest: stranded wind has been found, gas pipelines can be connected, turbines are sold out for years, and industrial-scale grid interconnect and power electronics equipment also face delays. Nuclear can matter on the time scale of five to ten years, but the intervening period still requires bridge strategies.
Baseload requirements push hyperscalers toward nuclear
Once uptime, carbon emissions, and safety are considered together, Nolan argued, the power-selection logic points strongly toward nuclear.
The capacity-factor comparison he used put nuclear well ahead of other major power sources. The 2024 figures, attributed on screen to the U.S. Energy Information Administration and the Department of Energy’s Office of Nuclear Energy, defined capacity factor as actual generation compared with the maximum a plant could generate without interruption.

| Power source | 2024 capacity factor |
|---|---|
| Nuclear | 92.3% |
| Geothermal | 65% |
| Natural gas (combined cycle) | 59.9% |
| Coal | 42.6% |
| Hydro | 34.5% |
| Wind | 34.3% |
| Solar | 23.4% |
| Natural gas (simple cycle) | 17.2% |
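The definition on the slide is simple enough to compute directly. A minimal sketch, with an illustrative 1,000 MW plant whose numbers are assumed:

```python
# Capacity factor as defined on the slide: actual generation divided by the
# maximum a plant could generate running without interruption.

def capacity_factor(actual_mwh: float, nameplate_mw: float,
                    hours: float = 8760) -> float:
    """Actual generation / maximum possible generation over the period."""
    return actual_mwh / (nameplate_mw * hours)

# An assumed 1,000 MW plant that generated 8,085,000 MWh in a year:
cf = capacity_factor(actual_mwh=8_085_000, nameplate_mw=1_000)
print(f"Capacity factor: {cf:.1%}")  # 92.3%, matching the nuclear figure shown
```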
The safety and emissions comparison came from an Our World in Data chart shown in the lecture. It compared energy sources by deaths from accidents and air pollution per terawatt-hour and by lifecycle greenhouse-gas emissions. Nuclear was shown at 0.03 deaths per terawatt-hour, among the lowest values on the chart, and with very low emissions. Nolan described nuclear as the lowest-carbon option among those shown and essentially tied with wind for safest.
That combination, he said, is why hyperscalers are looking to nuclear. The examples shown were Google and Kairos Power signing a 500-megawatt advanced nuclear reactor deal; Meta becoming the latest large technology company turning to nuclear power for AI needs; and Three Mile Island reopening to power Microsoft data centers. Nolan did not present those as proof that nuclear is an immediate answer. He cautioned that nuclear is not an overnight answer and not a one-year project. He described it as something that can begin to move the needle over a five-to-ten-year time frame.
In the meantime, the race is for stranded power, turbines, and perhaps solar plus enough battery storage for buyers less sensitive to cost. Long term, he said, everyone is looking to nuclear.
That led to the second link in the bottleneck chain: “Nuclear is the bottleneck to electricity. Electricity is the bottleneck to AI.” Nolan’s formulation was qualified by context: this applies to land-based scaling. Space-based data centers may offer a different answer for a company that can actually deploy them, but he did not treat that as a general solution.
When asked later about space, Nolan said SpaceX may have a uniquely plausible path to sun-synchronous data-center satellites, though there are technical challenges. He said he would not bet against SpaceX on technical challenges, but framed orbital data centers as “uniquely their solution.” In his view, every other company will mostly compete on scaling power on land, which means scaling nuclear.
The same point governed his answer on timing. The Department of Energy contract discussed in class extended through 2034, but Nolan said General Matter’s goal is much faster: to be online before the end of the decade and scaling rapidly from there. He expects one-off small modular reactor deployments in the next couple of years, with “tens, not hundreds” of SMRs before the larger hockey stick in the early 2030s into 2035. Gigawatt-scale reactors take five to ten years in the U.S., and he called that optimistic.
The missing nuclear capability is enrichment
Nolan’s central company-specific claim was that nuclear’s bottleneck is not reactors in the abstract, but fuel — specifically enrichment.
Reactors are not perpetual-motion machines. They require fuel and must be refueled, often every year or two for existing reactors, with some advanced reactor designs targeting five-to-ten-year refueling cycles. The nuclear fuel supply chain has five steps: mining and milling, conversion, enrichment, deconversion, and fuel fabrication.
The supply-chain decomposition made the analogy to the AI pipeline concrete. In the same way Midha’s factory schematic decomposed AI capability into data, compute, algorithms, training stages, and deployment, Nolan decomposed nuclear energy into the industrial steps required before a reactor can run. The bottleneck was not “nuclear” as a general category; it was the middle step in the fuel chain.
Reactors consume U-235, the fissile isotope whose chain reaction releases neutrons and heat; that heat raises steam, drives a turbine, and produces electricity. Natural uranium is only about 0.7% U-235, so to make fuel with enough U-235 by weight, mined uranium product is converted from U3O8 to UF6, a gas. That gas is enriched through an isotope-separation process that raises the U-235 concentration. The enriched gas is then converted back into a solid and fabricated into pellets or another fuel form.
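Because enrichment only moves U-235 between streams, a simple conservation argument shows how much natural feed each kilogram of product consumes. A sketch with my own assumptions, including a 0.25% tails assay that was not stated in the lecture:

```python
# Mass balance for enrichment. Conservation of U-235 (F*x_f = P*x_p + W*x_w)
# together with F = P + W gives F/P = (x_p - x_w) / (x_f - x_w).
# Feed and tails assays below are assumptions, not from the lecture.

def feed_per_kg_product(product_assay: float,
                        feed_assay: float = 0.00711,   # natural uranium
                        tails_assay: float = 0.0025) -> float:
    """Kilograms of natural-uranium feed per kilogram of enriched product."""
    return (product_assay - tails_assay) / (feed_assay - tails_assay)

print(f"5% LEU:       {feed_per_kg_product(0.05):.1f} kg feed per kg")    # ~10.3
print(f"19.75% HALEU: {feed_per_kg_product(0.1975):.1f} kg feed per kg")  # ~42.3
```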
General Matter works on the enrichment step. Nolan stressed that its facility is not a reactor. It performs no nuclear reactions, and it avoids criticality by keeping material spread out so it cannot form a critical mass. He also described the process as not a chemical reaction but separation.
The reason enrichment matters is that the U.S. barely participates in it today. Nolan’s market-share map showed Russia’s TENEX at 43%, Europe’s Orano and Urenco at 41%, China’s CNNC at 16%, and U.S. Centrus at less than 0.1%. Nolan said this means the United States cannot produce its own nuclear fuel at scale and still relies on European firms and, even under sanctions, Russia, because it needs the supply.
| Region or company | Enrichment market share shown |
|---|---|
| Russia / TENEX | 43% |
| Europe / Orano and Urenco | 41% |
| China / CNNC | 16% |
| USA / Centrus | <0.1% |
This was the third link in the chain: “Enrichment is the bottleneck to nuclear. Nuclear is the bottleneck to electricity. Electricity is the bottleneck to AI.”
Nolan also argued that enrichment is a cost bottleneck. A slide using the UxC fuel cost calculator showed the cost structure of high-assay low-enriched uranium, or HALEU, fuel, with enrichment highlighted as the largest component. The assumptions on the slide included 19.75% HALEU, U3O8 at $65 per pound, conversion at $60 per kgU, enrichment at $300 per SWU, deconversion at $18 per kgU, and fabrication at $1,500 per kgU. Nolan’s claim was that even if the current period is an arms race to stand up data centers quickly, cost will eventually matter as margins compress. Enrichment is therefore both a capacity and cost lever.
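That claim can be sanity-checked with the standard separative-work arithmetic. The sketch below is not the UxC calculator itself; it rebuilds the cost stack from the slide’s stated prices, with the natural-uranium feed assay and a 0.25% tails assay as my assumptions:

```python
import math

# Standard separative-work formulas applied to the slide's stated prices.
# Feed assay (natural uranium) and tails assay are my assumptions.

def value(x: float) -> float:
    """Separative-work value function V(x) = (1 - 2x) * ln((1 - x) / x)."""
    return (1 - 2 * x) * math.log((1 - x) / x)

x_p = 0.1975   # product assay (19.75% HALEU, from the slide)
x_f = 0.00711  # natural uranium feed assay (standard value, assumed)
x_w = 0.0025   # tails assay (assumed, not stated in the lecture)

feed_per_product = (x_p - x_w) / (x_f - x_w)        # ~42.3 kg feed / kg product
swu_per_product = (value(x_p)
                   + (feed_per_product - 1) * value(x_w)
                   - feed_per_product * value(x_f))  # ~41 SWU / kg product

# Prices from the slide; 1 kgU of natural uranium requires ~2.6 lb of U3O8.
costs = {
    "U3O8 feed":    feed_per_product * 2.6 * 65,  # $65/lb U3O8
    "conversion":   feed_per_product * 60,        # $60/kgU feed
    "enrichment":   swu_per_product * 300,        # $300/SWU
    "deconversion": 18,                           # $18/kgU product
    "fabrication":  1500,                         # $1,500/kgU product
}

total = sum(costs.values())
for step, usd in costs.items():
    print(f"{step:>12}: ${usd:>8,.0f}  ({usd / total:.0%})")
print(f"{'total':>12}: ${total:>8,.0f} per kg of HALEU")
# Enrichment comes out around half the total, consistent with the slide's
# claim that it is the largest cost component.
```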
The U.S. did not always occupy this position. Nolan said the country had roughly 86% of worldwide enrichment capacity in the 1980s, with government-run sites rooted in the Manhattan Project and Cold War infrastructure. After the fall of the Berlin Wall, the U.S. began trading more with Russia and Europe. Under the “Megatons to Megawatts” program, Russian warheads were downblended and run in U.S. reactors. American enrichment technology at the time was relatively expensive and not globally competitive, so U.S. operators struggled to make money and shut down plants. The last of those closed in 2013.
Nolan described the current problem as path dependency: the U.S. had a technology, it became uneconomic relative to foreign supply, the geopolitical environment encouraged trade, domestic capability was shut down, and the need to bring it back arrived faster than expected.
Midha summarized the pattern as “back to the future” for uranium enrichment: the U.S. had done it historically, stagnated for roughly 20 years, and is now being pushed back onto the path by AI-driven energy demand. Nolan agreed, but rejected a simple restoration model. As with space, he said, restarting the industry does not mean rebuilding Saturn V or reassembling the Space Shuttle. It means starting clean sheet, leveraging decades of progress, and questioning everything.
Industrial startups become consequential at the scaling stage
General Matter became a company in January 2024, after Nolan spent much of 2023 investigating the missing enrichment step. He said he had begun thinking about the problem in December 2022, worked on it through the fall of 2023, and became convinced that it was both important and unlikely to be solved by someone else quickly enough.
The company began by working on advanced fuel for advanced reactors. Nolan said that when General Matter started, there was no Russian uranium ban and no AI data-center boom in the current form. The Biden administration had already supported nuclear-fuel programs, and the current administration has continued a focus on energy production. Nolan repeatedly described the support as bipartisan and cross-administration.
A World Nuclear News item shown in the lecture stated that the Department of Energy awarded $2.7 billion to strengthen U.S. uranium enrichment. The visible text said General Matter, American Centrifuge Operating, and Orano Federal Services were each awarded $900 million in funding for uranium enrichment services, with another $100 million to Global Laser Enrichment. Midha emphasized the speed: in his classroom framing, a company founded in January 2024, with “close to 100” people by the time of the lecture, had received a $900 million DOE award.
Nolan clarified that the project is larger than the government contract. He described it as a multibillion-dollar project in which DOE support helps accelerate capacity, while General Matter expects to bring more private capital than the contract amount. The contract supports what the company was already directly trying to do: build enrichment capacity as quickly as possible.
The operating lesson Nolan drew was less about a founder origin story than about how hard-technology companies scale. General Matter selected people from national labs, other nuclear-energy companies, Tesla, and SpaceX because the company wanted to break into a capital-intensive, incumbent-dominated, stagnant industry using a playbook similar to companies that had done that before.
Nolan’s SpaceX experience gave him a model for that playbook. He joined first as an intern when the company had roughly 35 people and worked on propulsion systems as a structural and thermal analyst on a small team. The early operating style, as he described it, was scrappy: build test stands, design primitive engines, make combustion work, and construct test fixtures good enough to learn quickly. The optimization function was schedule first, some cost discipline, and a hard line on safety. The point was not to build the fanciest test stand; it was to build something that worked, test it, learn, and move to the next iteration.
He also described a lesson he thinks he learned the hard way. Nolan said he left SpaceX because he believed the rocket and engines were working and the company might become less exciting; in retrospect, he called that conclusion wrong. He had thought a 100-person company was already large and that the best startup learning happened at three people. He now sees roughly 100 people as the stage where companies enter the hard work of scaling: moving from ideas and product-market fit to operationalizing, manufacturing, and building teams.
That lesson shaped General Matter’s early execution. Nolan described the first few months as “legitimately 100 hour weeks,” with people living and sleeping at headquarters to compress years of planning into a few months. The work included deciding whether to apply for the DOE program, preparing the plan, and finding the right site.
The company is headquartered in Los Angeles, but the manufacturing and enrichment facility will be in western Kentucky, in Paducah. Nolan said Paducah was the site of the last U.S. commercial enrichment before it shut down in 2013. General Matter originally looked there for old buildings it could use, then found 100 undeveloped acres at the south end of the Department of Energy site that fit its needs.
The site mattered because, in Nolan’s view, nuclear requires not just technology and capital but a supportive community and supportive location. He described the DOE as extremely supportive of a new entrant trying to help solve the problem.
When Midha asked whether the current federal government was asleep at the wheel on critical infrastructure, Nolan said “definitely not.” He described congressional support beginning in 2022 and 2023, first for HALEU — high-assay low-enriched uranium used by advanced reactors — and then for LEU, the low-enriched uranium consumed by the existing grid. Within DOE, he said, many people have spent their careers on nuclear not because the sector has been growing, but because they believe in it.
Nolan’s employment expectations were large, but he framed them as estimates. He said General Matter would likely create “hundreds and hundreds” of jobs, including close to 500 in Los Angeles over the next few years and as many or more in Kentucky. He suggested the company could be around 1,000 people for the work ahead. He compared that uncertainty to SpaceX’s early expectations: some people once thought SpaceX might only ever be 200 people, and it later grew to nearly 20,000.
Midha used that as a counterpoint to the claim that AI simply eliminates jobs. In his framing, AI’s infrastructure bottlenecks are already creating new jobs in engineering, construction, finance, manufacturing, and physical infrastructure. Nolan was more modest but still optimistic: General Matter has dozens of open roles, wants to hire hundreds of people, and cannot find enough good people quickly enough.
The desired end state is broader than one company’s supply contract. Nolan said General Matter’s long-term goal is similar to SpaceX’s: bring a strategic technology back to the United States, use it to scale an industry, and lower the cost structure enough that more downstream activity becomes possible. For enrichment, that means producing fuel for existing reactors, advanced reactors, and allies, while potentially reducing the incentive for more countries to build enrichment capabilities themselves. He framed that as beneficial for power production, clean energy, lower fossil-fuel emissions, and nuclear nonproliferation, while acknowledging that the geopolitical complexity is beyond his pay grade.
Public narratives lag the engineering fundamentals
A recurring tension was that public narratives often distort hard-technology work. Midha raised it first in relation to crypto, then nuclear. Bitcoin mining, in his telling, became culturally entangled with the failures and excesses of crypto, but some of the underlying infrastructure work translated directly into AI. Nuclear has suffered a different version of the same problem: confusion, politics, and social divisiveness have made its fundamentals less legible.
Nolan’s advice was to ignore the surface narrative and go deeper into the problem. If the objective is baseload power that is clean, scalable, and safe, his recommendation was to look at the numbers. The numbers, in his reading, support nuclear despite the salience of a few famous accidents.
Nolan said that, based on the analyses he was relying on, Three Mile Island had no direct measurable deaths, and Fukushima perhaps one fatality, while the tsunami that caused it killed thousands. His broader claim was that nuclear’s safety record is much better than public perception suggests. He also put some responsibility on the nuclear industry itself, saying it failed to make the case strongly enough and wasted time.
The calibration problem, as Midha framed it, lies between two mistakes: pretending everything is fine while the building burns, and panicking after something goes wrong in the physical world and overcorrecting for a decade. Nolan’s answer was not to chase the emotional state of the moment. He said builders need to be aware of timing — an idea that is 20 years too early may be a poor use of effort — but should focus on fundamentals.
His framework was: find an important problem that is not getting solved, that is not going to get solved by someone else, and where one’s skill set is especially useful. That work might happen at a startup, an existing company, in government, or in a nonprofit. The venue matters less than the fit between problem importance, neglectedness, urgency, and the builder’s ability to contribute.
Germany served as Nolan’s cautionary example for nuclear perception. In his account, Germany shut down working nuclear reactors with the intention of replacing them with renewables, but did not replace them at the baseload level. He said the replacement was “almost entirely” coal, natural gas, and other fossil fuels, with consequences for carbon emissions and air quality. He contrasted Germany’s air quality with France’s, where nuclear supplies a high percentage of grid power, and called the German path self-defeating: shutting down affordable, clean energy after it is built and replacing it with biomass and fossil fuels.
That does not, in his view, prove that every nuclear build is easy. Upfront cost remains a major issue. Nolan identified reducing the cost of building nuclear plants as the next major problem for the industry, citing startups and advanced reactor companies pursuing factory-built small modular reactors, shippable systems, and cheaper deployment. But those designs still require fuel — including more enriched fuel for smaller form factors — and the United States does not currently produce enough of it domestically.
On public opinion, Nolan said nuclear perception has improved quickly. He described opinion charts as having moved from mostly negative to positive over the past few years. General Matter chose the problem because it believed it was the right one, but consensus moved toward the company faster than expected.