
AI Is Making Scientific Throughput the New National Advantage

Darío Gil · Risa Wechsler · Stanford HAI · Friday, May 15, 2026 · 13 min read

Darío Gil, the U.S. Department of Energy’s Under Secretary for Science, used his AI+Science keynote to argue that AI is shifting scientific advantage from access to instruments and computing toward the throughput of integrated discovery systems. He presented DOE’s Genesis initiative as the national-scale architecture for that shift, linking data, AI models, high-performance computing, experimental facilities, and industry partners into closed-loop workflows. Gil’s case was that the test of success is not more papers, but whether faster scientific cycles can produce measurable gains in productivity, security, and industrial capability.

Scientific advantage is shifting from access to throughput

Risa Wechsler introduced Darío Gil as the U.S. Department of Energy’s Under Secretary for Science, overseeing what she described as the nation’s largest portfolio of basic research in the physical sciences, including AI and advanced computing, quantum science, fusion energy, and high-energy physics. Gil’s central claim was that AI-enabled science is changing the basis of scientific advantage. Access to instruments and computation still matters, but leadership will increasingly depend on throughput: how fast an integrated scientific system can move from measurement to understanding to action. Genesis, the Department of Energy initiative he outlined, is meant to be the national-scale architecture for that shift.

For decades, Gil said, scientific leadership depended on pairing new knowledge with systems capable of turning that knowledge into impact: electrification, aviation, semiconductors, and other technology regimes. That model still matters, but it is being reconfigured around “speed, scale, and integration.”

The core change is from a staged process to a continuous system. The older model moved through hypotheses, experiments, analysis, shared results, and repetition. Even where institutions had integrated teams and advanced tools, handoffs between people, instruments, data systems, and computing constrained progress. Those constraints are beginning to break as AI models generate hypotheses, guide experiments as they run, and update themselves from new results.

That turns throughput into the new strategic bottleneck. In Gil’s formulation, scientific leadership is no longer defined primarily by access to instruments or computation, though both remain essential. It is defined by how quickly a scientific system can move from measurement to understanding to action. Systems that cannot increase that rate, he warned, “will fall behind, regardless of the strength of their underlying science.”

The cycle is no longer measured in weeks or months, but in minutes or hours.

Darío Gil

The distinction matters because Gil was not describing AI as a layer added after the experiment. He described analysis moving into the experiment itself. Models interpret data in real time, identify signals, adjust parameters, and guide the next step before the previous one is complete. In that setting, a single experimental session can contain many iterations. Gil called this the difference between “sequential science and continuous science.”
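A minimal sketch can make the loop concrete. The code below illustrates the continuous pattern Gil described, not any DOE or Genesis system: a model proposes the next instrument setting, the measurement runs, and the model updates before the next iteration. Every name in it (Instrument, SurrogateModel, acquire, suggest) is a hypothetical stand-in.

```python
# Toy closed-loop ("continuous science") experiment driver.
# All names here are hypothetical, not DOE or Genesis APIs.
import random


class Instrument:
    """Hypothetical beamline stand-in: returns a noisy response to a setting."""

    def acquire(self, setting: float) -> float:
        true_optimum = 0.7  # unknown to the model; the loop should find it
        return -(setting - true_optimum) ** 2 + random.gauss(0, 0.01)


class SurrogateModel:
    """Toy model: remembers observations and proposes the next setting."""

    def __init__(self):
        self.observations = []  # (setting, signal) pairs

    def update(self, setting: float, signal: float) -> None:
        self.observations.append((setting, signal))

    def suggest(self) -> float:
        if not self.observations:
            return random.random()
        # Exploit the best setting seen so far, plus a small perturbation.
        best_setting, _ = max(self.observations, key=lambda obs: obs[1])
        return min(1.0, max(0.0, best_setting + random.gauss(0, 0.1)))


instrument, model = Instrument(), SurrogateModel()
for step in range(20):
    setting = model.suggest()              # model guides the next measurement
    signal = instrument.acquire(setting)   # experiment runs with that setting
    model.update(setting, signal)          # model updates from the new result
    print(f"step {step:2d}: setting={setting:.3f}  signal={signal:.4f}")
```

The point of the sketch is the shape of the loop: suggest, measure, and update happen inside a single session, so each iteration can inform the next before the run ends.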

He grounded the claim in Department of Energy facilities. At the Spallation Neutron Source at Oak Ridge National Laboratory, AI-assisted pipelines are interpreting scattering data during beamline operations, allowing researchers to adjust parameters during the run rather than days or weeks later. At SLAC National Accelerator Laboratory, particle physics and materials science workflows now incorporate AI-driven analysis and control at speeds that enable immediate decisions in ultra-fast experimental environments. At Brookhaven National Laboratory, he pointed to both the National Synchrotron Light Source II and the Relativistic Heavy Ion Collider, where AI is increasingly embedded in instrument operations to optimize beams and guide experiments in real time.

Gil identified three requirements for this new model. The first is usable data: high-quality, standardized, accessible data suitable for both training and inference. The second is coherent scientific infrastructure: interoperable platforms, instruments, and workflows that let data and models move across systems. The third is continuous integration of computation, theory, and experimentation into closed-loop systems that reduce cycle time and expand the search space researchers can explore.

AI amplifies scientific infrastructure rather than replacing it

Darío Gil repeatedly tied AI’s current power to decades of prior investment in experimental platforms, curated data, and domain expertise. DOE’s AI-enabled systems, he said, are guided by more than 40,000 scientists and engineers across the national laboratories, whose judgment and domain knowledge remain part of the system rather than external to it.

In fusion energy, AI can help predict plasma behavior during operations, enabling real-time optimization that strengthens the viability of future fusion systems. In materials science, AI-driven models can screen and predict key properties before a sample is synthesized, turning work that could require months of bench experimentation into hours or days of computational exploration. But none of this begins from scratch. Gil treated fusion ignition at the National Ignition Facility in 2022 as an example of integrated science already at work: simulations, high-energy-density physics, and experimental design operating together.
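The materials claim follows the same pattern. As a hedged illustration of computational pre-screening, the sketch below trains a surrogate on synthetic “measured” samples and ranks a large pool of unsynthesized candidates so that only the most promising reach the bench. The descriptors, dataset, and model choice are all invented for this example.

```python
# Toy sketch of AI-driven materials pre-screening: train a surrogate on
# already-measured samples, then rank unsynthesized candidates by predicted
# property. Features and data are synthetic, for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Pretend descriptors for known materials (e.g., composition-derived
# features) and a measured target property (e.g., hardness).
X_known = rng.uniform(0, 1, size=(200, 4))
y_known = X_known @ np.array([2.0, -1.0, 0.5, 0.0]) + rng.normal(0, 0.1, 200)

surrogate = RandomForestRegressor(n_estimators=100, random_state=0)
surrogate.fit(X_known, y_known)

# Score a large pool of candidates that have never been synthesized.
X_candidates = rng.uniform(0, 1, size=(10_000, 4))
scores = surrogate.predict(X_candidates)

# Send only the top handful to the bench.
top = np.argsort(scores)[::-1][:5]
for rank, idx in enumerate(top, start=1):
    print(f"#{rank}: candidate {idx}, predicted property {scores[idx]:.2f}")
```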

The change, in his account, is not integration itself. It is the speed and continuity at which integration can now happen. “We’re no longer analyzing yesterday’s results,” he said. “We’re increasingly shaping today’s experiments as they unfold.”

His clearest example was the Protein Data Bank, established in part at Brookhaven in 1971. Over roughly 50 years, the global scientific community determined about 200,000 protein structures through experimental work. Using that data set, modern AI systems expanded the number to more than 200 million structures in a few years. Gil’s lesson was not that AI made the earlier infrastructure obsolete, but that it amplified the data, experimental work, and curation that made the AI advance possible.

200,000 to 200M+
protein structures, from five decades of experimental determination to AI-expanded predictions in a few years

That same logic shaped Gil’s description of U.S. advantages. He pointed to advanced computing capacity, including leading high-performance computing systems; scientific and engineering talent across universities, industry, and the national labs; and capital markets that can translate scientific advances into applications. The federal role, as he described it, is narrower and more structural: supporting long-horizon and high-risk science, building shared infrastructure, and anchoring the innovation ecosystem.

For DOE, that role runs through foundational research, infrastructure, and 17 national laboratories. Gil said DOE operates 28 user facilities that support tens of thousands of researchers each year and serve as shared platforms for discovery across academia, industry, and government.

Genesis is meant to turn integration into an operating architecture

AI makes deeper scientific integration possible, Darío Gil argued, but not inevitable. The difference between possibility and execution is architecture: governance, data readiness, interoperable platforms, and coordinated teams. That is the purpose he assigned to the Genesis mission, which he said President Trump launched by executive order in November.

Genesis, as Gil described it, is intended to mobilize DOE national laboratories with industry, academia, and philanthropy to build a unified discovery platform connecting high-performance computing, AI, quantum technologies, experimental facilities, and robotic laboratories. He compared its potential role in science to the microscope in medicine and the rocket in the U.S. moon effort. Once complete, he said, it would be “the most complex and powerful scientific instrument we have built.”

The goal is not only faster research in the abstract. Gil said the platform is designed around three pillars. The first is energy, with emphasis on reliable and affordable energy systems. The second is discovery science across materials, chemistry, biology, physics, engineering, and related fields. The third is national security, including supply chains, advanced manufacturing, and mission-ready materials for defense and industry.

He put a numerical ambition on the mission: double the productivity and impact of American science and engineering within a decade. He also said Genesis is intended to deliver 10x to 100x acceleration across many scientific and engineering domains while enhancing security through AI-enabled solutions and advanced threat mitigation for high-consequence missions.

Genesis is already moving from concept to operations, in Gil’s telling. He cited funding for the American Science Cloud, which will host and distribute AI models and scientific data to the broader research community, and the Transformational AI Model Consortium, which will build and deploy self-improving models for science, engineering, and energy missions using DOE data, facilities, and expertise.

He also described Synapse-I, with leadership from Argonne across multiple DOE national labs and user facilities, as an early example of the approach. Synapse-I is a multi-laboratory platform and real-time AI engine for large-scale imaging data, where experiments can generate terabytes to petabytes of information. The system combines AI, advanced computing, and experimental systems to analyze data as it is produced and guide experiments in real time, replacing manual steps with adaptive decision-making.
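A rough sketch of that streaming pattern, with the detector simulation, statistic, and threshold all invented for illustration: per-frame reductions are computed as data arrives, and a steering decision is emitted mid-run rather than after it.

```python
# Illustrative streaming analysis: analyze frames as an instrument produces
# them and emit steering decisions during the run. Not Synapse-I code; the
# detector stand-in and the 5x-baseline threshold are assumptions.
import numpy as np

rng = np.random.default_rng(1)


def detector_frames(n_frames: int, frame_shape=(512, 512)):
    """Hypothetical stand-in for an instrument data stream."""
    for i in range(n_frames):
        frame = rng.normal(0.0, 1.0, frame_shape)
        if i >= 6:  # a feature of interest appears partway through the run
            frame[200:260, 200:260] += 4.0
        yield i, frame


baseline, n_seen = 0.0, 0
for i, frame in detector_frames(10):
    stat = float((frame > 3.0).mean())  # fraction of pixels above 3 sigma
    if n_seen >= 3 and stat > 5 * baseline:
        print(f"frame {i}: feature detected -> steer: dwell here")
    else:
        print(f"frame {i}: background -> steer: continue scan")
        n_seen += 1
        baseline += (stat - baseline) / n_seen  # online baseline update
```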

In the discussion after the keynote, Gil gave a more concrete version of the productivity claim. A team had shown him AI reconstruction techniques for x-ray imaging that could produce real-time 3D reconstruction with nanometer resolution over a 100-micron field of view. What used to take six months, he said, now happens at 10 hertz.
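Taking that claim at face value, the implied speedup is enormous. A back-of-envelope calculation, assuming “six months” means roughly 182 days of wall-clock time per reconstruction:

```python
# Rough scale of the reconstruction speedup Gil described. Assumes
# "six months" means ~182 days of wall-clock time per reconstruction.
seconds_before = 182 * 24 * 3600     # ~1.57e7 s per reconstruction
seconds_now = 1 / 10                 # 10 Hz -> 0.1 s per reconstruction
print(f"speedup ~ {seconds_before / seconds_now:.1e}x")  # ~1.6e8x
```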

The mission’s early scale is being measured in partners, applications, and national challenges

Darío Gil used participation metrics to argue that Genesis is no longer merely aspirational. On the industry side, DOE had signed memoranda of understanding with 38 companies, including technology and computing firms working with the national labs on AI tools, high-performance computing, and scientific workflows. He also described the Genesis Mission Consortium as a public-private partnership vehicle meant to connect DOE, national laboratories, private-sector organizations, and academic institutions through matchmaking, funding coordination, technical working groups, and a broader community for shared innovation.

Workforce development is also part of the architecture. Gil said DOE issued a request for information earlier in the year seeking strategies to train scientists and engineers working at the intersection of AI and scientific discovery. The department received more than 270 submissions from academia, industry, and the research community.

Genesis has defined an initial set of 26 national science and technology challenges across energy systems, advanced materials, biotechnology, quantum science, and other areas. Gil stressed that these were meant to produce measurable outcomes, not function as abstract themes. The examples mattered because he was presenting Genesis as workflow infrastructure, not simply as another grant program.

Examples Gil used to make Genesis concrete:

- SPARC (Los Alamos): materials qualification timelines compressed from a decade to months.
- LUMINA (Argonne): grid operators move from evaluating 30 contingency scenarios in five minutes to 43,000, covering 99% of outage risk probability.
- Synapse-I (Argonne leadership across multiple DOE labs and user facilities): large-scale imaging data analyzed as it is produced, with experiments guided in real time.

SPARC, in Gil’s account, matters for defense, energy, and aerospace systems that operate in extreme environments where existing materials fail. LUMINA was his example from grid planning and resilience, where the scale of contingency analysis changes during stressed operating conditions. Synapse-I supplied the imaging analogue: large experimental data streams turned into real-time decisions.
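The LUMINA figures imply a concrete throughput ratio, assuming both scenario counts refer to the same five-minute evaluation window:

```python
# Scale of the LUMINA contingency-analysis claim, assuming both figures
# describe the same five-minute evaluation window.
before, after = 30, 43_000
print(f"throughput gain ~ {after / before:.0f}x")         # ~1433x
print(f"scenarios per second now ~ {after / (5 * 60):.0f}")  # ~143/s
```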

A $293 million request for applications (RFA) is the mechanism Gil highlighted for assembling integrated, multidisciplinary teams around national challenges. The point is not to have universities, labs, and companies operate in parallel, but to build shared end-to-end workflows. The response exceeded DOE’s expectations: an informational webinar drew nearly 5,000 participants, compared with a typical 200 to 500 for funding opportunity sessions, and the first round of Phase 1 applications received more than 8,000 submissions. Gil called that the largest number of submissions in the department’s history, more than three times the previous record. More than 800 institutions applied, spanning industry, national labs, and academia, across all 50 states, Puerto Rico, and Washington, D.C.

8,000+
Phase 1 applications Gil said DOE received for the first Genesis RFA round

Those numbers mattered in Gil’s argument because he treated Genesis as a coordination mechanism. It created, in his words, “a moment of synchronicity across the whole country.” He said he did not think there was a single U.S. research institution unaware of the mission or unaffected by the need to think through whether and how to respond.

Productivity cannot mean more papers

Darío Gil rejected paper counts as the measure of scientific productivity. Asked how DOE would measure the ambition to double the productivity and impact of American science and engineering, he began by saying measurement is itself a scientific question and that the department would work with the community to define the right metrics. But he was explicit about one thing the metric should not be: more publications for their own sake.

Definitely I'm not recommending that we double, triple, or quadruple the number of papers that we produce. That is definitely not the goal.

Darío Gil

Higher-quality and higher-impact work mattered more than paper counts. Gil offered two levels of measurement. At the program level, productivity could mean moving a major scientific or engineering target earlier in time or lowering its cost. If a fusion-energy program had a goal of producing electricity from a fusion pilot plant in scalable fashion by 2035, and a methodological change made 2032 credible at a given dollar number, that shift would be a productivity measure. Across a portfolio of programs, success could be measured by whether goals move “to the left” relative to what would have been possible under previous methods.
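One way to read that program-level framing, purely as an illustration rather than a DOE definition, is as a simple schedule-and-cost metric:

```python
# Hypothetical formalization of Gil's "move the goal to the left" framing:
# years of schedule pulled in, scaled by the cost ratio. The metric itself
# is an assumption made for illustration, not a DOE definition.

def schedule_gain(baseline_year: int, new_year: int,
                  baseline_cost: float, new_cost: float) -> float:
    """Years saved, weighted so cheaper-and-sooner scores higher."""
    years_saved = baseline_year - new_year
    return years_saved * (baseline_cost / new_cost)

# Gil's fusion example: a 2035 pilot-plant goal made credible for 2032
# at the same (hypothetical) cost -> 3.0 years of "leftward" movement.
print(schedule_gain(2035, 2032, baseline_cost=1.0, new_cost=1.0))
```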

At the workflow level, productivity could be measured more directly: sample preparation throughput, analysis time, reconstruction speed, and other operational measures inside specific scientific systems. Gil pointed again to the x-ray imaging example: a reconstruction task he said previously took six months could now run at 10 hertz.

The Protein Data Bank example supplied another kind of benchmark: a long historical rate of progress followed by an inflection point caused by a methodological change. Fifty years and about 200,000 structures became more than 200 million AI-expanded structures within a few years. Gil did not present this as a complete metric for scientific impact, but as an example of how measurement might identify discontinuities in capability.

He added that Phase 1 Genesis applicants were asked to assess how they would measure productivity and impact improvements from their proposed methodologies. In that sense, measurement is not being treated as an external audit added later; it is being made part of the design of the projects themselves.

National coordination does not have to mean centralization

Darío Gil defended the distinctive strengths of the U.S. research system while arguing that decentralization is compatible with national ambition. In response to a comparison with China’s model of national priority-setting, he said the United States has a “famously decentralized” system for science and engineering discovery, and that it is good at it. But that does not prevent the government from naming ambitious national goals and asking the research community to respond.

In his account, DOE’s 26 challenges arise from the department’s congressionally chartered missions, and those challenges could later expand through work with agencies such as NIH and NASA. The model he advocated is a hybrid: set national challenges, then allow bottom-up solutions.

Gil contrasted Genesis with older national projects such as Apollo and the Manhattan Project. Genesis is not meant to gather the best scientists into one place and direct them centrally. It is meant to use distributed talent and distributed institutions while creating moments of coordination around shared priorities.

He also resisted the idea that Genesis should be the only national mission in science and technology. His claim was narrower and more pointed: if the United States cannot rally around the proposition that computing will transform the practice of science and engineering, then it is hard to imagine what other national endeavor would be obvious enough to command alignment. He described Genesis as a mechanism for institutional innovation and national alignment, and suggested that learning how to operate it could make future national missions easier to organize.

Research-to-industry transfer has to begin at the beginning

Accelerating research is not enough if discoveries cannot become industrial capability. Darío Gil accepted that concern when asked whether Genesis could help the United States translate AI-for-science gains into industries in areas such as solar, batteries, critical minerals, manufacturing, and refining.

Scientific investment, in Gil’s answer, is justified not only by curiosity and understanding, but also by its eventual translation into technology, employment, productivity, and economic impact. Supply-chain disruptions during the pandemic and other periods had sharpened awareness that the United States had lost its edge in critical industries. He cited semiconductors, with major plants being built in Arizona and Texas and memory manufacturing in New York, as examples of the renewed need for secure supply chains.

The structural answer is to co-create the work from the start among academia, national labs, and industry. That includes startups, because some may become future leaders, and established companies, because they bring scale-up, investment, and manufacturing capacity. Gil rejected a linear model in which universities and labs do early science and industry later receives the output. The compression and acceleration now occurring, he argued, require collaboration from the beginning.

That institutional design is meant to include finance and venture capital as well as research performers and industrial firms. Gil did not claim that Genesis alone solves the industrial-base problem. He described the mission as focused on the front end of the process, but designed so that industry participation from the beginning increases the odds that scientific advances can become real manufacturing and new industries in the United States.
