Scientists Call for National Strategy on High-Performance Computing

A group of leading computer scientists is urging the United States to increase its investment in high-performance computing (HPC), a technology critical for scientific discovery, economic competitiveness, and national security. In a new policy forum published in Science, they argue that the U.S. lacks a comprehensive strategy to address mounting technical and geopolitical challenges in this vital field.
"We need to reimagine how the HPC ecosystem should look from the hardware level, through the software stack, the connections to distributed computing resources, and all the way up to applications in science, engineering, manufacturing,” Ewa Deelman, a principal scientist at USC Viterbi’s Information Sciences Institute (ISI) and the paper’s first author, said.
HPC systems are advanced computing ensembles that harness the power of tens of thousands of processors to solve complex problems beyond the capabilities of regular computers. These supercomputers enable breakthrough research in everything from weather forecasting and drug discovery to cryptography and aircraft design, making them essential tools for innovation.
While the U.S. recently succeeded in deploying its first exascale supercomputers, HPC systems capable of a quintillion calculations per second, the question of what comes next remains open, Deelman said.
Continued technical advances in HPC are needed to power scientific discovery. Yet the paper’s authors point to several challenges the U.S. faces in this area, including fast-paced changes in computing hardware, physical limits in chip manufacturing, shifting market dynamics driven by artificial intelligence, and growing concerns about energy consumption. The rise of AI has also intensified demands on HPC systems while introducing new computational requirements.
Geopolitical challenges compound these technical hurdles. In the global competition for technological sovereignty and compute power, other nations are finding success with coordinated national strategies. The European Union's EuroHPC initiative is building some of the world's fastest supercomputers across multiple countries; China is rapidly advancing its domestic HPC capabilities; and Japan has developed the Fugaku supercomputer through a carefully planned national program.
To maintain long-term U.S. competitiveness, the paper calls for a "whole-nation" approach that would bring together academia, national laboratories, industry, and government in a coordinated effort. The researchers urge the U.S. federal government to organize a task force charged with creating a national, 10-year roadmap for HPC in the post-exascale era, one that remains adaptable to the rapid pace of technological change. For instance, quantum computing could reshape HPC over the next decade, requiring consideration of hybrid systems to tackle currently intractable problems, Deelman said.
But technology is only one part of the ecosystem that sustains HPC. Another critical challenge is the need for a well-trained cyberinfrastructure workforce, including researchers, software engineers, system operators, trainers, and educators.
“We need to keep raising awareness of the importance and the need for support in terms of national policies and funding,” Deelman said. “Not only for HPC, but broadly science, engineering, workforce development, and education. They are all needed for the safety and prosperity of our nation as well as the global well-being.”
The authors of the Science paper represent nine major institutions: the University of Southern California, the University of Tennessee, Oak Ridge National Laboratory, the University of Manchester, Lawrence Livermore National Laboratory, Duke University, the University of Utah, the University of Wyoming, and the University of California, Berkeley.