Frontier Supercomputer Powers Largest-Ever Universe Simulation

The universe, in all its vastness and complexity, is not only an awe-inspiring subject of human curiosity but also an immense challenge to study and understand. As technology and computational capabilities evolve, scientists are increasingly able to simulate the intricate processes that shape the cosmos. In early November 2024, researchers at the Department of Energy’s Argonne National Laboratory marked a significant milestone in astrophysical research with the largest and most advanced simulation of the universe ever conducted, made possible by the power of the Frontier supercomputer at Oak Ridge National Laboratory (ORNL).

The scale of the achievement is both awe-inspiring and transformative for the field of cosmology. These simulations, which were made possible by the computational prowess of the Frontier supercomputer, represent a major leap forward in our understanding of how the universe operates. They allow researchers to simulate not only the effects of gravity but also the complex interactions between different components of the universe, including atomic matter and dark matter, in ways that were previously unimaginable. Until now, such a comprehensive approach to simulating the universe had not been possible at this scale.

At the heart of this breakthrough is a method known as cosmological hydrodynamics. In simple terms, hydrodynamics is the study of fluids and gases in motion; applied to cosmology, it describes how cosmic gas flows and evolves under gravity, pressure, and related forces. The universe contains two key types of matter: ordinary, or atomic, matter and dark matter. While atomic matter interacts through gravity as well as electromagnetic forces, dark matter interacts only gravitationally and is invisible to the telescopes that observe ordinary matter. Understanding both components is crucial for creating accurate simulations of the universe, as both are fundamental to its formation and evolution.

Salman Habib, the project lead and division director for Computational Sciences at Argonne, explained that in order to gain a comprehensive understanding of the universe, researchers must simulate not only the gravitational effects of dark matter but also the physics of atomic matter. This includes processes like the formation of stars, galaxies, black holes, and the hot gas that permeates the space between these objects. These complex simulations are what researchers call cosmological hydrodynamics simulations—a term that reflects the scope and complexity of the task at hand.

One of the primary challenges with these simulations is that they are computationally expensive and extremely difficult to run. Traditional simulations, which focused solely on the effects of gravity, were simpler to execute but lacked the necessary realism to account for the full range of physical processes that occur in the universe. By contrast, cosmological hydrodynamics simulations must account for a vast array of variables, including the interactions of gases, radiation, and other fundamental forces over billions of years.
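To make the contrast concrete, here is a deliberately minimal, gravity-only particle update in Python. It is an illustrative toy, not HACC’s actual algorithm (production cosmology codes use particle-mesh and tree solvers rather than direct summation); the comments note the extra physics a cosmological hydrodynamics code must layer on top for atomic matter.

```python
import numpy as np

# Toy direct-summation gravity step (illustrative only; real cosmology codes
# such as HACC use particle-mesh / tree methods, not O(N^2) summation).
G = 6.674e-11  # gravitational constant, SI units

def gravity_only_step(pos, vel, mass, dt, softening=1e-2):
    """Advance N particles one timestep under mutual gravity alone.

    Dark matter is evolved exactly like this: it feels only gravity.
    Atomic (baryonic) matter would additionally need terms for gas
    pressure, radiative cooling, star formation, and feedback.
    """
    n = len(mass)
    acc = np.zeros_like(pos)
    for i in range(n):
        dr = pos - pos[i]                              # vectors to all other particles
        dist2 = np.sum(dr**2, axis=1) + softening**2   # softened squared distances
        inv_r3 = dist2**-1.5
        inv_r3[i] = 0.0                                # no self-force
        acc[i] = G * np.sum((mass[:, None] * dr) * inv_r3[:, None], axis=0)
    vel = vel + acc * dt   # kick
    pos = pos + vel * dt   # drift
    return pos, vel

# Minimal usage: 100 randomly placed particles, one timestep
rng = np.random.default_rng(0)
pos = rng.uniform(-1.0, 1.0, size=(100, 3))
vel = np.zeros((100, 3))
mass = np.full(100, 1e20)
pos, vel = gravity_only_step(pos, vel, mass, dt=1.0)
```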

Simulating the universe in its full complexity is an inherently difficult task because it requires not only a vast amount of computational power but also an understanding of the intricate dynamics of matter and energy. For instance, simulating a large section of the universe that is under study by massive telescopes, such as the Rubin Observatory in Chile, requires capturing the dynamics of the universe over billions of years of expansion. The sheer scale of such simulations—both in terms of time and space—was simply not feasible with earlier computational tools. Until recently, the most advanced simulations could only approximate the effects of gravity, missing out on critical details about the formation of galaxies and other structures.

To overcome these challenges, the researchers turned to HACC (Hardware/Hybrid Accelerated Cosmology Code), a specialized simulation code that was initially developed about 15 years ago for petascale machines. HACC’s design allows it to handle large-scale cosmological simulations efficiently, making it an ideal tool for the current project. In fact, HACC was a finalist for the prestigious Gordon Bell Prize in computing in 2012 and 2013, recognizing its potential for tackling complex scientific problems at the cutting edge of computational technology.

In early November 2024, researchers at the Department of Energy’s Argonne National Laboratory used Frontier, the fastest supercomputer on the planet, to run the largest astrophysical simulation of the universe ever conducted. This movie shows the formation of the largest object in the Frontier-E simulation. The left panel shows a 64x64x76 Mpc/h subvolume of the simulation (roughly 10⁻⁵, or one hundred-thousandth, of the full simulation volume) around the large object, with the right panel providing a closer look. In each panel, the gas density field is colored by its temperature. In the right panel, white circles show star particles and open black circles show AGN (active galactic nuclei) particles. Credit: Argonne National Laboratory, U.S. Dept. of Energy
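For a sense of the overall scale implied by those caption numbers, a quick back-of-envelope check (an illustrative estimate, not a figure quoted by the researchers) multiplies the subvolume out and divides by the stated fraction:

```python
# Back-of-envelope estimate from the caption's numbers (illustrative only).
# Subvolume shown: 64 x 64 x 76 (Mpc/h)^3, stated to be roughly 1e-5 of
# the full simulation volume.
subvolume = 64 * 64 * 76          # ~311,296 (Mpc/h)^3
fraction = 1e-5
full_volume = subvolume / fraction
print(f"Implied full volume: ~{full_volume:.1e} (Mpc/h)^3")
# -> roughly 3e10, i.e. on the order of tens of billions of cubic
#    megaparsecs of simulated universe.
```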

As part of the ExaSky project, HACC underwent significant upgrades to optimize its performance on exascale supercomputers, which are capable of performing more than a quintillion (a billion-billion) calculations per second. The Exascale Computing Project (ECP) aimed to push the boundaries of computational science by improving existing software and developing new tools to harness these machines, setting a target for its codes to run roughly 50 times faster than they had on Titan, the fastest supercomputer of its era. The upgraded HACC far exceeded that goal: on the Frontier supercomputer it ran nearly 300 times faster than before, enabling simulations that would have been unthinkable just a few years ago.

To achieve this extraordinary level of performance, the simulation utilized approximately 9,000 compute nodes of the Frontier supercomputer, each powered by AMD Instinct MI250X GPUs. Frontier, located at the Oak Ridge Leadership Computing Facility, is currently the fastest supercomputer in the world and is specifically designed to handle the immense computational demands of simulations like the one carried out for this project. With the help of such advanced computing power, researchers were able to simulate not just large portions of the universe but also the complex, dynamic interactions that drive its evolution.
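For rough context on what that hardware footprint represents, the sketch below estimates aggregate peak throughput. The per-GPU figure and the number of GPUs per node are approximations not stated in the article, so the result is only an order-of-magnitude illustration:

```python
# Rough peak-throughput estimate for the run's hardware footprint.
# ASSUMPTIONS (not from the article): each Frontier node carries 4 MI250X
# GPUs, and peak FP64 throughput per MI250X is on the order of ~50 TFLOPS.
# Sustained application performance is always well below peak.
nodes = 9_000
gpus_per_node = 4                   # assumed node configuration
peak_tflops_per_gpu = 50.0          # approximate order-of-magnitude figure
total_peak_exaflops = nodes * gpus_per_node * peak_tflops_per_gpu * 1e12 / 1e18
print(f"Aggregate peak: ~{total_peak_exaflops:.1f} exaFLOPS")
# -> on the order of 1-2 exaFLOPS peak, consistent with "more than a
#    quintillion calculations per second" for exascale machines.
```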

The successful execution of these simulations was not simply about reaching a record-breaking computational achievement, but also about the scientific insights that can be gained from them. As Bronson Messer, director of science at the Oak Ridge Leadership Computing Facility, explained, the simulations’ true significance lies not only in their sheer size but in the physical realism they introduce. By incorporating all the complex forces and components, such as baryons (ordinary matter) and other dynamic physics, the simulations offer a more accurate and detailed representation of the universe than ever before.

Beyond the direct scientific contributions of these simulations, the project also marks a significant step in the evolution of computational astrophysics. The HACC team, which includes researchers such as Michael Buehlmann, JD Emberson, Katrin Heitmann, Patricia Larsen, Adrian Pope, Esteban Rangel, and Nicholas Frontiere, played a pivotal role in developing the tools and methods that made this achievement possible. Their work, in collaboration with other experts in the field, has created a foundation for even more advanced simulations in the future.

Before the simulations were run on Frontier, the HACC team conducted parameter scans on the Perlmutter supercomputer at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. They also ran tests on the Aurora supercomputer at the Argonne Leadership Computing Facility, which further helped optimize the code for the Frontier supercomputer. These efforts ensured that the HACC code could fully exploit the capabilities of exascale machines and run simulations of unprecedented scale and accuracy.
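The article does not describe how those parameter scans were organized, but in general such a scan sweeps the code over a grid of cosmological parameter values and launches one run per combination. The sketch below is purely hypothetical; the parameter grid and the launch_simulation helper are invented for illustration and are not part of HACC:

```python
from itertools import product

# Hypothetical parameter scan: sweep a small grid of cosmological parameters
# and submit one simulation configuration per combination. The parameter
# names are standard cosmology symbols; the launch function is a placeholder.
omega_matter = [0.28, 0.31, 0.34]      # total matter density
sigma_8 = [0.78, 0.81, 0.84]           # amplitude of matter fluctuations
hubble_h = [0.65, 0.67, 0.70]          # dimensionless Hubble parameter

def launch_simulation(om, s8, h):
    """Placeholder for submitting one simulation job with these parameters."""
    print(f"submit: Omega_m={om}, sigma_8={s8}, h={h}")

for om, s8, h in product(omega_matter, sigma_8, hubble_h):
    launch_simulation(om, s8, h)
```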

The results of this simulation provide not only an opportunity for scientific discovery but also a crucial tool for understanding the universe. The ability to simulate both dark matter and atomic matter simultaneously, at this scale, opens new avenues for exploring the cosmos. Researchers can now study the formation and evolution of galaxies, the behavior of black holes, and the role of dark matter in the universe’s large-scale structure. These insights are essential for answering some of the most fundamental questions in cosmology: What is dark matter? How did galaxies form? What is the ultimate fate of the universe?

Source: Oak Ridge National Laboratory