December 20, 2024 – by Santina Russo
Climate modelling has long faced a trade-off: global models capture the interconnected processes that drive Earth’s climate but typically achieve only moderate spatial resolutions of about 50 kilometres, whereas regional models can resolve local phenomena, such as rainstorm cells, at a few kilometres but only over limited areas. Despite progress in recent years, the two approaches have remained distinct. Now, an international team has closed this gap with a new exascale climate emulator that achieves unprecedented global resolution. For the performance of their application, they were awarded the Gordon Bell Prize for Climate Modelling.
The new emulator achieves an astounding global resolution of 3.5 kilometres. Led by Marc Genton, Al-Khawarizmi Distinguished Professor of Statistics, and David Keyes, Professor of Applied Mathematics and Computational Science and Founding Director of the Extreme Computing Research Center, both at the King Abdullah University of Science and Technology (KAUST), the team benchmarked the emulator on CSCS’s supercomputer ‘Alps’, putting it to the test for 48 hours on NVIDIA Grace Hopper Superchips, a first for this application.
Simulation vs. emulation
An emulator builds on physical observations and climate simulations. These simulations rely on the physical equations of fluid mechanics, which describe atmospheric dynamics, ocean circulation, the effects of snow and ice, and so forth; in essence, they reproduce the processes that drive Earth’s climate. Because the system these equations describe is chaotic, infinitesimally different initial or boundary conditions diverge to totally different outcomes. Deducing the behaviour of the system therefore requires hundreds or even thousands of simulations, each started from slightly different conditions such as temperature or wind fields. This demands an immense amount of computing power.
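The need for large ensembles follows directly from this sensitivity. The minimal sketch below uses the classic Lorenz-63 toy system rather than a climate model: two runs started from almost identical states drift apart completely, so only statistics over many runs are meaningful.

```python
# Minimal sketch (not the team's model): two Lorenz-63 trajectories started
# from nearly identical initial conditions diverge completely, which is why
# climate studies run large ensembles rather than a single simulation.
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-9, 0.0, 0.0])   # infinitesimally perturbed start

for step in range(3000):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 1000 == 999:
        print(f"step {step + 1:4d}  separation = {np.linalg.norm(a - b):.3e}")
```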
Emulators work differently: a statistical model is developed that reproduces the data ensemble generated by simulations, at a much lower computing cost. For their prize-winning project, the KAUST team based their Earth system emulator on 35 years of hourly, 25-kilometre-resolution data from ERA5, the latest reanalysis of the global climate produced by the European Centre for Medium-Range Weather Forecasts (ECMWF). The team then trained their emulator on more than 54 million global locations at the ultra-high resolution of 3.5 km, yielding 477 billion hourly data points per year in their emulator model.
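As a rough consistency check on these figures (assuming a non-leap year of 8,760 hours and taking roughly 54.5 million locations), the annual data volume follows directly from the grid size:

\[
54.5 \times 10^{6}\ \text{locations} \times 8{,}760\ \text{hours/year} \approx 4.77 \times 10^{11}\ \text{values/year}
\]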
Smart math on a fitting GPU architecture
To incorporate this vast amount of data efficiently while simultaneously increasing the model’s resolution, the team used a mathematical representation known as spherical harmonics, which allowed them to selectively filter noise from the signal of interest.
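As a loose illustration of that idea, and not the team’s actual implementation, which operates at vastly larger scale, the sketch below expands a noisy global field in spherical harmonics up to a chosen maximum degree and reconstructs it from only those low-degree coefficients, which acts as a low-pass filter. It uses scipy.special.sph_harm on a coarse latitude-longitude grid.

```python
# Illustrative sketch only (not the KAUST code): truncating a spherical
# harmonic expansion at a maximum degree acts as a low-pass filter,
# separating large-scale signal from grid-scale noise.
import numpy as np
from scipy.special import sph_harm  # older SciPy API: sph_harm(m, n, theta, phi)

n_lat, n_lon, l_max = 64, 128, 8
phi = np.linspace(0.0, np.pi, n_lat)                            # colatitude
theta = np.linspace(0.0, 2.0 * np.pi, n_lon, endpoint=False)    # longitude
theta_g, phi_g = np.meshgrid(theta, phi)

# Smooth "signal" (a single degree-3 harmonic) plus random small-scale "noise".
rng = np.random.default_rng(0)
field = sph_harm(2, 3, theta_g, phi_g).real + 0.3 * rng.standard_normal(phi_g.shape)

# Project onto all harmonics up to degree l_max via quadrature on the sphere,
# then reconstruct: only the retained low-degree coefficients survive.
d_area = np.sin(phi_g) * (np.pi / n_lat) * (2.0 * np.pi / n_lon)
filtered = np.zeros_like(field)
for l in range(l_max + 1):
    for m in range(-l, l + 1):
        y_lm = sph_harm(m, l, theta_g, phi_g)
        coeff = np.sum(field * np.conj(y_lm) * d_area)
        filtered += (coeff * y_lm).real

print("residual (mostly noise) std:", np.std(field - filtered))
```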
Another mathematical procedure, the Cholesky factorization, was optimized by exploiting the range of precisions available in the Hopper GPUs of ‘Alps’ (the widest in the industry) to reduce computational costs and further enhance resolution while meeting accuracy requirements.
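The following hedged sketch conveys the general idea rather than the team’s actual GPU kernels: a blocked Cholesky factorization in which the numerically less sensitive trailing-matrix updates run in reduced precision (float32 here, standing in for the GPUs’ low-precision units), while the diagonal factorizations stay in double precision.

```python
# Hedged sketch of mixed-precision Cholesky (not KAUST's kernels): trailing
# updates in float32, diagonal block factorizations in float64.
import numpy as np

def mixed_precision_cholesky(a, block=64):
    a = a.copy()
    n = a.shape[0]
    for k in range(0, n, block):
        e = min(k + block, n)
        # Factorize the diagonal block in full precision.
        a[k:e, k:e] = np.linalg.cholesky(a[k:e, k:e])
        if e < n:
            # Panel solve: A21 <- A21 * L11^{-T} (full precision).
            l11 = a[k:e, k:e]
            a[e:, k:e] = np.linalg.solve(l11, a[e:, k:e].T).T
            # Trailing update in reduced precision: A22 <- A22 - A21 A21^T.
            a21 = a[e:, k:e].astype(np.float32)
            a[e:, e:] -= (a21 @ a21.T).astype(np.float64)
    return np.tril(a)

rng = np.random.default_rng(1)
x = rng.standard_normal((512, 512))
cov = x @ x.T + 512 * np.eye(512)          # symmetric positive definite test matrix
l = mixed_precision_cholesky(cov)
print("relative error:", np.linalg.norm(l @ l.T - cov) / np.linalg.norm(cov))
```

The printed relative error stays small even though the bulk of the arithmetic runs in lower precision, which is the trade-off the team exploited on a far larger scale.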
To benchmark the emulator’s scalability and portability, the team tested it at the full capacity of four high-performance computing (HPC) systems with different GPU architectures. On CSCS’s ‘Alps’, it achieved its fastest per-GPU speed, 93.8 teraflops, and an overall performance of 0.739 exaflops.
This record performance was enabled by the Grace Hopper Superchips, which are the first to support 8-bit floating-point numbers, making calculations much faster than with the 16-, 32-, or 64-bit units available on other chips. For applications that do not require the highest accuracy, using 8-bit floating point can dramatically enhance performance. This was true for the team’s emulator, which leveraged 8-bit precision for many operations in which higher precision would provide no mathematical advantage. “This allowed us to fully exploit the unique capabilities of ‘Alps’,” says Keyes. “For us, ‘Alps’ was a one-of-a-kind resource, and we were thrilled to achieve this unprecedented rate of computational performance. We are also deeply grateful for the exceptional and dedicated support provided by the CSCS team of software engineers.”
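To get a feel for what 8-bit precision means, the short sketch below (which assumes the ml_dtypes package is installed and is unrelated to the emulator’s code) rounds a set of values to the FP8 E4M3 format and to FP16, comparing storage per value and typical relative rounding error.

```python
# Rough illustration (requires the ml_dtypes package; not the emulator's code):
# 8-bit floats carry only a few significand bits, so each stored value is
# accurate to a few percent, which can be enough for operations whose inputs
# are themselves far less certain, while halving memory traffic versus 16-bit.
import numpy as np
import ml_dtypes

rng = np.random.default_rng(2)
x = rng.standard_normal(1_000_000).astype(np.float32)

for name, dtype in [("fp16", np.float16), ("fp8 e4m3", ml_dtypes.float8_e4m3fn)]:
    rounded = x.astype(dtype).astype(np.float32)
    rel = np.abs(rounded - x) / np.abs(x)
    print(f"{name:9s} bytes/value: {np.dtype(dtype).itemsize}  "
          f"median relative rounding error: {np.median(rel):.2e}")
```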
Emulators for the future
Keyes sees the prize as a sign of the growing importance of emulators in Earth system modelling. Once built on a supercomputer, such a statistical model can be queried on a standard laptop, making it ideal for addressing practical questions — from estimating temperature or rain for agricultural planning to assessing sunshine for solar energy potential and battery needs.
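A hedged sketch of what such a query can look like in the simplest Gaussian setting (illustrative only, not the KAUST software): once a mean field and a Cholesky factor of the covariance have been fitted and stored, generating a new synthetic realization is just a matrix-vector product that a laptop handles easily for a modest region.

```python
# Hedged sketch of querying a fitted statistical emulator (not the KAUST code):
# the expensive fitting and factorization happen once on a supercomputer; a new
# realization is then a cheap matrix-vector product.
import numpy as np

rng = np.random.default_rng(3)
n = 2_000                                    # grid points in a small region
mu = 15.0 + rng.standard_normal(n)           # stand-in for the fitted mean field
a = rng.standard_normal((n, n)) / np.sqrt(n)
l = np.linalg.cholesky(a @ a.T + np.eye(n))  # stand-in for the fitted Cholesky factor

sample = mu + l @ rng.standard_normal(n)     # one emulated field realization
print(f"emulated field: mean {sample.mean():.2f}, std {sample.std():.2f}")
```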
In addition, Keyes believes that his team’s exascale climate emulator holds significant potential for advancing climate research and supporting sensible climate policy making. “Ultimately, if there's one holdout for an optimistic future, it’s that technology can sometimes solve its own problems.”
The animation shows surface temperatures over 24 hours, generated by climate simulations (left) and by the new emulator (right).
(Image: Abdulah et al. (2024), arXiv: 2408.04440)
Reference:
S. Abdulah, A. H. Baker, G. Bosilca et al.: Boosting Earth System Model Outputs And Saving PetaBytes in their Storage Using Exascale Climate Emulators, SC24 (2024). arXiv: 2408.04440
The Climate Gordon Bell Prize
The Gordon Bell Prize for Climate Modelling is awarded annually to recognize the actual and potential impact of climate scientists and software engineers who apply high-performance computing to climate modelling, on that field, on related fields, and on wider society. The prize honours innovative parallel computing contributions toward solving the global climate crisis and carries an award of $10,000.