Periodic Reporting for period 1 - ESiWACE2 (Excellence in Simulation of Weather and Climate in Europe, Phase 2)
Reporting period: 2019-01-01 to 2020-06-30
Numerical weather prediction and climate modelling have always been highly dependent on the available computing power and on the ability to produce, store and analyse large amounts of simulated data. Increasing computational power is necessary to increase the achievable spatial resolution as well as the completeness and accuracy of the physical processes that the models can calculate and predict. Because of the enormous economic importance of weather and climate predictions, these simulations have for decades been routinely run on some of the most powerful supercomputers worldwide. With the transition to exascale computing, operational use of global storm-resolving models, i.e. models based on a very fine underlying mesh spanning the globe with grid spacings of only a few km, will become possible, finally allowing vertical energy transfers in the atmosphere to be resolved explicitly. This will mark a step change in the quality of weather and climate forecasting, but it also poses challenges that can only be tackled by a coordinated European effort. ESiWACE bundles and strengthens European activities to (1) enable leading European weather and climate models to leverage the performance of pre-exascale systems, with regard to both compute and data capacity, as soon as possible, and (2) prepare the weather and climate community to make use of exascale systems when they become available.
One success story is the delivery of a coupled atmosphere-ocean simulation at 5 km and an atmosphere-only simulation at 2.5 km with the German weather and climate model ICON. This demonstrates the feasibility of simulations with this flagship set-up of ESiWACE, although still at a lower throughput than the ultimately targeted one simulated year per day (SYPD). For IFS, the model of the European Centre for Medium-Range Weather Forecasts (ECMWF), we could demonstrate that it makes good use of the entire Piz Daint supercomputer, Europe's fastest system, by running at 1.45 km nominal resolution. The throughput is still below the one SYPD we aim for, but only by a factor of 5, not by orders of magnitude. The French NEMO model, which is used by many European weather and climate institutions and in several of our flagship set-ups, was scaled to use up to 44% of the MareNostrum 4 supercomputer in Barcelona. Finally, DYNAMICO, the heart of the next-generation French model system, has reached a throughput of 10 SYPD at a resolution of 12 km.
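Throughput figures of this kind follow from a simple relation: SYPD is the wall-clock seconds in a day divided by the wall-clock seconds needed to simulate one model year. A minimal sketch, with purely illustrative timings rather than measured project values:

```python
# Simulated-years-per-day (SYPD): how many years of model time a simulation
# advances per 24 h of wall-clock time. The timing used in the example run
# below is an illustrative placeholder, not a project measurement.

SECONDS_PER_DAY = 86_400

def sypd(wallclock_seconds_per_simulated_year: float) -> float:
    """Simulated years per wall-clock day."""
    return SECONDS_PER_DAY / wallclock_seconds_per_simulated_year

# Hypothetical run needing 5 wall-clock days per simulated year:
wallclock_per_year = 5 * SECONDS_PER_DAY
print(sypd(wallclock_per_year))  # 0.2 SYPD, i.e. a factor of 5 below 1 SYPD
```

A model at exactly the 1-SYPD target would need one full wall-clock day per simulated year; the factor-of-5 gap mentioned above corresponds to 0.2 SYPD.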
Taking a longer-term perspective, aimed especially at the true exascale systems that will be installed towards the end of the project in 2022/2023, we are preparing the models for the expected, increasingly heterogeneous architectures. We have advanced the emerging technology of domain-specific languages (DSLs), which sacrifice universality for more legible code that can be optimised automatically for different hardware architectures. One noteworthy success here is the extension of the PSyclone DSL, developed in the UK, to the point where it can handle the full NEMO ocean model. This allows the power of the DSL approach to be used to improve a full production-mode code. Similarly, the second advanced DSL in Europe, the Swiss DAWN, has been extended for use with the ICON model and has improved the GPU performance of the model. In addition, we have evaluated the applicability of new methods such as machine learning to weather and climate simulations.
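The core idea of the DSL approach can be illustrated with a toy example: the scientist writes a stencil computation once in a high-level form, and the toolchain generates backend-specific realisations. The sketch below is a generic illustration in Python; it is not the PSyclone or DAWN API, and all names in it are hypothetical.

```python
# Toy illustration of the DSL idea: one high-level stencil definition,
# multiple backend realisations. Purely illustrative — not the PSyclone
# or DAWN API; every name here is hypothetical.

# The "science" code: a 1-D three-point average stencil, written once.
stencil_spec = {
    "name": "smooth",
    "expr": lambda f, i: (f[i - 1] + f[i] + f[i + 1]) / 3.0,
    "halo": 1,  # stencil reaches one point to each side
}

def compile_stencil(spec, backend: str):
    """'Lower' the stencil for a given backend. A real DSL toolchain would
    emit optimised Fortran/C/CUDA here; this sketch just picks a loop
    strategy, keeping the semantics identical across backends."""
    expr, halo = spec["expr"], spec["halo"]
    if backend == "cpu":
        def run(field):
            out = list(field)
            for i in range(halo, len(field) - halo):
                out[i] = expr(field, i)
            return out
    elif backend == "gpu-like":
        def run(field):  # stands in for generated data-parallel GPU code
            n = len(field)
            return [expr(field, i) if halo <= i < n - halo else field[i]
                    for i in range(n)]
    else:
        raise ValueError(f"unknown backend {backend!r}")
    return run

smooth = compile_stencil(stencil_spec, "cpu")
print(smooth([0.0, 3.0, 6.0, 3.0, 0.0]))  # [0.0, 3.0, 4.0, 3.0, 0.0]
```

The point of the separation is that the stencil definition never changes when the target hardware does; only the lowering step is swapped, which is what makes automatic optimisation for heterogeneous architectures feasible.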
To extend our services to models beyond our flagship codes, we are reaching out to the wider scientific community by offering software support services. Together with our partners Atos and NLeSC, these services are offered in annual calls for applications. In the first round, we are helping four codes from Cyprus, Germany and the Netherlands to increase their efficiency on modern supercomputers. Additional support in the use of infrastructure components will start in autumn. We view these prototype services as a model for helping not just selected codes and the direct beneficiaries of a project, but for making a whole community exascale-ready.
In addition, ESiWACE offers workshops and training courses on topics that are relevant to weather and climate scientists preparing for the upcoming computer systems. Unfortunately, COVID-19 required some events to be held as virtual meetings. On the positive side, the virtual format increased participant capacity, since attendance no longer involved travel.
Cutting-edge applications of weather and climate models on exascale computers imply the production of extremely large data sets, i.e. exabytes of data. ESiWACE is developing an Earth System Data Middleware (ESDM) in a collaboration of universities, compute centres and storage vendors to store huge amounts of data efficiently without burdening the user with technical details. This is complemented by extensions for post-processing, analysis and visualisation (PAV). A first demonstrator shows how ESDM-PAV will enable seamless analysis and visualisation of simulation results, even in situ while the model is running.
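The middleware idea of shielding the user from technical details can be sketched as follows: the model issues one logical write call, and the placement of the bytes on concrete storage backends is decided behind the scenes. This is a hypothetical interface for illustration only, not the actual ESDM API; all class and method names are invented.

```python
# Conceptual sketch of a storage-middleware layer: the model writes logical
# variables, and the middleware decides where and how the bytes land.
# Hypothetical interface for illustration — NOT the actual ESDM API.

class Backend:
    """A concrete storage target (e.g. a parallel file system or object store)."""
    def __init__(self, name: str):
        self.name = name
        self.objects: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self.objects[key] = data

class Middleware:
    """Routes logical writes to backends without exposing the details."""
    def __init__(self, backends: list[Backend]):
        self.backends = backends

    def write(self, variable: str, timestep: int, data: bytes) -> str:
        # Deliberately simple placement policy for the sketch:
        # spread successive timesteps round-robin across backends.
        target = self.backends[timestep % len(self.backends)]
        target.put(f"{variable}/t{timestep}", data)
        return target.name  # returned only so the sketch is inspectable

mw = Middleware([Backend("parallel-fs"), Backend("object-store")])
print(mw.write("temperature", 0, b"..."))  # parallel-fs
print(mw.write("temperature", 1, b"..."))  # object-store
```

The design point is that the model-facing call names only the logical variable and timestep; swapping or adding storage technologies changes the placement policy, not the model code.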
By the end of the project, we expect to provide scientifically useful, world-leading applications on the EuroHPC supercomputers existing by then, and to contribute to initiating follow-on projects and initiatives that exploit, consolidate and advance these applications.