An Information Theoretic Approach to Improving the Reliability of Weather and Climate Simulations

Periodic Reporting for period 3 - ITHACA (An Information Theoretic Approach to Improving the Reliability of Weather and Climate Simulations)

Reporting period: 2020-10-01 to 2022-03-31

There is no doubt that climate change is a severe threat to society and that rising temperatures worldwide are caused by our emissions of greenhouse gases. However, there remain profound uncertainties about climate change, not least what its manifestation will be at the regional level. To reduce these uncertainties we need to run our climate models at much higher resolutions than we currently can. The recently launched EU Digital Europe project “Destination Earth” is committed to developing a new generation of high-resolution climate models. Such models become conceivable as we move towards exascale computing. However, such computers alone are not sufficient. We also need to increase the computational efficiency of our top-of-the-range models to allow ensemble predictions on multi-decadal timescales using models with km-scale grids. On top of this, we need to find efficient ways of storing the data produced by km-scale models. The ITHACA project seeks to do this.

Developing accurate climate models is vital for the well-being of society:

1. Mitigation. We need to know how rapidly our climate will change if we take no mitigating action, so that we can plan appropriate mitigating actions. Tipping points are a specific example of considerable current uncertainty: mitigating action taken after a tipping point has been reached will be ineffective.
2. Adaptation. It is vital that society worldwide becomes more resilient to our changing climate. However, there are currently considerable uncertainties in regional climate change. Many countries do not know whether they should be adapting to a hotter, drier climate or to a stormier, wetter one.
3. Geoengineering. It is sometimes suggested that spraying aerosols (or their precursors) into the stratosphere could help offset climate change. However, at the regional level we do not know whether this would impact negatively on climate. Could geoengineering cause the monsoons to shut off? Could it reduce the supply of moisture to the rainforests, turning them into a carbon source for the atmosphere?

Current-generation climate models are not good enough for the tasks required of them. To make such high-resolution models feasible with the dawn of exascale computing, we need to look carefully at how current models are formulated and ask whether computer resources are being used effectively and efficiently. For example, the actual sustained speed of a contemporary model as it executes on a supercomputer is only a few percent of the supercomputer's peak speed. It is as if over 95% of the machine is being wasted.

The key question posed in ITHACA is this: what is the information content in the billions of bits that make up a model's state? How many bits contain useful information, and how many are effectively just noise? Can we reformulate our models to retain only the bits that contain useful information? The latter question is vital when we consider the amount of data that needs to be stored from a high-resolution model.
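The notion of useful versus noisy bits can be made concrete with a small sketch (our illustration, not the project's actual diagnostic): estimate the Shannon entropy of each bit position across a float32 field. Bits that never flip carry no information, while trailing mantissa bits dominated by noise flip essentially at random.

```python
import numpy as np

# Entropy of each of the 32 bit positions in a float32 field (illustrative).
rng = np.random.default_rng(0)
# A smooth "physical" signal with small-amplitude noise in the low bits
x = np.sin(np.linspace(0, 4 * np.pi, 10_000)) + 1e-6 * rng.standard_normal(10_000)
# Big-endian bytes so bit 0 is the sign, bits 1-8 the exponent, 9-31 the mantissa
bits = np.unpackbits(x.astype('>f4').view(np.uint8).reshape(-1, 4),
                     axis=1, bitorder='big')
p1 = bits.mean(axis=0)                           # probability each bit is 1
with np.errstate(divide='ignore', invalid='ignore'):
    entropy = -(p1 * np.log2(p1) + (1 - p1) * np.log2(1 - p1))
entropy = np.nan_to_num(entropy)                 # bits that never flip: 0 bits
print(np.round(entropy, 2))
```

In a field like this the leading exponent bits have near-zero entropy (they are predictable), while the trailing mantissa bits have entropy near one yet encode only the added noise; those are the candidates for discarding.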
One of the most important practical outcomes of ITHACA is the work we did with the European Centre for Medium-Range Weather Forecasts (ECMWF) showing that the representation of real numbers in the model (several billion of them) can be reduced from 64 to 32 bits without degrading forecast accuracy, while saving roughly 40% in run time.
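The kind of trade-off involved can be sketched with a toy calculation (ours, not the IFS code itself): the same diffusion step run in 64-bit and 32-bit arithmetic. The 32-bit state occupies half the memory, while the two solutions differ only at round-off level.

```python
import numpy as np

# Toy 1-D diffusion step run in float64 and float32 for comparison.
def diffuse(u, nu=0.1, steps=200):
    for _ in range(steps):
        u = u + nu * (np.roll(u, 1) - 2 * u + np.roll(u, -1))
    return u

u0 = np.sin(np.linspace(0, 2 * np.pi, 256, endpoint=False))
u64 = diffuse(u0.astype(np.float64))
u32 = diffuse(u0.astype(np.float32))
print(u32.nbytes, u64.nbytes)                 # 32-bit state is half the size
print(float(np.abs(u64 - u32).max()))         # difference at round-off level
```

On bandwidth-bound codes, much of the run-time saving comes from moving half as many bytes through the memory system rather than from the arithmetic itself.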

Following the plan in the ITHACA proposal, ECMWF have reinvested this saved computer time to increase the vertical resolution of the model, which has resulted in an increase in forecast skill with no increase in computational cost (the PI believes this to be the first time this has happened in the history of numerical weather prediction!). An important paper has been published on these results.

Improved weather forecasts save lives around the world. Importantly, the Anticipatory Action programmes of disaster relief agencies require the production of reliable probabilistic forecasts. The research performed under ITHACA has had a real impact in this important humanitarian area.

We have now published a paper, based on work performed entirely within ITHACA, showing that a somewhat simplified version of the ECMWF (spectral) operational forecast model can be run successfully with 16-bit floating-point precision.

We have studied computationally demanding parametrisations (such as the radiative transfer algorithms) and have shown how these can be solved accurately using 16-bit numerics.
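As a hedged illustration of the kind of care this takes (a toy Beer-Lambert attenuation, not the project's radiative transfer scheme): per-layer transmittances near one sit in a well-resolved part of the float16 range, so the 16-bit result stays close to a 64-bit reference.

```python
import numpy as np

# Layer-by-layer Beer-Lambert attenuation evaluated entirely in float16.
tau = np.full(50, 0.02, dtype=np.float16)          # per-layer optical depth
trans16 = np.exp(-tau)                             # float16 transmittance per layer
total16 = np.prod(trans16)                         # product accumulated in float16
total64 = np.exp(-np.sum(tau.astype(np.float64)))  # 64-bit reference
print(float(total16), float(total64))              # agree to about two decimal places
```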

We have also submitted a paper showing how 16-bit numerics can be implemented in the dynamical core of grid-point-based models without loss of accuracy.
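A minimal sketch of the idea (a one-dimensional upwind advection step, not the scheme in the submitted paper): stepping the dynamics in float16 while keeping the state as an order-one perturbation holds the error at float16 round-off level.

```python
import numpy as np

# 1-D upwind advection stepped in float16 versus float64.
def advect(u, c=np.float16(0.5), steps=100):
    for _ in range(steps):
        u = u - c * (u - np.roll(u, 1))       # upwind difference, CFL = 0.5
    return u

u0 = np.exp(-np.linspace(-3, 3, 128) ** 2)    # Gaussian, order-one amplitude
u16 = advect(u0.astype(np.float16))
u64 = advect(u0.astype(np.float64))
print(float(np.abs(u16 - u64).max()))         # small compared with the signal
```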

We have published papers discussing the use of reduced precision in data assimilation systems. These systems produce initial conditions from sets of observations and are typically based on minimisation algorithms, which can be sensitive to numerical precision. However, we have shown that, provided certain algorithmic safeguards are in place – e.g. ensuring the strict orthogonality of quasi-independent perturbations – our reduced-precision programme also works for data assimilation.
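A sketch of such a safeguard (variable names are ours, not the ITHACA code): an orthonormal set of perturbations rounded to float16 drifts measurably from orthogonality, and a QR re-orthogonalisation in higher precision restores it.

```python
import numpy as np

# Orthonormal perturbations stored in float16 lose strict orthogonality.
rng = np.random.default_rng(1)
q, _ = np.linalg.qr(rng.standard_normal((100, 10)))
pert = q.astype(np.float16)                          # 16-bit stored perturbations
gram = pert.astype(np.float64).T @ pert.astype(np.float64)
print(float(np.abs(gram - np.eye(10)).max()))        # drift from orthogonality

# Safeguard: re-orthogonalise in higher precision before use.
q_fixed, _ = np.linalg.qr(pert.astype(np.float64))
gram_fixed = q_fixed.T @ q_fixed
print(float(np.abs(gram_fixed - np.eye(10)).max()))  # back to round-off level
```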

In recent years, supercomputers capable of fast 16-bit arithmetic have been developed, driven by the AI market. In ITHACA we have studied how 16-bit AI techniques can be used to downscale the output of weather and climate models, thereby increasing their effective resolution. The PI wrote a paper on the “future of numerical weather prediction” in which he speculated on AI-based downscaling replacing limited-area modelling. This was incorporated into a UN World Meteorological Organization White Paper, a shortened version of which has been submitted for publication. The ITHACA team has recently submitted a paper on the use of 16-bit AI methods for downscaling.

The other big area of our research is the design of more efficient data storage methods that make use of the information content in the data. Modern weather and climate ensemble forecast systems have high resolution and relatively large ensemble sizes, and storing their output is becoming very problematic as datasets grow. Using concepts of information content, we have developed new dynamic methods for compressing data. A paper has appeared in Nature Computational Science (and was highlighted with a News and Views commentary).
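The idea behind such methods can be sketched as follows (a simplified illustration, not the published algorithm): round each value to the number of significant mantissa bits that carry real information, then hand the result to a standard lossless compressor.

```python
import zlib

import numpy as np

def round_significant_bits(a, keep):
    """Zero out all but `keep` mantissa bits of a float32 array (round-to-zero)."""
    mask = np.uint32(0xFFFFFFFF) << np.uint32(23 - keep)
    return (a.view(np.uint32) & mask).view(np.float32)

# A smooth field with noise in the trailing bits, like real model output.
field = (np.sin(np.linspace(0, 8 * np.pi, 100_000))
         + 1e-5 * np.random.default_rng(0).standard_normal(100_000)).astype(np.float32)
raw = zlib.compress(field.tobytes())
rounded = round_significant_bits(field, keep=7)   # keep about 2 decimal digits
small = zlib.compress(rounded.tobytes())
print(len(raw), len(small))                       # rounded version is much smaller
```

The rounding stage is lossy but bounded (here below one percent of each value), while the lossless stage exploits the long runs of zeroed bits.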

ITHACA is leading worldwide work on low-precision numerics for high-resolution models. Our aim for the end of this project is for this work to be incorporated into some of the leading weather and climate prediction models in the world. We are working closely with the EU Destination Earth project. We expect that a next-generation weather and climate prediction model will judiciously combine single- and half-precision numerics, with much of the parametrisation and Earth-System complexity represented by 16-bit AI. In this way, ITHACA will make a major contribution to the development of very high-resolution global Earth-System models, the need for which has been articulated above.

Although many of the results from ITHACA will be demonstrated using software emulation of low-precision arithmetic, for the results to have full value it is important that they can also be demonstrated on real low-precision hardware.

We will be working hard over the final months of the project to ensure that all results are fully documented in the scientific literature.
Related publication: “Shallow water simulations in 16-bit arithmetics: Alternative number formats to floats”.