CORDIS - EU research results

An Information Theoretic Approach to Improving the Reliability of Weather and Climate Simulations

Periodic Reporting for period 4 - ITHACA (An Information Theoretic Approach to Improving the Reliability of Weather and Climate Simulations)

Reporting period: 2022-04-01 to 2023-09-30

Climate change is widely seen as one of the most important threats to humanity. As a result of our emissions of greenhouse gases, global temperatures are rising. However, the implications of our carbon emissions for regional changes in weather are poorly understood.

In a paper published in the Proceedings of the National Academy of Sciences, I argued that we need to increase the resolution of our climate models substantially (i.e. reduce the size of the basic model grid boxes) before we can be confident in predicting regional climate change, particularly in relation to extreme events.

Clearly, to increase the resolution of climate models requires enhanced computational capability. However, this alone is not enough: we additionally need to improve substantially the computational efficiency of our climate models.

Improving the computational efficiency of weather and climate models is the primary goal of ITHACA.



Developing high-resolution weather and climate predictions is vital for a number of reasons.

Firstly, for weather prediction, it enables more reliable forecasts of extreme weather events.

Secondly, if society is to adapt to climate change, we need to know whether it will primarily be adapting to hotter and drier conditions on the one hand, or to wetter and stormier conditions on the other. Knowing this will determine the type of infrastructure investments needed to make society more resilient to climate change. Unfortunately, the IPCC reports show that for most regions of the world, even the sign of annual-mean precipitation change is profoundly uncertain.

Thirdly, if we are ever to consider climate geoengineering, e.g. spraying aerosols in the stratosphere to reflect sunlight, we have to be sure we are not making the climate worse in some parts of the world.

Fourthly, the emerging field of “loss and damage” requires some kind of quantitative attribution of specific weather events to human greenhouse gas emissions.

Finally, we need to understand the climate system better. For example, are there significant tipping points in the climate system and are our emissions likely to take us over a tipping point in the coming years?


In ITHACA we have studied three ways of improving the computational efficiency of our climate models: a) reducing the numerical precision of the model variables from 64 bits – the historical default – to 32, and even 16, bits; b) using quantum computers, which could in principle improve computational speed exponentially; c) using AI methods to represent parts of the climate model.
One of the most important results of this project came from working with the European Centre for Medium-Range Weather Forecasts (ECMWF) to reduce the precision of their operational forecast model from 64 to 32 bits. In collaboration with ECMWF colleagues, it was shown that the same level of forecast skill could be obtained with a 40% reduction in compute run time. This saving in run time was used to increase the vertical resolution of the model, which in turn increased forecast skill. Hence we were able to demonstrate the aphorism suggested in the ITHACA proposal: “more accuracy with less precision”.
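As a toy illustration of the kind of precision reduction involved (a generic sketch in Python/NumPy, not the ECMWF forecast model itself), the snippet below runs the same simple advection calculation in 64-bit and 32-bit arithmetic and compares the results; for smooth problems like this, the precision-induced differences are far smaller than the signal of interest.

```python
# A minimal sketch (not the ECMWF IFS): run a toy 1-D upwind advection
# calculation in float64 and float32 and compare the results, to illustrate
# that single precision often reproduces double precision to high accuracy.
import numpy as np

def advect(u, c=0.2, steps=1000):
    """First-order upwind advection of field u with Courant number c."""
    for _ in range(steps):
        u = u - c * (u - np.roll(u, 1))
    return u

x = np.linspace(0.0, 2.0 * np.pi, 256, endpoint=False)
u0 = np.sin(x) + 0.1 * np.sin(5 * x)          # smooth initial condition

u64 = advect(u0.astype(np.float64))
u32 = advect(u0.astype(np.float32))

print("max |float64 - float32| :", np.max(np.abs(u64 - u32.astype(np.float64))))
print("max |signal|            :", np.max(np.abs(u64)))
```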

With the development of AI and GPU architectures, we have looked at the extent to which numerical precision can be reduced further. Using emulators of 16-bit precision, we have shown that further reductions are possible if reduced precision is combined with so-called stochastic rounding schemes. Here a number is not rounded deterministically to the nearest representable number; instead, it is rounded probabilistically. Stochastic rounding is now being implemented in hardware, e.g. by Graphcore and NVIDIA. This bodes well for further reductions in numerical precision.
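The idea can be illustrated with a short sketch (a generic software emulation with an assumed rounding quantum of 2^-10, not the hardware implementations mentioned above): a value lying between two representable numbers is rounded up with probability proportional to its distance from the lower neighbour, so that the rounding is unbiased on average.

```python
# A minimal software emulation of stochastic rounding (assumed rounding
# quantum of 2**-10; not a hardware implementation).
import numpy as np

rng = np.random.default_rng(0)

def stochastic_round(x, ulp=2.0**-10):
    """Round x to multiples of `ulp`, probabilistically rather than to nearest."""
    scaled = np.asarray(x) / ulp
    lower = np.floor(scaled)
    frac = scaled - lower                      # distance to the lower neighbour
    round_up = rng.random(lower.shape) < frac  # round up with probability `frac`
    return (lower + round_up) * ulp

x = 0.1                                        # not a multiple of 2**-10
samples = stochastic_round(np.full(100_000, x))

print("round-to-nearest :", np.round(x / 2.0**-10) * 2.0**-10)
print("stochastic mean  :", samples.mean())    # close to 0.1, i.e. unbiased
```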

In addition to running models at low precision, we have shown how information theory can be used to determine the optimum precision (i.e. optimum compression) at which data should be stored – not so precise that it wastes resources, and not so imprecise that vital information is lost.
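One way to make this idea concrete is sketched below (assumptions: a synthetic 1-D float32 field and a simple neighbour-based estimate; an illustration of the principle only, not the exact analysis used in the project). For each bit of the floating-point encoding, the mutual information between that bit at one grid point and at the neighbouring point is estimated; bits with near-zero mutual information behave like noise and need not be retained when the data are archived.

```python
# A minimal sketch of the "real information" idea (synthetic data, simple
# neighbour-based estimate; an illustration of the principle only): estimate,
# for each bit of the float32 encoding, the mutual information between that
# bit at a grid point and the same bit at the neighbouring grid point.
import numpy as np

def bit_plane(field, bit):
    """Extract bit `bit` (0 = least significant) of each float32 value."""
    raw = np.ascontiguousarray(field, dtype=np.float32).view(np.uint32)
    return (raw >> bit) & 1

def mutual_information(a, b):
    """Mutual information (in bits) between two binary sequences."""
    joint = np.histogram2d(a, b, bins=2)[0] / a.size
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

rng = np.random.default_rng(1)
x = np.linspace(0.0, 8.0 * np.pi, 100_000)
field = np.sin(x) + 1e-4 * rng.standard_normal(x.size)   # smooth signal + noise

for bit in (31, 23, 16, 8, 0):   # sign, exponent and mantissa bit positions
    plane = bit_plane(field, bit)
    mi = mutual_information(plane[:-1], plane[1:])
    print(f"bit {bit:2d}: mutual information with neighbour = {mi:.3f} bits")
```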

One of the first workshops on how AI can be exploited in weather and climate modelling (with the proceedings published in Phil. Trans. Roy. Soc.) was run under ITHACA. Building on this, we have studied how low-precision AI can be used in weather and climate models, in particular: a) the development of parametrisation schemes, such as gravity wave drag, based on AI trained on output from complex models (a toy sketch is given below); b) the use of low-precision AI to downscale coarse-resolution climate models to postcode scale.
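The parametrisation-emulation idea in a) can be sketched as follows (a hypothetical toy example: the `expensive_scheme` function and all sizes are invented for illustration and do not correspond to the actual gravity-wave-drag emulator). A cheap neural network is trained on input/output pairs sampled from an expensive scheme and then replaces it at inference time; in practice such an emulator could be run at reduced (e.g. 16-bit) precision on AI hardware.

```python
# A hypothetical toy emulator (names and sizes invented for illustration):
# train a small neural network on input/output pairs from an "expensive"
# parametrisation scheme, then use the network as a cheap replacement.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)

def expensive_scheme(profiles):
    """Stand-in for a costly physical parametrisation (toy nonlinearity)."""
    weights = np.linspace(0.1, 1.0, profiles.shape[1])
    return np.tanh(profiles @ weights) ** 2

# Training set: input "profiles" and the scheme's output (the emulation target).
X_train = rng.standard_normal((20_000, 10))
y_train = expensive_scheme(X_train)

emulator = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
emulator.fit(X_train, y_train)

# The trained network replaces the scheme at inference time.
X_test = rng.standard_normal((1_000, 10))
error = np.abs(emulator.predict(X_test) - expensive_scheme(X_test)).mean()
print("mean absolute emulation error:", error)
```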

Finally, we have studied whether quantum technology can be used to speed up weather and climate models. Here we have found a fundamental obstacle: the measurement process means that with (say) a 1,000-qubit computer, only 1,000 classical bits can be output per measurement. This is far too little output to be useful for operational weather and climate prediction.
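The readout bottleneck can be made concrete with a back-of-envelope count (the forecast-size figure in the comment below is an order-of-magnitude assumption, not a project result):

```python
# Back-of-envelope count of the readout bottleneck: n qubits span 2**n
# amplitudes internally, but one measurement yields only n classical bits.
n_qubits = 1_000
state_space_dimension = 2 ** n_qubits    # amplitudes evolved internally
bits_per_measurement = n_qubits          # classical bits obtained on readout

print(f"state-space dimension : ~10^{len(str(state_space_dimension)) - 1}")
print(f"bits read out per shot: {bits_per_measurement}")
# An operational global forecast state contains on the order of billions of
# numbers (an order-of-magnitude assumption), vastly more than 1,000 bits
# per measurement can deliver.
```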

The results from ITHACA are feeding forward into new projects, such as a World Food Programme/Google project to downscale ECMWF forecasts on timescales of days to seasons over East Africa, and EVE, where we will be working with NVIDIA to develop a new generation of ultra-high-resolution climate models.
The key results can be summarised as follows.

Firstly, as demonstrated explicitly by ECMWF’s adoption of ITHACA’s 32-bit analysis for its operational forecasts, operational weather and climate models can run satisfactorily at 32 bits. It should also be possible to run at lower precision, notably at 16 bits, using stochastic rounding algorithms. Such work can become operational once computer chip manufacturers include stochastic rounding on chip; this is likely to become available from a major chip manufacturer within a few years.

Secondly, low-precision AI can play a major role in future weather and climate models: by expressing model parametrisations as neural networks trained on the output of those parametrisations, and by downscaling with Generative Adversarial Networks instead of conventional Limited Area Models.

Thirdly, information-theoretic analysis shows that significant savings can be made by compressing archived model data into low-precision formats. The size of data archives is becoming a major problem as model resolution increases.

Fourthly, it is unlikely that quantum technology will play a significant role in accelerating weather and climate model code. The fundamental problem is that the amount of data that can be output by a quantum computer is limited (logarithmically) by the collapse of the wavefunction during measurement.

Plans are being made to develop next-generation km-scale climate models. Results from ITHACA will guide the development of such models and ensure that the most benefit is obtained from the (exascale) computing needed to run them.
Figure caption: Shallow water simulations in 16-bit arithmetic: alternative number formats to floats