CORDIS - EU research results

What poor information can tell: Analysis of climate policies under large uncertainty about climate change

Final Activity Report Summary - WIT (What Poor Information Can Tell: Analysis of Climate Policies under Large Uncertainty about Climate Change)

Uncertainty is pervasive in the assessment of climate change, its potential impacts and mitigation options. An adequate handling of this large uncertainty is a key prerequisite for the validity of climate policy analysis. The research undertaken in this project aimed at improving our understanding of how to analyse climate policy under uncertainty. Since the challenge was too far-reaching to allow for a single best answer, the handling of uncertainty had to be broken down into several questions: how precisely the uncertainties need to be quantified to support a meaningful decision analysis (the normative perspective), how well they can be quantified in real-world applications (the descriptive perspective), and how promising decision-making frameworks can be implemented in integrated assessments of climate change and climate policy.

Concerning the normative question, the project focussed on the idea that concepts weaker than classical probability, i.e. imprecise probability or ambiguity, might be admissible in a decision analysis. A high-level seminar was organised on this topic, which also covered the widespread use of scenarios for climate policy analysis. The discussion highlighted the difficulties of subjective, i.e. Bayesian, probabilities in providing an adequate uncertainty representation in the climate context, and it was pointed out that allowing for ambiguity, i.e. imprecise beliefs, or using scenarios could alleviate these difficulties.

Nevertheless, both alternatives were also criticised. The fellow focussed his research on a framework combining Bayesian probability with imprecise probability, representing ambiguity, in the process of updating beliefs with new information. In this framework, the ambiguity component served as an additional indicator of the discrepancy between beliefs and data, i.e. ambiguity increased with discrepancy. This helped to avoid overconfidence in the uncertainty model that might emerge when not all possibilities are accounted for in the prior probability assessment. The work was published in the International Journal of Approximate Reasoning, volume 50, pages 583-596, in 2009.
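The role of the ambiguity component can be illustrated with a toy update rule. The sketch below is not the published framework; it is a minimal, epsilon-contamination-style example, with a hypothetical `imprecise_update` function, in which the width of the posterior probability interval grows with the discrepancy between the prior belief and the observed frequency:

```python
def imprecise_update(prior_a, prior_b, successes, trials):
    """Illustrative Bayes update whose ambiguity grows with the
    prior-data discrepancy (epsilon-contamination style sketch)."""
    prior_mean = prior_a / (prior_a + prior_b)
    sample_freq = successes / trials
    # Discrepancy between prior belief and observed frequency, in [0, 1]
    discrepancy = abs(prior_mean - sample_freq)
    # Ambiguity weight: larger discrepancy -> wider probability interval
    eps = discrepancy
    # Standard Beta-Binomial posterior mean
    post_mean = (prior_a + successes) / (prior_a + prior_b + trials)
    # Contaminate the precise posterior with a vacuous (know-nothing) belief
    lower = (1 - eps) * post_mean
    upper = (1 - eps) * post_mean + eps
    return lower, upper
```

When the data match the prior, the interval collapses to the precise Bayesian posterior; surprising data widen it, signalling that the prior model may have missed relevant possibilities.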

On a side track, the fellow also investigated the scenario approach in collaboration with a PhD student, Vanessa Schweizer, at Carnegie Mellon University. He and Schweizer characterised the space of future emission scenarios with a small set of descriptors and, based on statements in the Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES), assessed the internal consistency of scenarios over the entire uncertainty space. The work showed that scenario analysis could be used to systematically explore the uncertainty space and to provide an indication regarding the scenarios' plausibility. In the particular example, highly consistent coal-powered growth scenarios could be identified which were not included in the SRES.
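As a rough illustration of this kind of systematic exploration (not the actual method, descriptors or judgements of the study), one can enumerate all combinations of a few hypothetical scenario descriptors and rank them by pairwise consistency scores:

```python
from itertools import combinations, product

# Hypothetical descriptors and levels, for illustration only
descriptors = {
    "population": ["low", "high"],
    "gdp_growth": ["low", "high"],
    "coal_use": ["low", "high"],
}

# Hypothetical pairwise judgements: +1 consistent, -1 inconsistent;
# unrated pairs count as neutral (0)
consistency = {
    (("gdp_growth", "high"), ("coal_use", "high")): 1,
    (("population", "high"), ("gdp_growth", "low")): -1,
}

def score(scenario, judgements):
    """Sum the pairwise consistency judgements over all descriptor pairs."""
    total = 0
    for a, b in combinations(sorted(scenario.items()), 2):
        total += judgements.get((a, b), 0) + judgements.get((b, a), 0)
    return total

def rank_scenarios(desc, judgements):
    """Enumerate every descriptor combination and rank by consistency."""
    names = list(desc)
    all_scenarios = [dict(zip(names, levels))
                     for levels in product(*desc.values())]
    return sorted(all_scenarios,
                  key=lambda s: score(s, judgements), reverse=True)
```

Even this toy version shows the point of the approach: the full uncertainty space is covered exhaustively, so internally consistent combinations surface whether or not anyone thought to write them down as scenarios beforehand.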

Turning to the next question, to what extent real world uncertainty might be quantifiable for climate policy analysis, the project focussed particularly on uncertainties regarding future events beyond the collective experience of humankind. As pointed out above, the epistemic nature of these events strongly impeded a probabilistic treatment. The question was whether imprecise probabilities could help. The increasing attention to tipping points in the climate system provided a perfect application to investigate this question. The assessment of the likelihood of crossing such tipping points under global warming largely defied quantification, even though the potentially large socioeconomic impacts of such events were a source of concern. The fellow, in collaboration with Prof. Jim Hall from Newcastle University, prepared a questionnaire for eliciting imprecise probabilities of crossing prominent potential tipping points in the climate system, concerning the Atlantic circulation, the Greenland and West Antarctic ice sheets, the Amazon rainforest and El Niño, and distributed it to experts in the field.

The results showed that a significant fraction of researchers were willing to answer probability questions in this format, and most took the opportunity to include significant ambiguity in their statements about these highly uncertain events. The results also indicated that the large uncertainty among experts did not imply that such events were considered to be remote. The study, published in the Proceedings of the National Academy of Sciences (PNAS), volume 106(13), pages 5041-5046, in 2009, showed lower probability bounds for triggering major changes in the climate system that were in many cases considerably higher than the probability allocated to catastrophic events in current climate damage assessments. Hence, the empirical evidence from this exercise seemed to suggest that allowing for imprecise probabilities and ambiguity could actually help to quantify deep uncertainties. However, given the patchy results from the normative branch of the research, it remained a challenge to employ satisfactory decision frameworks under ambiguity that could generate sensible climate policy recommendations from such expert elicitations.
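How such elicited probability intervals might be summarised can be sketched in a few lines. The aggregation below (interval envelope plus mean width as a crude ambiguity measure) is a generic illustration, not the analysis performed in the PNAS study:

```python
def aggregate_intervals(intervals):
    """Summarise expert probability intervals [lower, upper]:
    - the envelope spanning all experts' statements, and
    - the mean interval width as a crude measure of elicited ambiguity."""
    lower = min(l for l, _ in intervals)
    upper = max(u for _, u in intervals)
    mean_width = sum(u - l for l, u in intervals) / len(intervals)
    return lower, upper, mean_width

# Hypothetical elicited intervals for one tipping event (made-up numbers)
example = [(0.05, 0.30), (0.10, 0.60), (0.20, 0.50)]
```

The lower envelope bound is the kind of quantity the study compared against the catastrophe probabilities embedded in climate damage assessments: even the most conservative expert statements can imply a non-negligible probability of a major change.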

Finally, the research focussed on implementing suitable decision making frameworks under uncertainty in integrated assessments of climate policies with global coupled climate-economy models. This work was performed by the fellow in close collaboration with Dr Hermann Held and two PhD students, Matthias Schmidt and Alexander Lorenz, at the Potsdam Institute for Climate Impact Research (PIK). While a significant body of literature already existed regarding cost-benefit analysis under uncertainty, the problem of minimising the costs of achieving a climate target under uncertainty had received less attention. Held and the fellow formulated the problem as welfare maximisation under a so-called chance constraint, expressed as a minimum probability of reaching a given climate target, and developed an algorithm to impose this decision criterion on a model of economic growth and endogenous technological change with uncertainty in climate and technology parameters.
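The logic of a chance constraint can be illustrated with a toy Monte Carlo sketch. The climate response, sensitivity distribution and the `cheapest_abatement` search below are all hypothetical stand-ins, not the coupled climate-economy model used at PIK; with costs monotone in abatement, the smallest feasible abatement level proxies the least-cost policy:

```python
import random

def warming(abatement, sensitivity):
    # Toy climate response for illustration: warming scales with an
    # uncertain sensitivity and falls linearly with abatement effort
    return sensitivity * (1.0 - abatement)

def cheapest_abatement(alpha=0.9, target=2.0, n=20000, seed=0):
    """Smallest abatement level satisfying the chance constraint
    P(warming <= target) >= alpha under sampled sensitivity values."""
    rng = random.Random(seed)
    # Hypothetical right-skewed distribution for climate sensitivity
    draws = [rng.lognormvariate(1.0, 0.4) for _ in range(n)]
    for a in (i / 100 for i in range(101)):
        p = sum(warming(a, s) <= target for s in draws) / n
        if p >= alpha:
            return a, p
    return 1.0, 1.0
```

Even in this toy setting, the heavy upper tail of the sensitivity distribution drives the result: demanding a higher probability of staying below the target forces the policy to hedge against increasingly unlikely but severe outcomes, which is why very high probability levels become costly or unattainable.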

Their analysis showed that stringent mitigation strategies could not guarantee a very high probability of limiting warming to 2 degrees Celsius under current uncertainty about climate sensitivity and climate response time scale (volume 31(1), pages S50-S61, 2009). While the framework seemed suitable for the static case, further work spearheaded by Schmidt and Lorenz revealed the possibility of rejecting costless information once future learning about climatic parameters was taken into account. This counterintuitive result reflected conceptual flaws of cost minimisation under chance constraints. It turned out that:
1. suitable decision criteria avoiding such flaws had to include an explicit trade-off between economic cost and climate targets; and that
2. the special case of maximising the imprecise probability of jointly observing an economic and climate target was the best candidate for extending the analysis to the case of ambiguity.
The latter decision criterion was already studied by the fellow in an earlier publication (E. Kriegler, H. Held, T. Bruckner, 2007, 'Climate protection strategies under ambiguity about catastrophic consequences', included in J. Kropp, J. Scheffran (eds): Advanced Methods for Decision Making and Risk Management in Sustainability Science, Nova Science Publ. Inc., New York, pp. 3-42).