
EUropean CLimate and weather Events: Interpretation and Attribution

Final Report Summary - EUCLEIA (EUropean CLimate and weather Events: Interpretation and Attribution)

Executive Summary:
Climate change is expected to affect extreme weather in Europe. Recent heatwaves, floods and droughts have demonstrated the vulnerability of European citizens to extreme weather, and there is a clear need to adapt effectively to climate change. However, scientifically robust information about the extent to which recent extreme weather can be linked to climate variability and change is often lacking. There is therefore a clear need to develop better information on weather and climate risks as part of the operational capacities of Copernicus in the context of climate change.
EUCLEIA has developed the science of attribution of extreme weather and climate events. It has demonstrated the potential capability of future operational attribution systems to produce reliable and user-relevant attribution assessments of recent extreme events. And EUCLEIA has worked with targeted stakeholder groups to better understand the potential uses of attribution information for a range of sectors.

The EUCLEIA project has had a number of important achievements. It has developed a much clearer understanding of the possible user needs for a future event attribution service across a range of different sectors. Substantial progress has been made in developing experimental designs for attribution assessments. New methods for event attribution have been developed, permitting an improved understanding of the changing risk of extreme events. EUCLEIA has developed techniques for assessing the reliability of event attribution assessments. It has made substantial input to the annual reports, published in the Bulletin of the American Meteorological Society, explaining extreme events of the previous year from a climate perspective. It has also made an important contribution to the development of event attribution science internationally, through engagement in international meetings and workshops, through contributions to the scientific literature (including review articles and perspective pieces), and through its contribution to the 2016 report by the US National Academies of Sciences on event attribution.

The EUCLEIA project has laid important groundwork for the development of a future operational attribution service for Europe. Such a system will need to be supported by the continuing development of event attribution science. This will enable an increased capability of an operational attribution system as the underpinning science matures, for example to be able to robustly attribute a wider variety of extreme events.

Project Context and Objectives:
Climate change is expected to affect extreme weather in Europe. Recent heatwaves, floods and droughts have demonstrated the vulnerability of European citizens to extreme weather, and there is a clear need to adapt effectively to climate change. However, scientifically robust information about the extent to which recent extreme weather can be linked to climate variability and change is often lacking. There is therefore a requirement to develop better information on weather and climate risks as part of the operational capacities of Copernicus in the context of climate change.

The aim of EUCLEIA was to develop the means to provide reliable information about weather and climate risks by developing an event attribution system for Europe. This system has demonstrated the capability to deliver reliable and user-relevant attribution assessments on a range of timescales: on a fast-track basis in the immediate aftermath of extreme events, on a seasonal basis using a state-of-the-art modelling system, and annually to the scientifically prestigious attribution supplement of the Bulletin of the American Meteorological Society. The capabilities of event attribution have been evaluated on a set of test cases involving heat waves, cold spells, floods, droughts and storm surges.

EUCLEIA has worked with targeted stakeholder groups, including the insurance industry, regional managers and policy makers, and the general public, to deepen understanding of the requirements for future operational attribution systems. The project has assessed the extent to which well-verified assessments of how such weather-related risks have changed due to human influences on climate can be produced and communicated. It has also assessed the types of weather and climate-related events for which further research is needed to support future developments of operational attribution systems.

EUCLEIA has five top-level objectives, each of which has been assessed against project milestones and deliverables, and which map onto the six Work Packages associated with much of the scientific and system development. These objectives are listed below with reference to their associated WPs:

1. Derive the requirements that targeted user groups (including regional stakeholders, re-insurance companies, general public / media) have from attribution products and demonstrate the value to these users of the attribution products developed under EUCLEIA (WP3: Dissemination and Outreach; WP4: Assessing detection and attribution through general public and stakeholder analysis)

2. Develop experimental designs and clear ways of framing attribution studies in such a way that attribution products provide a fair reflection of current evidence on attributable risk. (WP5: Development of attribution methodologies and exploration of framing issues)

3. Develop the methodology of representing the level of confidence in attribution results so that attribution products can be trusted to inform decision making (WP6: Evaluation and diagnostics).

4. Demonstrate the utility of the attribution system on a set of test cases of European weather extremes (WP7: Applications of the methods to targeted test cases).

5. Produce traceable and consistent attribution assessments on European climate and weather extremes on a range of timescales; on a fast-track basis in the immediate aftermath of extreme events, on a seasonal basis to our stakeholder groups, and annually to the BAMS attribution supplement (WP8: Development and application of near real time attribution service).

The achievements of the project against these objectives are provided in the next section. These achievements have been reviewed by an Expert Advisory Board (EAB), which has provided regular constructive feedback during the course of the project. The EAB consisted of Dr Francis Zwiers (Pacific Climate Impacts Consortium, University of Victoria, Canada), Dr Claudia Tebaldi (National Center for Atmospheric Research, USA), Dr Thomas Peterson (formerly National Oceanic and Atmospheric Administration, USA, now retired) and Prof. David Karoly (University of Melbourne, Australia).
The EAB commented in March, 2016 that EUCLEIA has been very influential in the development of event attribution science and the demonstration of a possible operational approach. One indication of its influence, the EAB commented, is that several EUCLEIA participants were asked to provide input to the US National Academies of Science Committee on Extreme Weather Events and Climate Change Attribution. This included a substantial level of participation by EUCLEIA scientists in the key workshop that was organized to obtain input from the community that is undertaking event attribution (see http://dels.nas.edu/global/basc/eea-workshop-linked-agenda.xml) as well as involvement in the review of the report also released in March, 2016 (https://www.nap.edu/catalog/21852/attribution-of-extreme-weather-events-in-the-context-of-climate-change).

The EAB also commented that, despite very impressive progress in the project, the science of event attribution is still in a phase of rapid development, and thus implementation of an operational event attribution service will require a continued research and development effort. As the science matures, it will be important to ensure that operational capability develops to take advantage of this emerging capability, while at the same time operational systems remain consistent with scientific understanding and do not risk the credibility of the science by providing ill-supported products.

A useful pointer to the development of operational systems was provided by the National Academies of Science report on event attribution. They advised that the development of such systems would benefit from links to operational weather and climate prediction on a range of timescales. This would allow the development of coordinated modelling systems, evaluation techniques and communication strategies. This would provide a consistent picture of past, current and likely future extreme events, from days to decades ahead. The NAS committee stressed the importance of continued research to support the on-going development of operational systems.

In the view of the committee who wrote the NAS report on event attribution, a successful operational event attribution system would have the following attributes:

• There should be an objective way to select events to reduce selection bias so stakeholders understand how individual events fit into the broader picture of climate change;

• There should be provision of stakeholder information about causal factors within days of an event, followed by periodic updates as more data and analysis results become available;

• There should be clear communication of key messages to stakeholders about the methods and framing choices as well as the associated uncertainties and probabilities;

• There should be reliable assessments of performance of the event attribution system through evaluation and verification processes.

The NAS event attribution report was produced during the latter stages of the EUCLEIA project but the objectives of EUCLEIA were well framed to help meet the research and development needs identified by the report. The scientific progress made under EUCLEIA has helped advance the science of event attribution. The development of a prototype quasi-operational attribution system has helped explore how the type of operational system envisaged by the NAS report might work, with the system providing results on a range of timescales and being increasingly linked to an operational seasonal forecasting system. And through engagement with stakeholder groups, EUCLEIA has been able to develop understanding of the range of potential users for event attribution information from a range of sectors.


Project Results:
First we provide a list of the main achievements of the project. Further information is provided below, organised according to the five main objectives of the project.

EUCLEIA has achieved the following:

• A much clearer understanding of the possible user needs for a future event attribution service for a range of different sectors

• Substantial progress in developing experimental designs for attribution assessments.

• New methods for event attribution permitting an improved understanding of the changing risk of extreme events

• Development of techniques for assessing the reliability of event attribution assessments

• Substantial input to the annual reports explaining extreme events of the previous year from a climate perspective published in the Bulletin of the American Meteorological Society.

• Important contribution to the 2016 report by the US National Academies of Sciences into event attribution

• Important contribution to the development of event attribution science internationally through engagement in international meetings and workshops and contribution to scientific literature including review articles and perspective pieces

• Development of a prototype attribution system

• Demonstration of the capability of the event attribution system on a set of targeted test cases



OBJECTIVE 1: Derive the requirements that targeted user groups (including regional stakeholders, re-insurance companies, general public / media) have from attribution products and demonstrate the value to these users of the attribution products developed under EUCLEIA (WP3: Dissemination and Outreach; WP4: Assessing detection and attribution through general public and stakeholder analysis)

This objective has been met through the targeted work in WP4 and through involvement of EUCLEIA scientists in the WP3 activities with the stakeholder user panel. Two particular user groups were targeted in WP4, allowing a much more in-depth analysis: the foci were storm surges on the Baltic coast of Germany and heat waves in the greater Paris area. Through the stakeholder user panel, WP3 was able to gain a broader, albeit less in-depth, perspective across a wider range of regions and sectors.

The main achievements are as follows.

• Substantial work has been carried out under EUCLEIA to understand user needs for event attribution assessments. Previously there had been very little understanding of how event attribution assessments might be relevant to users other than the broader scientific community and the media keen to report on recent extreme weather events. Now, thanks to the detailed surveys carried out under EUCLEIA in WP4 and discussions with the stakeholder panel in WP3, we have a much better understanding of user needs, although continued work with users will be needed as the take-up of event attribution assessments increases.

• It is clear that there is widespread interest in assessments of the extent to which recent extreme weather and climate events can be linked to climate change.

• Our research has shown that different stakeholder groups have different requirements and therefore a future attribution service needs to consider provision of tailored information for different sectors.

Task 4.1: Social articulation of attribution: defining the conceptual and theoretical foundations

A theoretical framework has been developed, using an iterative grounded-theory approach, linking extreme event attribution to theories of and approaches to risk governance and risk perception. Within a risk governance framework, extreme event attribution may influence, or be influenced by, all the phases of the risk governance cycle. Within a risk perception framework, extreme event attribution may dynamically modify, or be modified by, the way stakeholders perceive the risks associated with climate change.

A harmonised methodology for organising the focus groups and analysing the results of the regional-level analysis of stakeholder needs and understanding has been developed. This provided a basis for inter-regional and inter-stakeholder comparisons. Developing a stakeholder typology and conducting a survey of mayors, in-depth interviews with key stakeholders and workshops with focus groups allowed an understanding of regional-level stakeholder needs to be developed.

Task 4.2: Analysing regional level stakeholders’ needs and understanding

EUCLEIA has carried out in-depth surveys of sectors responding to storm surges in the Baltic Sea region of Germany and to heat waves in the Paris region of France. This has allowed the project to develop a detailed understanding of user requirements for two specific sectors. This has been supplemented with broader discussions with a stakeholder panel involving representatives from a variety of different sectors, and with further research involving the media and the insurance sectors (Task 4.3).

This work has shown that event attribution as knowledge production is deemed desirable for many reasons. For example, designing infrastructure may call for information on the changing return period of extreme events. Moreover, event attribution can give a better idea of what the future has in store, may help raise public awareness about the need to do something about climate change, and can give a clearer sense of the options available to reduce unwanted consequences of weather and climate extremes. For some users the information is currently only of use in a strategic context (e.g. for the insurance industry to understand future business models) rather than in an operational context (e.g. to set next year's insurance premiums). Many users also express a preference for reliable assessments produced more slowly (e.g. on seasonal or annual timescales) over very rapid, media-timescale assessments.

Analyses have been undertaken into the specific requirements for making event attribution relevant, reliable and expedient for regional stakeholders for two specific regions and concerns, namely storm surges in the Baltic Sea region of Germany and heat waves in the Paris region. Stakeholders consulted included representatives from politics, public administration, civil society, associations and the private sector, in fields including spatial planning, coastal protection, climate mitigation and adaptation, nature protection, tourism, the maritime/port industry and emergency management in the Baltic Sea case, and emergency health care, health planning, mass transit, collective provision of temperature regulation (cold), urban planning, local and regional climate planning, land-use planning, local and regional climate offices, forestry and agriculture in the Paris case. Interviews and workshop discussions reveal that the credibility of information is commonly ensured by relying on information from trusted sources, people or institutions. Very rapid availability of event attribution assessments is, in contrast to what the media might want, rarely mentioned as relevant by most stakeholders, which might be because most stakeholders are engaged in long-term preparedness or continuous awareness-raising campaigns. Salience is most commonly attached to event attribution being linked to the consequences of extreme events or to stakeholder-specific problems. Accordingly, a suitable event attribution product should be tailored to the specific concerns of the particular stakeholder, received from a trusted "honest broker", and published at a later stage with greater confidence rather than produced quickly with larger uncertainties. Regional climate service providers can serve as an interface for creating the mutual understanding between scientists and stakeholders needed for the production of reliable and useful event attribution assessments.

Task 3.3 Coordination of stakeholder user participation

In addition to region specific stakeholder engagement, a wider stakeholder panel has been engaged with additional meetings held during the project. These workshop meetings involved representatives of the health, investment finance, disaster risk reduction, legal, foreign policy, national climate and energy policy, science communications and hydrologic impact modelling sectors. User engagement highlighted that the perspectives of the stakeholders who will use the attribution analysis are vital in developing appropriate attribution services.

The results from the meetings with the wider stakeholder panel elicited the same broad messages that were identified in Work Package 4. Importantly, our engagement has shown that there is a clear demand for event attribution science within a range of decision-making contexts and across a range of sectors within Europe. The stakeholder panel echoed the other stakeholders in stating that any scientific outputs need to reflect user needs closely and to be communicated in such a way that the science can easily be applied in their own context. The communication of the analysis was seen as a critical part of the process, with stakeholders wanting to be able to talk to an expert about the science, what it means and how it can best be applied. This should be an ongoing dialogue and not just a single exchange. Scientists involved in the EUCLEIA project were seen as highly credible sources, and as such would be an excellent resource. The issue of credible authority was particularly critical for the legal and insurance sectors, who need to be able to cite credible authorities when either arguing a case or setting regulations. Access to the experts is problematic, particularly given the international nature of the stakeholders. Stakeholders suggested that an online portal for event attribution services would allow for a central point of access to leading academics and vetted academic papers.

For the UK legal sector, what would need to be demonstrated for a successful court case is a balance of evidence showing that 'but for human influence the impact would not have happened'. As a way to develop an event attribution service relevant to the legal sector, it was felt that it would be beneficial to develop test cases that would help better understand how the law could be applied, how a court might respond to the science, and so on.

The disaster risk reduction (DRR) sector said that they would be interested in both the changing frequency and the changing magnitude of extreme events. For example, did an event cause a threshold to be exceeded, which in turn caused a particular impact to occur? Within the European policy landscape there is an increasing demand from policy makers to know about event attribution, and there is a growing need to build knowledge capacity. It is worth noting a potential interest in events that occur in other parts of the world but could have wider implications in a European context, such as the Russian heatwave of 2010, where harvest failure affected the price of wheat, which in turn affected the price of bread in European shops.

The stakeholder user panel engagement found that national governments could find it problematic to use analyses that contradict themselves, and so it would be more advantageous to have a considered analysis rather than one done in 'real time' that could potentially be revised. In addition, any statement carrying too much uncertainty would also be problematic to use, as it could be seen as undermining the credibility of the science.

Within the health sector, event attribution is seen as a positive planning tool, for example in looking closely at cold or hot weather plans and at how more specific weather-related warnings can be issued to those most vulnerable to the impacts. If it can be shown that events are likely to occur more frequently, there may be an argument for setting up national incidence centres on a permanent basis, rather than setting them up only when such disasters occur. One of the recurring roles for event attribution science is as a communication tool, helping to raise awareness and understanding that climate change is affecting us now. This function was highlighted by the development finance sector, which thought that not enough is currently being done to finance schemes protecting vulnerable regions against events such as flooding, e.g. early warning systems or relocating vulnerable groups. There is a potential communication role for event attribution in establishing any links with climate variability and change, thereby helping to direct funding appropriately to vulnerable areas.

The stakeholder panel considered what an operational attribution service could look like for different sectors within the EU. It was thought that free access to outputs from such a service would be critical for success across Europe. A range of outputs from analyses was thought desirable in order to support the different ways that different sectors might use the information. It was thought useful to be able to relate past and current changes in extreme events to their likely future evolution. The suggestion was made that an operational attribution service should be supported by various contextual information, including background information on attribution science, examples of good and bad practice in event attribution, and links to other providers and organisations from a non-European perspective. Questions were asked about how the quality and provenance of any future operational attribution service would be verified. It was noted that an attribution service needs to develop continually in response to developments in science, IT and user requirements. The provision of impacts information was seen as highly relevant, as was a strong dialogue between stakeholders and scientists. The prospect of an event attribution service forming part of the Copernicus climate service was seen by our stakeholder panel as a positive development.

In addition to how an attribution service could be structured within a European climate service, the stakeholder panel also considered issues related to how the scientific analyses are communicated to decision makers. Critical to this was the use of language, and the extent to which phrases and concepts have meaning to different sectors and how they could be used. For example, a phrase such as 'best guess' was thought to be too vague, and it was thought important that the actual degree of confidence should be clearly articulated. In addition, there is a need to consider the use of technical language and concepts such as the jet stream, observational record, return period and ensembles, and the extent to which these need to be defined for a non-technical audience. To help with the understanding of technical language it was suggested to use vocabulary or concepts that are familiar in daily use, such as betting odds to communicate return times.

Task 4.3: Research into the requirements of the insurance and re-insurance sectors

An assessment of the requirements of the insurance industry for event attribution has been carried out. Extreme weather events have to be considered a key concern for the insurance sector today, and even more so in the future. A review of the German and French natural hazard insurance systems reveals that public and private adaptation to changing extreme weather events is widely lacking to date. The review also shows that Baltic Sea storm surges are of key relevance from a general risk perspective, but not from an insurance perspective. Heat waves are of relevance to the insurance sector in France, but from a quite narrow perspective: land subsidence associated with prolonged periods of drought. Yet in France the state-guaranteed insurance against natural catastrophes may be the most affected by climate change and an associated increase in the probability of extreme weather events. Overall, the literature indicates that changing risk patterns, challenging risk assessments, and the evident need for adaptation pose key challenges to the insurance system. This demands a solid information base for taking appropriate decisions.

The empirical findings from the Baltic Sea test case show that all of the interviewees were confident that extreme event attribution could produce important and interesting findings. Nevertheless, this statement was often followed by caveats: that event attribution does not provide added value over existing information, that other components of risk are more important, or that it is not applicable in existing business processes. This was not perceived to be the case for all business processes, though. Representatives from the insurance sector could well imagine that event attribution would inform strategic decisions, improve risk modelling and premium calculation, and support public awareness-raising and political leverage. Despite the fact that most of the interviewees were certain that event attribution is relevant, no one was convinced that its added value is currently large enough to pay for it. Awareness of the requirements which could turn event attribution science into credible and relevant information products can help overcome much of this hesitation. Overall, most of the stakeholders did not have a preference for event attribution information in near-real time, particularly if it would only be available irregularly for singular events. They wanted instead solid and reliable information which is attuned to their models and products, fits their viewpoints, and/or supports existing objectives.

The empirical findings in the greater Paris area test case show that understanding both the regulatory environment and current actuarial practices is key to analysing the reception of event attribution by the French insurance sector. The impact of the regulatory environment is threefold. First, business operations are organised around yearly cycles (their robustness has to be demonstrated on a yearly basis), which means that climate change is not on the radar screen. Second, insurance against extreme events is compulsory and automatic once covered by building and vehicle insurance packages (whose basic configuration is set by law). There is little room to influence the insurer's activity and the insured person's behaviour without revamping the regulatory environment. Third, a reform of the current framework is envisioned. On that front, event attribution may serve to restart the currently stalled reform; it may also be of use in a carefully reformed insurance regulatory framework more attuned to the challenges associated with climate change. Current actuarial practices are centred on the use of past event time series. Computing future probabilities not based stricto sensu on past long-term time series would be a major cultural shift. This would be justified if event attribution clearly points to the fact that the current climate regime is sufficiently different (because of human influence) that it calls for an innovation as radical as event attribution. Finally, the Paris case study points to a potential need for event attribution to be envisioned not for single events, but rather for a class of conjoint events.

Task 4.4 Identifying the gap between the general public and scientific community and dissemination of the survey results to stakeholders

Analyses and interpretations of interviews with German scientists have been carried out. This research has shown that attribution, as perceived by climate scientists, is the assignment of a cause to a statistically detected deviation from the norm. Event attribution answers the question of whether, and to what extent, a single extreme event was caused by anthropogenic climate change. The interviewed scientists articulate and understand "attribution" solely in the English language; translation into their national language does not belong to their everyday practice and is therefore limited. It can be assumed that the issues raised by the national language reflect the role of the national language in a given discourse; in terms of "attribution" this can be understood as an eloquent speechlessness.

From the perspective of the interviewed scientists, the role of climate services is to transfer scientific knowledge to society in order to regain seriousness and to initiate preventive actions by the public. Climate scientists perceive climate services not as a stand-alone product but as a supplement to already existing products. The interviewed scientists put their personal utility first; the customers and their possible interests in the product remain largely unrecognised. This work demonstrates the importance of an ongoing dialogue between attribution scientists and potential users of event attribution information in order to inform future research and to facilitate the production of useful event attribution products.


Recommendations for future attribution service provision

As a result of stakeholder engagement, the EUCLEIA project has produced key recommendations for the production of future attribution services. Analysing the key requirements of the consulted stakeholders against the background of existing literature on evaluation and climate services has resulted in the identification of three core criteria to evaluate potential services from extreme event attribution for specific user groups. These are:

1) Clarity and comprehensibility,

2) Context-sensitivity and decision-relevance,

3) Trustworthiness.


In the development of an operational attribution service these categories need to be further elaborated into more specific criteria. These are diverse and cannot be met with a one-size-fits-all service. To derive recommendations for providing a service on extreme event attribution it is therefore essential to consult stakeholders and ascertain their specific needs and requirements, as has been done in WP4. The EUCLEIA project has proposed several detailed recommendations which are relevant to particular groups of stakeholders, but which are not necessarily applicable to all groups in the same way.
To put these recommendations into action, scientists can and should interact with stakeholders and knowledge brokers. For the further development of an attribution service, an expedient stakeholder dialogue is likely to require enabling knowledge and raising interest first, before tailoring extreme event attribution to the needs of stakeholders. This requires easy access to the overall topic, facilitated by presenting understandable information in a first step of stakeholder interaction. Subsequently, relevant topics and questions can be identified and services can be better tailored to the needs of potential users. A stakeholder dialogue is, one way or another, essential to ensure legitimate and relevant information. Established cross-sectoral and sector-specific climate services act at the interface of science and practice. Connecting with a network of boundary organisations at European, national and regional level can accordingly facilitate a continuous dialogue. Against this background, climate services can support putting the above-mentioned recommendations into action. This requires that they are in continuous dialogue with stakeholders and scientists, are independent, provide long-term services, are scientifically competent, know how to make science understandable, and are aware of stakeholder needs so as to create interesting and useful products.

A deliverable, D4.5 (synthesis report "Promoting attribution products: conditions for success with stakeholders and recommendations"), is due in month 36.


OBJECTIVE 2: Develop experimental designs and clear ways of framing attribution studies in such a way that attribution products provide a fair reflection of current evidence on attributable risk. (WP5: Development of attribution methodologies and exploration of framing issues)

To meet this objective, this part of the EUCLEIA project has undertaken some of the underpinning scientific research needed to understand the sensitivity of attribution assessments to the framing of attribution questions, to the model structure used, and to the use of statistical rather than model-based methods.

The main achievements are as follows.

• EUCLEIA has made substantial progress in developing experimental designs for attribution assessments. These have been summarised in a review paper led by the EUCLEIA coordinator and co-authored by many EUCLEIA scientists.

• New methods have been developed that split changes of the risk of extreme events into dynamic and thermodynamic components. This allows an improved understanding of the changing risk of events and an analysis of whether unusual weather patterns are changing.

• New research has shown how different ways of framing attribution studies can lead to very different conclusions emphasising the importance of a clear understanding and communication of the framing of attribution questions.

Task 5.1: Framing for attribution studies

The EUCLEIA project has explored a range of different ways of framing event attribution, all of which can be understood in terms of conditioned probabilities. In all cases, investigators are exploring how an external driver of climate, such as past greenhouse gas emissions, affects the probability of occurrence of an event of a particular class, conditioned on a set of boundary conditions or initial conditions that are held constant. Differences in framing between different studies may be understood in terms of the specificity of the event class definition and the delineation of the conditions that are held constant.

It is important to understand that differences in how the event attribution question is framed can lead to significant differences in outcomes, so users need to be aware of these differences in comparing results between studies. Lack of agreement between studies regarding the attribution of a specific event is not necessarily an indication of lack of robustness of individual conclusions if the individual studies are framing the event definition question in different ways. That said, agreement across a range of different ways of framing the attribution question can indicate that a particular case is especially simple, and hence less likely to be sensitive to the details of how the attribution question is addressed in the individual studies. Hence comparing results between studies framing the event attribution question in different ways is always desirable, since it improves our understanding of the causal factors behind the event in question.

Having identified the critical role played by conditioning in determining the outcome of event attribution questions, the next question is what is the appropriate level of conditioning to apply. From a scientific perspective, there is no specific answer to this question: different levels of conditioning can be used to investigate different aspects of an event depending on the question under consideration.

A probabilistic attribution study of the UK floods of the 2013/14 winter, based on model results obtained with weather@home, has been published in Nature Climate Change (Schaller et al., 2016). A paper by Peter Uhe et al., again using the weather@home capability, compares three different attribution methods for the record 2014 European temperatures, looking at the spatial distribution of the fraction of attributable risk (FAR) as well as the FAR for the whole European region. Each method shows a very strong anthropogenic influence on the event over all regions within Europe. An interesting result from this work was the dependence of the FAR on the size of the region analysed: for this event, larger regions resulted in a larger FAR, highlighting the need to choose the analysis region carefully on an event-by-event basis. This also indicates there may be a need to quote the size of the region analysed when presenting values of the FAR for an event.
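As an illustration of the probability-based framing underlying these studies, the minimal sketch below shows how a FAR value (FAR = 1 - P_natural/P_actual) and a simple bootstrap uncertainty range might be computed from two model ensembles. The ensembles, event threshold and sample sizes are entirely hypothetical and are not taken from the weather@home experiments.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical ensembles of European seasonal-mean temperature anomalies (degC):
# "actual" includes anthropogenic forcing, "natural" is the counterfactual world.
actual = rng.normal(loc=0.8, scale=0.7, size=500)
natural = rng.normal(loc=0.0, scale=0.7, size=500)

threshold = 1.5  # event definition: anomaly exceeding 1.5 degC

def exceedance_prob(sample, thr):
    """Empirical probability that the index exceeds the threshold."""
    return np.mean(sample > thr)

p_act = exceedance_prob(actual, threshold)
p_nat = exceedance_prob(natural, threshold)

far = 1.0 - p_nat / p_act       # fraction of attributable risk
risk_ratio = p_act / p_nat      # equivalent "probability ratio" framing

# Simple bootstrap to convey sampling uncertainty in the FAR estimate.
boot_far = []
for _ in range(2000):
    pa = exceedance_prob(rng.choice(actual, size=actual.size, replace=True), threshold)
    pn = exceedance_prob(rng.choice(natural, size=natural.size, replace=True), threshold)
    if pa > 0:
        boot_far.append(1.0 - pn / pa)
lo, hi = np.percentile(boot_far, [5, 95])

print(f"P_actual={p_act:.3f}  P_natural={p_nat:.3f}  FAR={far:.2f}  (5-95%: {lo:.2f} to {hi:.2f})")
```

Repeating such a calculation for nested regions of increasing size is one simple way to explore the region-size dependence of FAR noted above.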

Task 5.2: Sensitivity of attribution conclusions to model structure

Investigation of the sensitivity of attribution conclusions to model structure has shown that the skill of different approaches for Europe in terms of temperature, precipitation and sea level pressure is relatively similar, indicating that there is not yet evidence for a clearly preferable model set-up in terms of reliability assessments. A novel way of using forecast sea surface temperatures in attribution experiments has been developed and tested, and is showing promise as a way of developing fast-response assessments (Haustein et al., 2016).

Task 5.3 Statistical methodologies for comparing current climate with past events

A new method based on flow analogs has been developed in order to estimate how temperature and precipitation events would have changed in past (or counterfactual) climates as compared to the current (or factual) climate. The method is based on the comparison of events in the two climate periods for similar flows. It also makes it possible to distinguish the "dynamical" and "thermodynamical" contributions to the changes. This is a novel and important way of analysing extreme events because it enables a clearer understanding of their causes.

The first question raised is whether, under similar atmospheric flows, an event would have had a different response in the absence of human activities, or in a different climate period. It is therefore a "conditional attribution method" in which the conditioning is on the atmospheric flow. To answer this, the method consists in "picking" analog flows, using a distance on geopotential height or sea level pressure, from either observations or simulations, in different sets or periods, and calculating the odds of the extreme event in each set/period using the value of the indicator. A pioneering example of such an analog method is the analysis of the winter of 2010 (Cattiaux et al., 2010). It analysed the cold winter 2009-2010 temperatures and showed that temperatures associated with analog flows from previous years were even colder than those observed in 2009-2010, indicating that climate change had mitigated the extremely cold spell of 2010.

Research carried out under EUCLEIA has now broadened this approach so that flow analogs can also be used to disentangle dynamical and thermodynamical contributions for a class of events, i.e. without assuming a specific atmospheric flow. Assume for instance that we aim to attribute the exceedance, over an area, of monthly precipitation above a threshold (this can be generalised to any space and time scale). We use an ensemble of model simulations representing present-day or actual conditions (A) and an ensemble of simulations representing natural conditions in which there has been no human-induced climate change (N). We can then calculate the probability of precipitation exceedance for analogs of all A flows in the N simulations, and compare it with the probability of exceedance for analogs of all A flows in the A simulations. Vautard et al. (2016) showed that the difference provides an estimate of the thermodynamical contribution to the change in probabilities, because the flows are not changed in the comparison. Conversely, the difference between the probabilities of exceedance associated with A flows in A analogs and with N flows in A analogs provides an estimate of the dynamical contribution to the change in exceedance probability: it estimates how the change in flows has changed the probability of the event class.
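The following minimal sketch illustrates the flow-analog decomposition described above, using synthetic flow fields and precipitation indices. The analog search (Euclidean distance on the flow field, k nearest neighbours) follows the general recipe, but the data, ensemble sizes and threshold are invented for illustration and the printed numbers carry no physical meaning.

```python
import numpy as np

rng = np.random.default_rng(0)
n_days, n_grid, k = 3000, 50, 20   # days per ensemble, grid points per flow field, analogs kept

# Hypothetical ensembles: each day has a flow field (e.g. SLP anomalies) and a
# precipitation index. "A" = actual climate, "N" = counterfactual natural climate.
flow_A = rng.normal(size=(n_days, n_grid))
flow_N = rng.normal(size=(n_days, n_grid))
precip_A = rng.gamma(shape=2.0, scale=1.1, size=n_days)   # slightly wetter factual world
precip_N = rng.gamma(shape=2.0, scale=1.0, size=n_days)

threshold = np.percentile(precip_N, 99)   # event: exceedance of a high precipitation threshold

def analog_exceedance(target_flows, library_flows, library_precip, thr, k):
    """For each target flow, find its k closest analogs (Euclidean distance on the
    flow field) in the library and return the mean exceedance probability."""
    probs = []
    for f in target_flows:
        d = np.linalg.norm(library_flows - f, axis=1)
        idx = np.argsort(d)[:k]
        probs.append(np.mean(library_precip[idx] > thr))
    return np.mean(probs)

# Sub-sample the target flows to keep the sketch fast.
targets_A = flow_A[::30]
targets_N = flow_N[::30]

p_AA = analog_exceedance(targets_A, flow_A, precip_A, threshold, k)  # A flows in A climate
p_AN = analog_exceedance(targets_A, flow_N, precip_N, threshold, k)  # A flows in N climate
p_NA = analog_exceedance(targets_N, flow_A, precip_A, threshold, k)  # N flows in A climate

thermodynamic = p_AA - p_AN   # same flows, different thermodynamic state
dynamic = p_AA - p_NA         # same climate, different flow statistics

print(f"thermodynamic contribution: {thermodynamic:+.4f}")
print(f"dynamic contribution:       {dynamic:+.4f}")
```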

The method has been applied to the case of the extreme rainfall season of 2013-2014 in the UK (Vautard et al., 2016). Results, summarised in Figure 1 (detailed in the attached document), show that for centennial events the change in probability is due one third to the change in flows and two thirds to the change in thermodynamics.

Another application of analog statistics has also been developed, which identifies thermodynamical and dynamical contributions in individual events using a Bayesian approach (Yiou et al., 2016). These approaches require comparison and application to large sets of cases in order to identify their merits and weaknesses.

We have also developed a methodology to identify unusual weather patterns. In observations or reanalyses, unusual seasonal patterns are found when it is difficult to find good analogs. This new technique has been tested for the summer of 2015 in France, which was unusually hot despite the fact that a persistent blocking pattern did not develop as in 2003. The summer of 2015 was the second warmest in France and southern Europe (after 2003) since 1900. The atmospheric circulation during this summer was very different from 2003, with very few blocking episodes and rather zonal and southerly circulation patterns. The summer of 2015 yielded a record low number of days with good analogues; hence it exhibits unusual patterns of the atmospheric circulation. These results are reported in an article in preparation (P. Yiou, R. Vautard, J. Cattiaux, A. Ribes, M. Vrac, Emerging trends in the North Atlantic circulation, in preparation).


OBJECTIVE 3: Develop the methodology of representing the level of confidence in attribution results so that attribution products can be trusted to inform decision making (WP6: Evaluation and diagnostics).

Attribution of extreme events requires modelling experiments in order to generate counterfactual climate conditions and the statistics of extreme weather in a non-stationary climate. Confidence in attribution results therefore relies on the ability of the model to properly simulate extreme events and their sensitivity to climate change. This objective is to develop a framework for model evaluation. In theory, model simulations should be able to reproduce observed extreme statistics, their changes, and also the underlying processes. However, simulating observed changes is not always possible, as these changes are not always detectable from observations alone owing to the short length of the instrumental period and to natural variability. The focus here has been on processes and on the extreme statistics of the models.

There have been several main achievements.

• New diagnostics have been developed for the identification of key processes driving extreme events, in particular heat waves, droughts and cold spells, and for linking weather patterns and temperature anomalies

• New methods for evaluating the reliability of event attribution results have been developed

• The quasi-operational HadGEM3-A based system has been shown to be reliable enough for event attribution, provided care is taken in interpreting results for those climate variables for which model biases exist.

Task 6.1: Observations for key climate processes: collection and identification of needs

Observations are essential to detect changes and understand the contribution of the anthropogenic influence in the development of extreme events. They allow scientists to better understand underlying processes and to calibrate and validate models. During EUCLEIA, we have collected information about the range of useful observations and their gaps in three different perspectives:

• Observations for fast-track attribution (available within at most a few days)

• Observations for slow-track attribution (available within a few months to a year)

• Observations for key processes and model evaluation

We found that observation requirements for attribution differ greatly depending on the type of event and on the time scale on which an attribution statement needs to be made. Fast-track attribution can provide rapid analyses, but it has to rely on the relatively few datasets (or even forecasts) that are available in near real time (NRT) and are long enough to compute a climatology. In contrast, slow-track attribution takes more time but might be able to make more robust statements, as it relies on more observational and model-based data.

Furthermore, we have identified existing gaps and shortcomings in available observations. Atmospheric variables such as pressure and temperature are generally well observed. Precipitation products are more uncertain as rain or snow may be very localised. Most gaps are found in observations of land-surface hydrology, i.e. soil moisture or evapotranspiration, hindering a better understanding and attribution of, for example, droughts. The measurements involved, through sensors in the soil or eddy-covariance flux towers respectively, are complex. Furthermore, land-surface hydrology depends on soil and vegetation characteristics, which can be highly heterogeneous.

The most notable observational improvement in recent years is the growing number of satellite-derived products. These can be especially useful thanks to their global coverage. However, records are still comparatively short or poorly homogeneous in time and it may not yet be possible to infer corresponding climatologies which are required for attribution.

Our main findings are that:

• There is a large number of existing observation-based datasets that can be employed in the context of attribution. Moreover, updated or new products appear constantly, slowly addressing some of the observational shortcomings identified in this work. Hence it seems likely that the capability to conduct well-constrained attribution studies will improve with time as a result of such observational improvements;

• There is a lack of long homogeneous time series, with sufficient meta-data. Only station data have a long historical coverage. Gridded data sets usually start in the middle of the 20th century;

• Satellite data and remote-sensed data may provide very high resolution (such as products from MODIS or radar products for heavy precipitation), but generally have insufficiently long temporal coverage for attribution use, except for model evaluation;

• A clear gap for fast-track attribution is the lack of available and accessible real-time observations that could be compared with long, homogeneous time series for reliable statements comparing the current with the past climate. This hampers the robustness of analyses, as return periods are sensitive to small changes in the extreme values.

Task 6.2: Identification of key processes driving extreme events

Extreme events generally result from an ensemble of processes involving atmospheric dynamics and large-scale drivers, as well as regional to local-scale processes which interact with one another. This is a large research area, but EUCLEIA partners have focused on a few key processes that models have to reproduce faithfully in order to be used for attribution.

We have identified a few key processes driving extreme events in Europe. In particular, we focus on heat waves, droughts, cold spells and heavy precipitation events. For each event type we assess and describe the role of large-scale drivers, of local processes and feedbacks, and of climate change impacts. First, we have shown that models should reproduce the large-scale circulation (as seen in Fig. 2, top left, in the attached document), not only in the mean flow but also in the full spectrum of variability. Diagnostics such as "weather regime" classification (Vautard, 1990; Michelangeli et al., 1995; Cassou et al., 2005) should help here, as typical weather regimes that persist for several weeks are often associated with (or generate) extremes. Cold spells and heat waves are associated with blocking anticyclones generating flows that depart strongly from the mean flow. Another striking example is the extreme monthly rainfall amount observed over the southern UK in January 2014, which is largely due to a persistent zonal weather regime, as shown by Christidis and Stott (2015) and Schaller et al. (2016).
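A common way of building such a weather regime classification, in the spirit of the cited references, is to cluster the leading principal components of geopotential-height anomalies. The sketch below illustrates this recipe on synthetic fields; the array sizes, number of EOFs and number of regimes are assumptions for illustration, not EUCLEIA settings.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Hypothetical daily 500 hPa geopotential-height anomaly fields over the
# North Atlantic-European sector, flattened to (days, gridpoints).
n_days, n_grid = 4000, 600
z500_anom = rng.normal(size=(n_days, n_grid))

# Reduce dimensionality with EOFs/PCA, then cluster the leading principal
# components into k regimes (a standard k-means regime classification).
n_pcs, n_regimes = 10, 4
pcs = PCA(n_components=n_pcs).fit_transform(z500_anom)
km = KMeans(n_clusters=n_regimes, n_init=20, random_state=0).fit(pcs)

# Each day is assigned to one regime; regime frequencies and composite maps can
# then be compared between model and reanalysis, or between factual and natural runs.
labels = km.labels_
frequencies = np.bincount(labels) / n_days
composites = np.array([z500_anom[labels == r].mean(axis=0) for r in range(n_regimes)])

print("regime frequencies:", np.round(frequencies, 3))
print("composite field shape:", composites.shape)
```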

In addition to blocking, an important mechanism to take into account for cold spells is the interaction between the troposphere and stratosphere (Christiansen et al., 2001) and the occurrence of sudden stratospheric warming events, which was the case in 2010. The potential role of sea ice decline was examined in a complementary analysis. However the correlation between sea-ice extent and NAO, for instance, was very weak supporting similar results in the recent literature (e.g. Li et al 2015, Gerber et al. 2014).

The importance of an accurate representation of soil moisture in the development of summer heat and drought in models has also been studied. First, the role of soil moisture was compared with that of sea surface temperatures (SSTs) for global land climate in general, with an additional focus on extreme conditions such as droughts (Orth and Seneviratne, 2015). The analysis shows that soil moisture has at least as much influence on land climate as SSTs; its influence on hot temperatures is even stronger than that of SSTs. The role of soil moisture in the development of heat waves in the context of climate change was also found to be crucial for the case of the 2010 Russian heat wave (Hauser et al., 2016).

In a different study (Vogel et al., 2014; Stegehuis et al., 2016), we conducted an analysis evaluating the contribution of initial soil moisture to the development of summertime temperature anomalies, comparing it to other process contributions through an array of seasonal simulation experiments (Fig. 2, right panel, in the attached document). In general, the contribution of initial soil moisture does not exceed 1 degree. It has a significant time correlation with the mean seasonal temperature anomaly itself over South/Central Europe, with values reaching 0.3-0.4. However, the chaotic atmospheric flow dominates the temperature anomaly. It was also found that, despite this, the temperature trend found in Western Europe in the last three decades was mostly attributable to early-summer soil moisture anomalies, while changes in atmospheric weather patterns dominate in Eastern Europe.

As with the other extreme event types, heavy precipitation has repeatedly caused significant damage across Europe in recent years. Local processes such as convection or interaction with orography can lead to such events in connection with circulation-induced frontal systems. Whereas in coastal regions the dominant driver is the large-scale circulation, local processes are more important in central and eastern Europe. Convective precipitation events are projected to increase in the future, but projected trends in heavy precipitation are uncertain, with strong regional variations due to changes in the large-scale circulation and in the moisture supply impacting these events.

The particular case of the Mediterranean events that occurred in the autumn of 2014 was studied (Vautard et al., 2015). A significant trend in the annual maximum daily rainfall amounts was found in autumn, with increases in extreme amounts of about 30% relative to the middle of the 20th century. The relation between regional temperature and extreme rainfall has been found to follow a rate comparable to the Clausius-Clapeyron rate or higher.
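For orientation, the Clausius-Clapeyron rate corresponds to roughly a 7% increase in atmospheric moisture-holding capacity per degree of warming. The sketch below shows one simple way such a scaling rate could be estimated from paired temperature and extreme-rainfall samples; the data are entirely synthetic (with a 7%/degC dependence built in for illustration) and are not the Mediterranean observations analysed in Vautard et al. (2015).

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical paired samples: regional daily-mean temperature (degC) and the
# corresponding daily rainfall maxima (mm) for autumn events.
n = 2000
temp = rng.normal(18.0, 3.0, size=n)
rain = 40.0 * np.exp(0.07 * (temp - 18.0)) * rng.lognormal(0.0, 0.3, size=n)  # built-in 7 %/degC

# Estimate the scaling rate of high rainfall quantiles with temperature:
# regress log(95th-percentile rainfall per temperature bin) on bin-mean temperature.
edges = np.quantile(temp, np.linspace(0, 1, 11))
bin_t, bin_q = [], []
for lo, hi in zip(edges[:-1], edges[1:]):
    sel = (temp >= lo) & (temp < hi)
    bin_t.append(temp[sel].mean())
    bin_q.append(np.quantile(rain[sel], 0.95))
slope = np.polyfit(bin_t, np.log(bin_q), 1)[0]

cc_rate = 0.07  # Clausius-Clapeyron: ~7 % more moisture per degC
print(f"estimated scaling: {100 * slope:.1f} %/degC  (Clausius-Clapeyron ~{100 * cc_rate:.0f} %/degC)")
```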



Task 6.3: Development of diagnostics and model evaluation

A detailed analysis has been carried out of the skill of the HadGEM3-A model in simulating extreme events with a view to their attribution, accounting for the importance of the processes described above. A manuscript is in preparation (Vautard et al., 2016). The analysis has been performed on an ensemble of 15 atmospheric simulations driven by observed sea surface temperatures over the last 54-year period (Fig. 2 in the attached document; bottom left). The analysis investigates the main processes leading to extreme events, including atmospheric circulations, their links with temperature extremes, and land-atmosphere and troposphere-stratosphere interactions. It also compares simulated and observed variability, trends and generalised extreme value parameters for temperature and precipitation (Krueger et al., 2014). We find that the model simulates weather patterns and extreme events quite well, even though some biases are found. Slightly excessive drying may be the cause of excessive summer interannual variability and of too intense heat waves. This does not, however, seem to prevent the model from properly simulating summer temperature trends. Cold extremes appear well simulated, as do the underlying blocking frequency and stratosphere-troposphere interactions. Extreme precipitation is overestimated and too variable. In the Baltic, simulated weather conditions do not seem to produce strong enough storm-surge sea-level increases. While not fully conclusive, because more investigation is still needed, our findings show that the model can be used for attribution, provided caution is taken in interpretation given the biases found here.
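As an illustration of the kind of comparison involved when evaluating simulated extreme statistics, the sketch below fits generalised extreme value (GEV) distributions to a synthetic "observed" annual-maximum series and to a 15-member synthetic ensemble of the same length as the 54-year evaluation set-up. All numbers are invented and do not reproduce the HadGEM3-A analysis.

```python
import numpy as np
from scipy.stats import genextreme

n_years, n_members = 54, 15

# Hypothetical annual maximum temperatures (degC) at one grid point:
# one "observed" series and a model ensemble with a deliberate bias.
obs = genextreme.rvs(c=-0.1, loc=30.0, scale=1.5, size=n_years, random_state=1)
ens = genextreme.rvs(c=-0.1, loc=29.5, scale=1.8, size=(n_members, n_years), random_state=2)

def fit_gev(sample):
    """Return (shape, location, scale) of a GEV fit (scipy's sign convention)."""
    return genextreme.fit(sample)

obs_params = fit_gev(obs)
ens_params = np.array([fit_gev(m) for m in ens])

print("obs   shape/loc/scale:", np.round(obs_params, 2))
print("model shape/loc/scale:", np.round(ens_params.mean(axis=0), 2),
      "+/-", np.round(ens_params.std(axis=0), 2))

# A systematic offset in location or scale relative to the observed fit flags
# variables for which attribution results need cautious interpretation or calibration.
```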

Task 6.4: Development of reliability assessment methods

The evaluation of the model ensembles used for attribution is important in order to verify that the ensemble is well calibrated and therefore that attribution results produced from it are reliable. Good progress has been made on the development of reliability assessment methods for this purpose. Unreliable forecasts are found to be prone to overestimating the fraction of attributable risk (FAR). Estimates of the changing frequency of extremes obtained from later and earlier periods of the observational record are found to provide a reliable basis for assessing the error in the FAR.
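The sketch below illustrates the basic reliability check referred to here: ensemble-derived event probabilities are binned and compared with the observed event frequency in each bin, which should match the mean forecast probability for a well-calibrated ensemble. The hindcast set, ensemble size and threshold are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)
n_cases, n_members = 400, 15

# Hypothetical hindcast set: for each case (e.g. each year and region) an
# ensemble of seasonal-mean anomalies and a verifying observation sharing a
# common predictable signal plus independent noise.
truth_signal = rng.normal(size=n_cases)
obs = truth_signal + rng.normal(scale=0.6, size=n_cases)
ens = truth_signal[:, None] + rng.normal(scale=0.6, size=(n_cases, n_members))

threshold = 1.0                                 # event: exceedance of a fixed anomaly
fcst_prob = np.mean(ens > threshold, axis=1)    # ensemble-derived event probability
event = (obs > threshold).astype(float)         # did the event occur?

# Reliability table: within each forecast-probability bin, compare the mean
# forecast probability with the observed event frequency.
bins = np.linspace(0.0, 1.0, 6)
for lo, hi in zip(bins[:-1], bins[1:]):
    sel = (fcst_prob >= lo) & (fcst_prob < hi) if hi < 1.0 else (fcst_prob >= lo)
    if sel.sum() > 0:
        print(f"forecast {lo:.1f}-{hi:.1f}: mean p = {fcst_prob[sel].mean():.2f}, "
              f"observed frequency = {event[sel].mean():.2f}  (n={sel.sum()})")
```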

There is still considerable debate as to the relationship between event attribution and the evaluation measure used in seasonal forecasting known as forecast reliability. We have made an extensive examination of the literature, as well as conducting new studies, and have come to the following conclusions:

• Within a statistical model, increasing model error can be seen to simultaneously decrease both reliability and the accuracy of the fraction of attributable risk (FAR), the most common measure of how event probabilities evolve with climate change. However, the same model does not suggest that reliability and the error in FAR are directly related, so reliability itself is difficult to use as a correction factor.

• Using full climate simulations, a strong link could not be found between reliability and error on FAR when considering all years equally. However, when the record is split into an early and a late period (where the early period more closely resembles the world before climate change), the reliabilities of these two periods taken together correlate much more closely with the error on FAR. This is because they now represent the probabilities with and without climate change that make up FAR. Where reliability is present but low, it can be increased by calibrating the deficient variable within the model ensemble. This technique has been demonstrated both within the statistical model and the EUCLEIA climate simulations. In the statistical model, it can also be shown to correct errors in FAR. Following such a correction, any kind of event attribution should give accurate probabilities.

• Where there is little or no reliability, event attribution may still be performed by pooling years near the year of interest and then filtering so that the atmospheric conditions more closely match those of the observed event. Preliminary studies suggest this will improve reliability while also refining the attribution results. New attribution techniques currently under development should further diagnose and minimise model errors and uncertainties when incorporated into future event attribution systems.
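As a rough illustration of the kind of ensemble calibration referred to above, the sketch below rescales member departures from the ensemble mean so that the modelled spread matches the variability of the observed departures; this simple variance-inflation scheme is an assumption for illustration, and the project's actual calibration procedure may differ in detail.

```python
# Minimal sketch, assuming a simple variance-inflation calibration (the
# project's actual procedure may differ): member departures from the ensemble
# mean are rescaled so the ensemble spread matches the variance of the observed
# departures from that mean, used here as a proxy for internal variability.
import numpy as np

def calibrate_spread(ensemble, observations):
    """ensemble: array (members, years); observations: array (years,)."""
    ens_mean = ensemble.mean(axis=0)                  # estimate of the forced signal
    departures = ensemble - ens_mean                  # member departures from the mean
    target_var = np.var(observations - ens_mean)      # observed residual variance
    scale = np.sqrt(target_var / np.var(departures))  # inflation (or deflation) factor
    return ens_mean + scale * departures

# Synthetic example: an under-dispersive 15-member ensemble of 54 yearly values.
rng = np.random.default_rng(1)
signal = np.linspace(0.0, 1.0, 54)
ensemble = signal + rng.normal(scale=0.2, size=(15, 54))
observations = signal + rng.normal(scale=0.5, size=54)
calibrated = calibrate_spread(ensemble, observations)
```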


Objective 4: Demonstrate the utility of the attribution system on a set of test cases of European weather extremes (WP7: Applications of the methods to targeted test cases).

In WP7 EUCLEIA scientists have carried out a set of detailed analyses on targeted test cases. These have demonstrated how to apply event diagnostics to identify the processes that generate extreme events. They have applied the fast-track attribution methods developed during the project as well as using output from the HadGEM3-A prototype attribution system. Reliability assessments have been carried out and results have been disseminated through peer-reviewed papers and through a prototype website (https://testingeucleia.wordpress.com/) which has elicited user feedback.

We originally undertook to carry out five case studies, but chose to add a sixth test case on the wet/dry summer of 2012 in northern/southern Europe. One of the original test cases (the European heatwave of 2015) has been delayed owing to the resignation of the member of staff involved at the University of Oxford. The main achievements are as follows.

• A set of detailed analyses has been carried out on 6 targeted test cases

• These analyses have demonstrated how multiple methods can be applied to a particular event and synthesised by a team of scientists working across institutes

• Peer-reviewed papers are being produced for each event

The cold spell of Winter 2009/2010

The winter of 2009/2010 was relatively cold in Europe, with a series of strong cold spells, the strongest in mid-December. During this period temperatures were below normal almost everywhere except for a few regions in northern Scandinavia. The coldest anomalies, below -4σ, were found in central Germany. The winter as a whole was dominated by a strongly negative NAO, which in turn was connected to a weak stratospheric vortex.

Two different methods for event attribution have been applied: one based on the HadGEM3-A ensembles and one based on the statistical surrogate method described in Christiansen (2015). The surrogate method uses a simple algorithm, based on observations, to produce ensembles of surrogate fields for both the unperturbed climate and the changed climate; the surrogate fields have the same spatial and temporal structure as the original observed field.

We consider two indices of cold spells. The first is defined at the grid-point scale as the minimum temperature over a whole winter. The second takes the spatial extent into consideration and is defined as the area of the largest contiguous region with temperatures more than 2σ below the mean, where σ is the local, seasonally varying standard deviation. The European winter (DJF) mean temperature is shown in Fig. 3 in the attached document as a function of time. Observations (E-OBS) show a linear trend of 0.30 K/decade. This is somewhat larger than the mean of the HadGEM3 historical ensemble, which shows a trend of 0.20 K/decade. However, 3 out of the 15 ensemble members show a trend comparable to that of the observations, so the difference is not statistically significant. It is also worth noting that the HadGEM3 model has a negative bias and that the change in the observations looks more like a break-point in the late 1980s than a linear trend. Inspection of the bias shows that it is dominated by western Scandinavia and the Alps, while it is close to zero elsewhere. The ensemble of surrogates has a linear trend close to that of the observations, as should be expected by construction. The ensemble of surrogates shows less variation among ensemble members than the HadGEM ensemble.


To get an impression of the changes in extremes we normalize the local temperatures and pool them all together. Figure 4 (in attached document) shows the changes in the resulting distributions over time. The challenge of detection and attribution of cold extremes becomes clear: although there is a general change in the distributions, the changes are small for the cold tail. This is quantitatively different from summer temperatures, which show a general shift of the whole distribution toward warmer values (not shown). Note that also for the cold tail, the HadGEM3 historical ensemble shows smaller changes than the observations. A more detailed comparison of the ensembles and the observations including the geographical and temporal variability of variance, skewness, and cold quantiles has been performed.
Probability distributions of the indices have been calculated for both the HadGEM ensembles and the surrogate ensembles. For an observation xobs we calculate the risk ratio RR = P1(xobs)/P0(xobs), where P is the cumulative distribution function. This is a measure of how much the probability of an event as extreme as, or more extreme than, the observed value xobs differs between the unperturbed (0) and perturbed (1) climates. Figure 5 (in attached document) shows this measure for the coldest winter day of 2009/2010. We see that the probability of such cold events has been reduced over almost all of Europe. This holds for both the HadGEM3 and the surrogate methods, although most values are moderate. The HadGEM3 method gives larger changes than the surrogate method, which can be understood from the fact that the unperturbed ensemble with HadGEM3 represents pre-industrial conditions while the corresponding ensemble with the surrogate method represents the 1960s. Comparable results are obtained for the contiguous-area index. Work is in progress to test the robustness of, and provide uncertainties for, these calculations.
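A minimal sketch of this risk-ratio calculation for a cold event is given below, using empirical cumulative distribution functions from two ensembles of a coldest-winter-day index; the ensemble values are synthetic placeholders rather than HadGEM3 or surrogate output.

```python
# Minimal sketch of RR = P1(xobs)/P0(xobs) for a cold event, where P is the
# empirical cumulative distribution function of a coldest-winter-day index.
# The ensembles below are synthetic placeholders, not HadGEM3 or surrogate data.
import numpy as np

def empirical_cdf(sample, x):
    """P(X <= x) estimated from an ensemble of index values."""
    return np.mean(np.asarray(sample) <= x)

def risk_ratio_cold(perturbed, unperturbed, x_obs):
    """Ratio of probabilities of an event at least as cold as x_obs."""
    return empirical_cdf(perturbed, x_obs) / empirical_cdf(unperturbed, x_obs)

rng = np.random.default_rng(2)
unperturbed = rng.normal(-8.0, 3.0, size=10000)  # e.g. pre-industrial or 1960s climate
perturbed = rng.normal(-6.5, 3.0, size=10000)    # e.g. present-day climate
print(risk_ratio_cold(perturbed, unperturbed, x_obs=-14.0))  # < 1: cold events less likely
```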

The drought of 2015 in Central Europe

An analysis has been carried out of the drought in Central Europe in 2015. Initial results show that different conclusions on the existence of a human influence are obtained depending on the chosen Earth System Model and the event attribution methodology. Assessments with observations and a regional climate model fail to find a significant role for human influence, whereas Earth System Model results span a large range of attribution statements, from a large negative to a large positive contribution of anthropogenic climate change. Our results highlight the need for a multi-model and multi-approach framework in event attribution research, especially for events with a low signal-to-noise ratio such as regional droughts.

This analysis indicates that much further research is needed before regional droughts become tractable as part of an operational attribution system.

The heatwave of 2015 in Central Europe

An analysis is under way but has been delayed due to a change in personnel.

Severe rainfall event in July 2014 in the Netherlands

An analysis has been made of an extreme rainfall event in the Netherlands on 28 July 2014, when a number of thunderstorms moved across the country. Rainfall totals in excess of 130 mm fell in just a few hours, leading to localised flooding, €10m of property damage in the Netherlands (the same complex caused more damage in Belgium) and widespread traffic disruption. A comprehensive and systematic probabilistic attribution analysis of this episode of extreme precipitation has been carried out using a number of methodologies. Consideration has been given throughout to the present challenges in event attribution and to the steps required to establish and communicate a robust statement about the role of anthropogenic climate change in the likelihood of this and similar events.

We investigated the pointwise maximum daily precipitation and the highest daily precipitation at any station or grid point in the Netherlands in the summer half-year in observations (KNMI station data and E-OBS gridded data), in two GCMs (EC-Earth 2.3 and HadGEM3-A) and in a set of RCMs (RACMO and a CORDEX multi-model ensemble). We chose the daily scale, although most damage was due to precipitation falling in a few hours, because there is much more observational data at this scale and model output is more readily available. In summer, the daily extremes almost always correspond to much shorter-duration thunderstorms. For the spatial extent we chose the Netherlands, due to data availability. There is very little spatial variation in extreme precipitation in the Netherlands in summer: the orography is not pronounced, urban effects are small and we could not find a coastal effect at this time scale within the country, so we assumed all data are identically distributed and can be pooled.

EC-Earth proved unable to represent the extremes, both from a physical point of view (its resolution is much coarser than the scale of convective events, which is about 10 km) and when comparing the tail of its PDF against the observations. HadGEM3-A did much better, in spite of its 60 km resolution, after a multiplicative bias correction. The regional models, with 11 km resolution, were also able to represent the statistics of these events after bias correction.

We fitted both observations and model grid-point data to a GEV distribution that scales with the smoothed global mean temperature as an index of global warming. Spatial dependencies are taken into account by using a moving-block bootstrap. For the observations (1901-2016), the historical HadGEM3-A run (1961-2013) and the RACMO data (1950-2016) we used the trend as an indicator of climate change. This assumes that low-frequency natural variability and the effect of natural forcings are small, which is the case in Western Europe. We also compared the historical HadGEM3-A fit with the historicalNat experiment without anthropogenic forcings.
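The sketch below illustrates one way to implement such a covariate-dependent GEV fit; it is not the fitting code used in the study, assumes a simple linear shift of the location parameter with smoothed global mean temperature, and omits the moving-block bootstrap used for the uncertainty estimates.

```python
# Minimal sketch (not the study's fitting code): maximum-likelihood fit of a
# GEV whose location parameter shifts linearly with smoothed global mean
# temperature (GMST). The moving-block bootstrap used for uncertainties and
# any scaling of the scale parameter are omitted here.
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

def fit_gev_with_covariate(maxima, gmst):
    """maxima, gmst: 1-D arrays of equal length (one value per year)."""
    def negloglik(params):
        mu0, alpha, log_sigma, c = params
        loc = mu0 + alpha * gmst                 # covariate-dependent location
        scale = np.exp(log_sigma)                # keep the scale positive
        return -np.sum(genextreme.logpdf(maxima, c, loc=loc, scale=scale))
    start = [np.mean(maxima), 0.0, np.log(np.std(maxima)), 0.1]
    return minimize(negloglik, start, method="Nelder-Mead").x

def return_period(x, params, gmst_value):
    """Return period (years) of exceeding x in the climate indexed by gmst_value."""
    mu0, alpha, log_sigma, c = params
    p = genextreme.sf(x, c, loc=mu0 + alpha * gmst_value, scale=np.exp(log_sigma))
    return 1.0 / p
```

Evaluating the fitted exceedance probability of the observed amount at two GMST values (for example those of 2014 and 1961) and taking their ratio gives the change in likelihood between the two climates.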

Fig. 6 (in attached document) shows the GEV fit, scaling with global mean temperature, to April–September maximum one-day precipitation at all ~320 stations in the Netherlands (left) and to the spatial maximum over the stations (right). The red lines represent the climate of 2014, the blue lines the climate of 1961. There is a pronounced trend in the observations, with daily extremes of the magnitude observed on this day (131.6 mm) now two to three times more likely than in 1960 (equivalent to a 10–12% increase in intensity). The return time of this event at a given location was about 2000 (1100 to 5000) years in the current climate. As the events are much smaller than the country, the probability of observing such an event anywhere in the Netherlands is much larger, with a return time of about 10 (6–25) years.

The HadGEM3-A and RACMO ensembles also show a significant increase in intensity of these high daily precipitation extremes in the Netherlands in the summer half year. However, the modelled increase is much smaller than the observed one. This is probably due to the limited ability of these hydrostatic models to represent the processes in violent thunderstorms. To make a full attribution we will have to use convection-permitting models. However, the long runs necessary for attribution studies are not yet available for this area.

European summer rainfall 2012

Event definition - Summer (JJA) 2012 was the wettest in the UK since 1912, while 2012 was the second wettest year overall since 1910. 2012 saw the driest summer in Spain since 1928. In order to best capture the dipole structure of the 2012 precipitation anomaly, two regions are used to define the event: northern Europe (10W-25E, 50-60N) and southern Europe (10W-25E, 35-34N).

Observational analysis

The extreme summer of 2012 was one of a number of consecutive wet summers in northern Europe and dry summers in southern Europe. The mean precipitation rate in northern Europe was 3.2 mm/day (CRU TS3.2.3), 1.7 standard deviations above the 1960-2012 mean; this was a 1-in-47-year event. In southern Europe, the mean precipitation rate of 0.54 mm/day was 2 standard deviations below the 1960-2012 mean; this was a 1-in-30-year event. Southern Europe was also very warm in summer 2012, with a mean temperature of 23.8C, 2 standard deviations above the 1960-2012 mean. Analysis of CRU TS3.2.3 precipitation, with global mean surface temperature as a covariate used to scale the fit to 1960 and 2012, showed no influence of increasing global temperatures on northern European precipitation, but that they do make dry summers more likely in southern Europe (2.5 times more likely for 2012 vs. 1960; 3 times more likely for 2012 vs. 1901). There was no measurable change in the atmospheric circulation between 1931-1970 and 1971-2011, but a circulation analogous to that of summer 2012 does lead to higher-than-normal precipitation rates in the UK. Thermodynamic probability ratios do not show a consensus on the change in probability of wet summers conditional on the 2012 atmospheric circulation analogue over the UK.

Model evaluation

Model analysis has so far focussed on the HadGEM3-A attribution ensemble. HadGEM3-A fails to capture the observed drying trend over Spain. The ensemble mean does not reproduce the 2012 precipitation anomaly: individual ensemble members for this year show a wide range of anomaly patterns, which cancel in the ensemble mean. However, the model is capable of producing anomalies with the correct magnitude and spatial pattern. The diversity amongst ensemble members suggests that there is a large natural component to the event. Composites of events with daily precipitation 1.5 standard deviations below the southern European mean are in good agreement with observations. For events 1.5 standard deviations above the northern European mean the model tends to locate the maximum precipitation anomaly too far east. The similarity between the historical and historicalNat composites suggests a key role for natural variability in the 2012 event.

Model results

For northern European precipitation there is no significant change in the mean precipitation between 1960-1974 and 1999-2013, and the event is not made more likely by anthropogenic forcing (the risk ratio is 0.94 for 1999-2013). For southern European precipitation there is an anthropogenic contribution: the risk ratio is 1.69 for 1999-2013. In contrast with observations, the model simulates no significant difference in the seasonal precipitation between the early and late periods. The period 1999-2013 was significantly warmer than 1960-1974, with a large anthropogenic contribution to this trend (risk ratio = 574 for 1999-2013).

Synthesis

The extreme summer of 2012 was characterised by a dipolar precipitation anomaly. Both model and observations suggest the large precipitation amounts in northern Europe were primarily the result of natural variability. The large precipitation deficit in southern Europe was made almost twice (model) to three times (observations) as likely by anthropogenic forcing. Southern Europe was also very hot in summer 2012; HadGEM3-A suggests this was made over 550 times more likely by anthropogenic forcing.


Baltic storm surge, November 1995 and 2006

In November 1995 and November 2006 severe storm surges occurred along the German Baltic Sea coast. Water levels of about 1.8 m above sea level were observed at tide gauges in German coastal cities such as Wismar. The aim was to place these storm surge events, such as that of 2006, in their historical climate context, seeking to distinguish the role of anthropogenic climate change from that of natural variability.
We explored this question using the regional ocean model TRIM-NP (Kapitza, 2008) to perform two 7-member ensemble simulations of Baltic Sea water levels at 12.8 km spatial resolution for 1971-2010. Two atmosphere-only multi-decadal ensembles of the Hadley Centre Global Environmental Model version 3-A (HadGEM3-A), provided by the Met Office Hadley Centre (Christidis et al. 2013), were used as atmospheric forcing, representing the historical climate and the natural climate without human influence respectively. Within the HadGEM-TRIM model framework, pre-processing of the HadGEM3-A data was necessary: an algorithm was developed to provide the variables, format and grid information needed by TRIM-NP. In addition, the standard 360-day output of HadGEM3-A had to be extended to provide the 365 days needed for the consideration of tides in TRIM-NP. For the evaluation, a TRIM-NP water-level reconstruction was performed for 1971-2010 using the CoastDat2 (Geyer, 2014) data as atmospheric forcing, to obtain a gridded reference dataset (CoastDat-TRIM). CoastDat2 itself is an atmospheric hindcast simulation for Europe from 1948-2012 based on the regional climate model COSMO-CLM (Rockel et al., 2008). In this study, the events under consideration were defined as those with the highest water levels within the period 1971-2010. The selected severe storm surges were the high-impact event of 4 November 1995 and the event of 1 November 2006, one of the most recent extremes. We focused our analysis on four cities along the German coast of the federal states of Schleswig-Holstein and Mecklenburg-Vorpommern in the southwestern Baltic Sea, where water levels and thus storm surges are officially recorded at tide gauges. The cities considered are Travemünde (the most westerly), Wismar, Warnemünde and Sassnitz.
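The report does not specify the exact algorithm used to map the 360-day HadGEM3-A output onto a 365-day calendar; the sketch below shows one common approach, offered here only as an assumption for illustration, based on linear interpolation in fractional-year time.

```python
# Minimal sketch of one possible way to map daily forcing fields from a
# 360-day model calendar onto a 365-day calendar (the actual pre-processing
# algorithm used for HadGEM-TRIM is not described in the report): linear
# interpolation in fractional-year time, applied grid point by grid point.
import numpy as np

def to_365_day_calendar(field_360):
    """field_360: array (360, ny, nx) of daily fields for one model year.
    Returns an array (365, ny, nx) on a 365-day calendar."""
    t360 = (np.arange(360) + 0.5) / 360.0     # fractional-year time of each source day
    t365 = (np.arange(365) + 0.5) / 365.0     # target times
    ndays, ny, nx = field_360.shape
    flat = field_360.reshape(ndays, -1)
    out = np.empty((365, ny * nx))
    for j in range(flat.shape[1]):            # interpolate each grid point in time
        out[:, j] = np.interp(t365, t360, flat[:, j])
    return out.reshape(365, ny, nx)
```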

Our findings indicate that, at these locations, the probability of weak storm surges (1 m - 1.25 m) is higher in the climate with anthropogenic forcings than in the natural climate without human influence. A similar result is found for moderate storm surges (1.25 m - 1.5 m), although the differences are not statistically significant. For water levels above 1.5 m hardly any differentiation is possible. This can be explained by HadGEM-TRIM's underestimation of water-level variability, which is evident when compared with CoastDat-TRIM.

Objective 5: Produce traceable and consistent attribution assessments on European climate and weather extremes on a range of timescales; on a fast-track basis in the immediate aftermath of extreme events, on a seasonal basis to our stakeholder groups, and annually to the BAMS attribution supplement (WP8: Development and application of near real time attribution service).
Under this objective the EUCLEIA project has looked ahead to the implementation of an operational attribution system for Europe. By developing a prototype attribution system and seeking to provide regular attribution assessments on timescales much more rapid than the normal peer-review process allows, EUCLEIA has demonstrated that there is clear potential to develop a fully operational attribution system in future.

The main achievements under this objective are as follows.

• A prototype state-of-the-art operational attribution system has been developed. It is built on the HadGEM3-A model, upgraded to the highest spatial resolution used in event attribution studies.

• In the final stage of EUCLEIA, the HadGEM3-A system has been set up to run on a seasonal cycle, i.e. it generates ensembles of simulations every quarter for the assessment of events in the preceding season.

• Alongside seasonal assessments, advanced capabilities have been developed to enable rapid event attribution. Several different fast-track methodologies have been applied to the study of events, including empirical trend analyses, pre-computed estimates of the odds of extremes from global models, real-time regional model output and a number of high-statistics coupled model ensembles.

• EUCLEIA partners have made a major contribution to the annual special report of the Bulletin of the American Meteorological Society (BAMS) on extreme events throughout the lifetime of the project, demonstrating the range of scientific output that can be obtained with a range of methodologies. Fifteen studies were published during the first two years of the project, with an additional eight studies accepted for publication in the 2016 issue.

Task 8.1: Development of a quasi-operational HadGEM3-A attribution system

Work Package 8 has not only enabled EUCLEIA to stay at the forefront of event attribution research, but has also prototyped the operationalisation of its science with a view to integrating it into the developing climate services. The upgraded Hadley Centre attribution system, built on the high-resolution version of the HadGEM3-A model, constitutes the backbone of an emerging attribution service, modelled after the structurally similar seasonal forecasting system that has been operational in the UK Met Office for several years. While both systems generate ensembles of simulations that can provide probabilistic assessments of the occurrence of extremes, the attribution system looks at the recent past rather than the near future and further attempts to quantify the anthropogenic impact by generating a second ensemble of simulations representing what the climate would have been without human influence. The use of atmospheric models in event attribution research is widespread, as it enables large ensembles to be generated at a lower computational cost than with coupled models. The novelty of the HadGEM3-A system lies in its unprecedentedly high resolution for a global attribution system, surpassed only by regional models nested within coarser global counterparts. As a result of its high resolution, HadGEM3-A is expected to give a more accurate representation of processes relevant to extremes and a better description of their characteristics. A detailed model evaluation assessment was carried out in WP6.

EUCLEIA uses a new configuration of HadGEM3-A characterised by a non-hydrostatic dynamical core, which brings notable improvements in numerical stability, and a resolution of N216L85 (60 km at mid-latitudes; 85 vertical levels: 50 tropospheric and 35 stratospheric). The land surface and hydrology schemes were also upgraded to JULES (Joint UK Land Environment Simulator), a community land surface model with 4 sub-surface layers and 9 “tile” (i.e. land surface) types given as fractions of each grid cell. Experiments with all external forcings (ALL) and natural forcings only (NAT) are carried out, representing the climate with and without human influences. Boundary conditions are given by series of Sea Surface Temperature (SST) and Sea Ice Concentration (SIC) fields. The ALL experiments take these from observed values (HadISST), while for the NAT experiments an estimate of the change due to anthropogenic influence is removed from the observations. This estimate is obtained from experiments with atmosphere-ocean coupled models. Deliverable D8.1 discusses the technical characteristics of the system in detail.
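As a rough illustration of how the NAT boundary conditions described above can be constructed, the sketch below subtracts a coupled-model estimate of the anthropogenic SST change from observed fields; the file names and variable names are placeholders for illustration, not EUCLEIA data products.

```python
# Minimal sketch with placeholder file and variable names (not EUCLEIA data
# products): constructing "naturalised" SST boundary conditions for the NAT
# experiment by removing a coupled-model estimate of the anthropogenic SST
# change from the observed fields.
import xarray as xr

obs_sst = xr.open_dataset("observed_sst.nc")["sst"]                  # placeholder: HadISST-type observations
delta_anthro = xr.open_dataset("coupled_model_delta_sst.nc")["sst"]  # placeholder: ALL-minus-NAT coupled estimate

nat_sst = obs_sst - delta_anthro  # counterfactual ocean surface without human influence
nat_sst = nat_sst.clip(min=-1.8)  # keep SSTs above the approximate freezing point of seawater (deg C)
nat_sst.to_netcdf("naturalised_sst_for_nat_experiment.nc")
```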

Ensembles of multi-decadal simulations were produced during the first year of EUCLEIA covering the period 1960-2013. Both ALL and NAT experiments were carried out, each comprising 15 simulations in total. The ALL simulations became the basis of the detailed model evaluation analyses undertaken by WP6, while the two experiments together provided data for the first attribution studies with the new system, published in the 2016 annual supplement of BAMS on extreme events. Moving towards a prototype operational system, the original simulations were subsequently extended to near-present during the second year of EUCLEIA and the size of the ensembles was progressively increased to 525 members. In the final phase of the project the HadGEM3-A system has been operating on a seasonal cycle, whereby the model simulations are extended every quarter to represent the climate of the most recent season. Thus EUCLEIA successfully delivered the prototype system proposed in the project's description of work, illustrated in Fig. 7 (in attached document). Data from the multi-decadal simulations and the first extension ensembles were disseminated initially to project partners via the JASMIN workspace and later to the wider scientific community via the CEDA node of ESGF. Making the model output available on this popular data-sharing platform has helped build strong links with leading attribution scientists internationally and has allowed EUCLEIA to participate in, and play a pivotal role in, key research initiatives such as the C20C+ project.


Task 8.2 Delivery of annual assessments of extreme events for the BAMS attribution report

The annual report of BAMS on prominent extreme events of the previous year in the context of climate change and variability provides an excellent means of communicating developments in the science of event attribution. EUCLEIA has made significant contributions to the report since the beginning of the project with 6 papers published in 2014, 9 papers in 2015 and 8 papers in 2016. Contributions to the 2016 issue include the first two attribution studies with the upgraded high-resolution HadGEM3-A system. EUCLEIA’s contributions to the BAMS report (summarised in Table 1 in the attached document) demonstrate a range of methodologies used in, or developed by, EUCLEIA, consider a range of different types of extremes and give examples of different ways of framing attribution questions.

Task 8.3: Delivery of fast response assessments

In addition to the HadGEM3-A system, which can assess the human influence on extreme events up to about a season after they occur, EUCLEIA has also developed a fast-track capability to enable a response on media timescales, e.g. a few days after events. A number of methodologies and data sources have been pooled together to this end, including empirical trend analyses applied to observations and to downscaled model data from regional models, as well as pre-computed estimates of the risk ratio based on global climate models. Before rapid event attribution can be performed routinely, several challenges need to be overcome. Firstly, from an impacts point of view the events analysed often need to be defined on small spatio-temporal scales, for example localised extremes lasting less than 3 days. Global models and methods such as optimal fingerprinting are typically applied at larger scales, and therefore alternative methods that better handle short and small scales were developed jointly with the WWA project and employed in fast-track studies. A second challenge relates to the availability and timely provision of observational data. To meet this challenge, KNMI updates its Climate Explorer website daily (or on demand) with data up to the previous day or the day before. ECMWF (re)analysis data are also used and are often extended five days into the future using forecasts, so that events may be investigated even before they occur. Finally, the collection of model output requires well-tested data extraction programs and can take up to a day. As each event is unique, it may require the definition of a new indicator and the generation and testing of new extraction scripts. Fast-track analyses therefore require focussed work from one or more scientists.

A typical procedure to produce a fast response assessment involves the following steps (a simplified sketch of the core calculation is given after this list).

• Define the event, usually by setting a threshold on a meteorological variable that adequately describes it and its associated impacts (e.g. maximum temperature or rainfall amount). Impact models may also be used for direct attribution of impacts.

• Collect observational and modelled data. Standard evaluation tests are employed to assess whether models are fit for purpose, so that only those deemed reliable are selected. Simple bias correction techniques may be applied at this stage to match the modelled to the observed climate, ensuring that the return time of the extreme event under consideration is the same in both.

• Estimate the change in the likelihood of the event due to anthropogenic forcings by means of the risk ratio, i.e. the ratio of the probability of this class of events in the current climate to the probability in a climate not affected by human influence. This can be done either by deriving the probabilities from two separate ensembles of model simulations for the two types of climate, or by fitting the trend in the probability over the historical record (e.g. using an extreme value distribution with a climate change metric as a covariate) and comparing the present probability with that in some earlier period. The latter can provide simple attribution assessments that use observational data only.

• Synthesise and communicate the results. Provided that the different approaches give a consistent outcome that also agrees with the scientific understanding of the climate system, an attribution statement is formulated. The best way to do this is to tailor the information to different target audiences: a press release that can be understood by the general public, a non-technical summary for science journalists and a scientific text that provides the necessary transparency and traceability of the assessment. The scientific text can also be submitted to a scientific journal, though in future, as the service moves towards operationalisation, this will no longer always be the case.
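The sketch below, referred to above, illustrates the core calculation under simplifying assumptions: a multiplicative bias correction that matches the model's return level for the event to the observed one, followed by a risk-ratio estimate from factual and counterfactual ensembles. It is not the project's operational code, and all data values are synthetic.

```python
# Minimal sketch of a fast-track risk-ratio calculation (not the project's
# operational code): a multiplicative bias correction so the model reproduces
# the observed return level of the event, then the ratio of exceedance
# probabilities with and without human influence.
import numpy as np

def bias_correction_factor(model_values, obs_values, return_period_years):
    """Multiplicative factor so the model's return level at the event's
    return period matches the observed return level."""
    q = 1.0 - 1.0 / return_period_years
    return np.quantile(obs_values, q) / np.quantile(model_values, q)

def risk_ratio(factual, counterfactual, event_threshold):
    """Ratio of exceedance probabilities in the two climates."""
    p1 = np.mean(np.asarray(factual) >= event_threshold)
    p0 = np.mean(np.asarray(counterfactual) >= event_threshold)
    return p1 / p0

# Synthetic illustration: seasonal-maximum rainfall (mm) in two model ensembles.
rng = np.random.default_rng(3)
obs = rng.gumbel(50.0, 12.0, size=100)
factual = rng.gumbel(52.0, 13.0, size=5000)
counterfactual = rng.gumbel(46.0, 13.0, size=5000)
factor = bias_correction_factor(factual, obs, return_period_years=20)
factual = factual * factor
counterfactual = counterfactual * factor   # apply the same correction to both climates
print(risk_ratio(factual, counterfactual, event_threshold=np.quantile(obs, 0.99)))
```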

Fast-track analyses have been produced for a number of events over the course of EUCLEIA. These include the 2016 summer heatwave in Western Europe, Storm Desmond and the Seine and Loire floods, as well as events outside Europe (e.g. the Southern Brazil water shortages in 2014/15) and large-scale extremes (e.g. the record global mean temperature in 2014; this work was highlighted in the WMO statement on the status of the climate).

Figure 8 (in attached document) shows results from the study of the Seine floods, work that started on 3 June 2016 while the event was still ongoing (24 May – 4 June 2016). A multi-method analysis was carried out and the results were communicated on 9 June, while a discussion paper was published later in the month. Results from a trend analysis with observational data are shown in Fig. 8a. Return times for the 3-day basin-averaged extreme precipitation event are larger than can be determined with the observed timeseries alone; setting the shape parameter to zero, a best estimate of 180 years is obtained (uncertainty range of 50-3000 years). The analysis with data from the HadGEM3-A experiments (Fig. 8b) yields a return time of 200 years (uncertainty range of 100-500 years). The risk ratio for the event is found to be 1.9 (1.1 to 3.4) relative to 1960, or 0.95 (0.5-2.2) relative to the natural climate. Using larger ensembles generated with the HadRM3P model (Fig. 8c), the probability of the event is found to have increased relative to the natural world by a factor of 2 (0.6-5). Data from experiments with the EC-Earth model, downscaled with the RACMO2 regional model (Fig. 8d), were also analysed; a risk ratio of 2 (1.3-4.9) was estimated. Finally, an analysis with a subset of the EURO-CORDEX climate projections led to a risk ratio estimate of 1.6 (0.5-4.9). The multi-method study concluded that the event had a return time of a few hundred years. The risk ratios were adjusted where necessary to match conditions in the natural world with the climate of the 1960s. A combined result was obtained by averaging the results from the individual analyses and adding the uncertainties in quadrature, giving a best estimate of the risk ratio of 2.3 (>1.6). The change is equivalent to a 6-7% increase in precipitation intensity, which is consistent with the expected increase in water vapour in a warming world (Clausius-Clapeyron) and with the warming of the Mediterranean and subtropical Atlantic (by almost a degree), which are likely sources of moisture for the event.

Studies like this demonstrate how EUCLEIA has contributed to the development of tools that can provide scientifically reliable early assessments of extreme events within about a week. These can subsequently be complemented by more detailed studies (e.g. based on the HadGEM3-A system described earlier) that investigate in more detail the role of possible key drivers such as the atmospheric circulation, modes of variability and the state of the ocean.
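The sketch below illustrates, under simplifying assumptions, how such independent risk-ratio estimates can be averaged with their uncertainties combined in quadrature (here on the logarithmic scale); the numbers are four of the central values and ranges quoted above for the Seine study, and the published synthesis, which also incorporated the observational estimate and baseline adjustments, may have weighted them differently.

```python
# Minimal sketch of combining independent risk-ratio estimates by averaging on
# the logarithmic scale and adding uncertainties in quadrature. The values are
# four of the estimates quoted above for the Seine flood study; the published
# synthesis also used the observational estimate and baseline adjustments, so
# its exact numbers may differ.
import numpy as np

estimates = {                                  # method: (best, low, high)
    "HadGEM3-A (vs 1960)": (1.9, 1.1, 3.4),
    "HadRM3P":             (2.0, 0.6, 5.0),
    "EC-Earth/RACMO2":     (2.0, 1.3, 4.9),
    "EURO-CORDEX":         (1.6, 0.5, 4.9),
}

log_best = np.array([np.log(b) for b, lo, hi in estimates.values()])
# Approximate each estimate's standard error on the log scale from its quoted range.
log_se = np.array([(np.log(hi) - np.log(lo)) / (2 * 1.96) for b, lo, hi in estimates.values()])

n = len(estimates)
combined_log = log_best.mean()                # simple average of the estimates
combined_se = np.sqrt(np.sum(log_se**2)) / n  # uncertainties added in quadrature
best = np.exp(combined_log)
lower = np.exp(combined_log - 1.96 * combined_se)
print(f"combined risk ratio ~ {best:.1f} (lower bound ~ {lower:.1f})")
```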

Potential Impact:
Potential impact (including socio-economic impact and wider societal implications of the project so far) and the main dissemination activities and exploitation of results
Scientific conferences

EUCLEIA has received extensive publicity at major scientific conferences. It featured heavily in a well-attended Union session on event attribution at the Fall Meeting of the American Geophysical Union in December 2014, and also in a session at the Our Common Future under Climate Change conference held at UNESCO in Paris in July 2015.

EUCLEIA convened a session at the 2016 European Geosciences Union General Assembly, “Detecting and attributing climate change: trends, extreme events, and impacts”.

The project will be well represented at the American Geophysical Union Fall Meeting in December 2016, including two papers on stakeholder engagement.

The Expert Advisory Board (EAB) praised EUCLEIA for being very influential in the development of event attribution science and in the demonstration of a possible operational approach. The EAB noted that EUCLEIA made a significant contribution to the report by the US National Academies of Sciences Committee on Attribution of Extreme Weather Events in the Context of Climate Change, released in March 2016.

Scientific capacity building

EUCLEIA scientists have made important contributions to the development of event attribution science internationally through engagement in international meetings and workshops and contributions to the scientific literature, including review articles and perspective pieces. EUCLEIA scientists have helped build capacity to carry out event attribution assessments in countries that have not traditionally taken a leading role in this area of research, and this collaboration has resulted in the joint publication of papers in the annual BAMS attribution reports with scientists from Brazil, India and China.

Climate policy fora

A UN side event was held at the UNFCCC COP20 meeting in Lima, Peru, in December 2014 to discuss the science of event attribution and its wider implications for impacts and for vulnerable communities in South America; EUCLEIA was highlighted at this event. The meeting was very well attended, with every seat taken and people standing. We also presented the scientific progress made under EUCLEIA in more detail at a second side event in the EU pavilion at the same COP20 meeting.

Soon after the publication of the special BAMS reports in 2014 and 2015, the project and scientific coordinators of EUCLEIA visited UK Government departments to present the studies in the reports relevant to the UK, and also to talk about recent developments in event attribution in the context of EUCLEIA and the relevance of this work to policy. Attribution assessments of the cold spring of 2013 and the catastrophic storms of winter 2013/14 in the UK were presented to the Department for Environment, Food and Rural Affairs and the Department of Energy and Climate Change.

On a European level, the FP7/H2020 C3S workshop in Brussels on 28-29 September 2016 provided an excellent opportunity to showcase EUCLEIA’s work to the European Commission. The project coordinator gave an overview of the project, presented its highlights and main achievements and outlined the future strategy of integrating extreme event attribution into the developing European climate services.

There has been engagement with a number of interested parties. The European Environment Agency has consulted EUCLEIA scientists to understand the changing risk of drought in Europe, and Public Health England has consulted EUCLEIA scientists to better understand the analysis of the 2013 UK heatwave.


Dissemination to the wider public

Research developed by EUCLEIA also featured in the WMO report on the state of the climate in 2014. The report demonstrates how a new fast-track attribution methodology developed by WP8 can be applied to real events. The changing odds of very warm years and seasons in regions across the world are pre-computed over a range of pre-specified temperature thresholds (used to define extreme events), with and without the effect of human influence. Attribution assessments can thus be made as soon as a new event happens. The methodology, described in detail in a peer-reviewed publication, employs optimal fingerprinting to obtain observationally constrained estimates of the global temperature response to external forcings, from which regional information is extracted. Soon after this work was published, 2014 was confirmed to be the warmest year on record, both globally and in the UK. This provided an immediate opportunity to produce the first fast-track attribution assessment based on the pre-computed annual temperature distributions, which subsequently featured as a research highlight in the WMO statement. The global and UK annual mean temperature distributions shown in Fig. 9 (in attached document) are readily available from the new methodology. For the global mean surface temperature, the observed record (vertical black line in Fig. 9a) lies within the red distribution but in the extreme warm tail of the green distribution. This suggests that the record would not have been equalled or broken in a natural climate without the effect of anthropogenic forcings. The UK record of 2014 lies within both distributions, albeit more towards the extreme warm tail of the green distribution (Fig. 9b). It is estimated that human influence has increased the likelihood of record-breaking temperatures in the UK by a factor of ten.
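As a simplified illustration of how pre-computed distributions can be turned into a rapid assessment once a record is observed, the sketch below uses Gaussian placeholder distributions and an assumed record value; the actual methodology derives observationally constrained distributions via optimal fingerprinting rather than the parameters shown here.

```python
# Minimal sketch with Gaussian placeholder distributions (the actual method
# uses observationally constrained distributions from optimal fingerprinting):
# once a new record is observed, compare the chance of equalling or exceeding
# it with and without anthropogenic forcings.
import numpy as np
from scipy.stats import norm

all_forcings = norm(loc=0.55, scale=0.10)  # placeholder annual-mean anomaly distribution, ALL
natural_only = norm(loc=0.00, scale=0.10)  # placeholder annual-mean anomaly distribution, NAT
observed_record = 0.56                     # placeholder observed record anomaly

p_all = all_forcings.sf(observed_record)   # P(anomaly >= record) with human influence
p_nat = natural_only.sf(observed_record)   # P(anomaly >= record) without human influence
print(f"record made roughly {p_all / max(p_nat, 1e-300):.0f} times more likely by human influence")
```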


Every opportunity to publicise the project has been taken when talking with journalists, whether they are writing a general piece about the science or responding to incidences of extreme weather. For example, New Scientist featured the EUCLEIA project in a feature article on event attribution in an August 2014 edition of the magazine (https://www.newscientist.com/article/mg22329842-400-and-now-the-weather-featuring-climate-change-blame/) and The Guardian featured EUCLEIA in an article about extreme weather and climate change in September 2014 (http://www.theguardian.com/environment/climate-consensus-97-percent/2014/sep/02/global-warming-making-weather-more-extreme).

The Economist featured EUCLEIA research in an article on the same topic in May 2015 (http://www.economist.com/news/international/21650552-scientists-are-getting-more-confident-about-attributing-heatwaves-and-droughts-human).

We have also taken opportunities to publicise EUCLEIA in popular articles and web posts written by EUCLEIA scientists. For example, Peter Stott wrote a blog for Carbon Brief in February 2014 that highlights EUCLEIA (www.carbonbrief.org/blog/2014/02/what-climate-change-attribution-can-tell-us-about-extreme-weather-and-the-recent-uk-floods/). Peter Stott also took up an invitation to write a Perspective article for Science, published in June 2016 (Stott, 2016). A list of further popular articles written by EUCLEIA scientists is provided later in the report.


EUCLEIA scientists and stakeholder partners in the UK media sector worked with the UK Science Media Centre towards developing a set of guidelines to support the accurate reporting of extreme events in the news. EUCLEIA was also involved in a communications workshop for weather presenters, organised by the WMO in Paris in 2014, to develop understanding of how to better communicate and report on climate change and extreme event attribution.

EUCLEIA has made a major contribution to the annual BAMS reports explaining extreme events of the previous year from a climate perspective. The science coordinator of EUCLEIA (Peter Stott) is a co-editor of the report, and during the three years of the EUCLEIA project EUCLEIA scientists have contributed to 23 papers published in these reports. These reports, the next of which will be released in December 2016, have in previous years garnered considerable media interest around the world.

List of Websites:
www.eucleia.eu
Scientific representative - Peter Stott - peter.stott@metoffice.gov.uk
Project Manager: Sarah Gooding - sarah.gooding@metoffice.gov.uk