A series of modules has been written that enables OpenDX to read and interpret data in the PRISM data format, i.e. NetCDF following the CF convention. These modules are described in the accompanying documentation. When these modules are properly installed on a system, they appear in the Import category of the OpenDX Visual Program Editor, giving the user easy access to the functionality.
The OASIS3 coupler is a software package allowing synchronized exchanges of coupling information between numerical models representing different sub-systems of the climate system. In particular, OASIS3 performs the transformations needed to express, on the grid of a target model, the coupling fields produced by a source model on its grid. Different transformations and 2D interpolations (Gaussian weighted nearest neighbour, bilinear, bi-cubic, conservative re-mapping, etc.) are available in OASIS3, thanks in particular to its interfacing with the SCRIP 1.4 library. OASIS3 is a portable set of Fortran 77, Fortran 90 and C routines. At run-time, OASIS3 acts both as a separate mono-process executable, whose main function is to interpolate the coupling fields, and as a library linked to the component models, the PRISM Model Interface Library, PSMILe. The models themselves may remain separate executables, keeping their own main options as in the uncoupled mode. To communicate with OASIS3, directly with each other, or to perform I/O actions, the component models need to include a few specific PSMILe calls. OASIS3 PSMILe supports:
- parallel communication between a parallel component model and the OASIS3 main process, based on the Message Passing Interface (MPI);
- direct communication between parallel component models when no transformations are required, also based on MPI;
- automatic sending and receiving actions at appropriate times, with time integration or accumulation of the coupling fields;
- I/O actions from/to files, thanks to the mpp_io library from GFDL.
Flexibility is ensured by the fact that all options for a particular coupled simulation (i.e. the component models, the coupling fields, the coupling sequencing and periods, the interpolations and other transformations) are defined by the user before the run in an external configuration text file.
This configuration file is read at the beginning of the run by OASIS3 and transferred to the model PSMILe libraries; during the run, OASIS3 and the model PSMILe libraries then perform the appropriate actions automatically. OASIS3, and in particular its PSMILe library, is an evolution, realized within the PRISM project, of the OASIS coupler developed and maintained for more than 10 years at CERFACS (Toulouse, France). OASIS3 and its toy coupled model have been compiled and successfully run on Fujitsu VPP5000, NEC SX6, SGI Octane and O3000, IBM Power4, COMPAQ Alpha clusters and Linux PC clusters. OASIS3 has been extensively used in the PRISM demonstration runs and is currently used by approximately 10 climate-modelling groups in Europe, the USA, Canada, Australia, India and Brazil. Sharing standard coupling software between different institutions brings many benefits:
- assembling Earth System Models based on component models sharing common coupling software is technically easier;
- each institution benefits, at relatively low cost, from the central development of the coupler by dedicated IT experts;
- sharing common software increases interaction between the different institutions;
- computer manufacturers are inclined to work on software widely used across the climate modelling community, thereby increasing its portability.
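The "time integration or accumulation of the coupling fields" handled by PSMILe can be illustrated with a small sketch: the model hands over the field every timestep, but data is only released, as a time average, at coupling times. This is a pure-Python illustration under stated assumptions, not the PSMILe implementation; the class and method names (`CouplingField`, `put`) are invented for the example.

```python
# Illustrative sketch of PSMILe-style time accumulation of a coupling
# field.  The model calls put() every timestep; the field is only
# "sent" (here: returned) at multiples of the coupling period, as a
# time average.  Names are hypothetical, not the real PSMILe API.

class CouplingField:
    def __init__(self, coupling_period):
        self.coupling_period = coupling_period  # in model time units
        self.accum = None                       # running sum of the field
        self.nsteps = 0                         # timesteps accumulated

    def put(self, values, model_time):
        """Accumulate; return the time-averaged field at coupling times."""
        if self.accum is None:
            self.accum = [0.0] * len(values)
        self.accum = [a + v for a, v in zip(self.accum, values)]
        self.nsteps += 1
        if model_time % self.coupling_period == 0:
            avg = [a / self.nsteps for a in self.accum]
            self.accum, self.nsteps = None, 0   # reset for next interval
            return avg                           # would be sent via MPI
        return None                              # nothing sent this step

# A model with a 1-hour timestep coupled every 3 hours:
field = CouplingField(coupling_period=3)
sent = [field.put([10.0], t) for t in (1, 2, 3)]  # -> [None, None, [10.0]]
```

The point of this design is that the sending model does not need to know the coupling frequency in its own code; it simply calls `put` every step, and the interface library decides when an exchange actually happens.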
The PRISM SRE handbook gives a detailed description of the PRISM Standard Running Environment. It addresses SRE users as well as developers. The 40 pages contain several tables and figures. As the SRE is still evolving, the handbook will be updated regularly. The current version (1st edition) of the handbook describes the SRE release prism_2-4. It is available on the web in HTML format or as a PDF document.
The Max Planck Institute for Meteorology (MPI-Met) in Hamburg, Germany, has developed global and regional climate models of the atmosphere, the ocean and the cryosphere, as well as models describing biophysical and biogeochemical processes. All these models represent important components of the Earth System. The PRISM project has been established to enhance the efficiency of Earth system modelling through the development of a common coupling software infrastructure under standardised coding conventions. The software comprises the coupler OASIS3 and its associated libraries for grid and time transformation of exchange data, for model I/O in netCDF format (CF metadata convention) and for time control of the experiments. In addition, standardised and portable compilation and running environments (SCE and SRE) have been developed to facilitate the compilation and execution of coupled models. As one result of PRISM, the MPI-Met models have been adapted to the PRISM environments: the component models ECHAM5 (atmosphere), MPI-OM (ocean and ice), HAMOCC (bio-geo-chemistry) and MOZART (chemistry) are adapted to the PRISM Standard Compile Environment (SCE). The atmosphere model and the ocean model are coupled via the PRISM coupler OASIS3. Appropriate calls to the PSMILe library are implemented in the models' source codes. Several coupled configurations of the MPI-Met component models have been adapted to the Standard Running Environment (SRE), e.g. MPI-AO (ECHAM5 + MPI-OM), MPI-OB (MPI-OM + HAMOCC) and MPI-AOB (ECHAM5 + MPI-OM + HAMOCC). The MPI-Met coupled model configuration MPI-AO (also known as ECHO) was used as a prototype at the PRISM Training Session held in June 2003 in Hamburg. It was used to demonstrate the PRISM environments and the coupler OASIS3, as well as the implementation of the PSMILe calls in state-of-the-art GCMs.
The IPSL_CM4 Earth system climate model was adapted to the PRISM infrastructure and used for demonstration runs. The coupled climate configuration called IPSL_CM4, used at the Pierre-Simon Laplace Institute (IPSL), is composed of the oceanic component OPA, the sea ice component LIM, the atmospheric component LMDZ and the land surface component ORCHIDEE, coupled through the OASIS coupler. This document gives some information about this configuration and its adaptation to the PRISM system. The first two parts give a brief description of this coupled model and its components. The third part is more technical and focuses on the adaptation of the model sources. Finally, the fourth part is a step-by-step guide to help the user compile and run IPSL_CM4 with the PRISM environment.
The Earth System Modelling community in Europe is organised in the European Network for Earth System Modelling, ENES (http://www.enes.org). A major challenge for the climate research community is the development of comprehensive Earth system models capable of simulating natural climate variability and human-induced climate changes. Such models need to account for detailed processes occurring in the atmosphere, the ocean and on the continents, including physical, chemical and biological processes on a variety of spatial and temporal scales. They also have to capture complex non-linear interactions between the different components of the Earth system and assess how these interactions can be perturbed as a result of human activities. The PRISM infrastructure is improving the European capability in this area to a large extent. Accurate scientific information is required by government and industry to make appropriate decisions regarding our global environment, with direct consequences on the economy and lifestyles. It is therefore the responsibility of the scientific community to accelerate progress towards a better understanding of the processes governing the Earth system and towards the development of an improved predictive capability. An important task is to develop an advanced software and hardware environment in Europe, under which the most advanced high-resolution climate models can be developed, improved, and integrated. Aspects of the improvement of the hardware environment are described in this result, which is a draft of a brochure that was planned for wider distribution within ENES and beyond. It contains descriptions of the questions considered to be politically and scientifically important, and of the methods to approach answers to them, which result in the need for a large-scale European Climate Computing Facility.
An architecture for such a facility is then proposed, considering general requirements, GRID technologies, compute power, data storage, wide-area network connectivity, support services, staffing and site requirements, and the procurement process. It closes with considerations about funding and management. Since this topic is a politically difficult one, a group of people is currently revisiting this draft and plans to come up with a more comprehensive and appropriate version in the course of 2005.
The PRISM system is an infrastructure that enables users to perform numerical experiments coupling interchangeable model components, e.g. atmosphere, ocean, biosphere, chemistry etc., using standardised interfaces. The architecture of the infrastructure provides the means to configure, submit, monitor and subsequently post-process, archive and diagnose the results of these coupled model experiments remotely. The design is a centralised architecture that minimises the administration and duplication of resources. This design is achieved using a Web Services System providing applications over the Internet. This Web Services System makes use of web servers, application servers, resource directories, discovery mechanisms and message services in cooperation with Java clients and servers. Technical and scientific results: the result is the graphical user interfaces and the infrastructure that allow the user to remotely configure a coupled model run and then deploy the configuration to a remote site where the PRISM infrastructure is installed. A PRISM web security system is available for use with the Web Services System. Innovative features:
- portable system for many platforms;
- a system for remote use;
- graphical user interfaces assisting in the efficient use of the system;
- modularity allowing for different levels of adaptation and use;
- out-of-the-box installation and deployment;
- views of the graphical interfaces can be seen in SMS, PrepIFS and PrepOASIS.
Scientific benefits: the use of the infrastructure allows the modellers to concentrate on the modelling, knowing that the technical difficulties with deployment and monitoring have been taken care of. Social benefits: the use of the same system and of remote systems means that scientists can do less travelling and more science.
Economic benefits: the efficient use of large or small computing services through pre-installed environments and remote access will save time and money, as the graphical user interfaces require less support and administration and produce correct technical solutions. Current status: two computing centres have implemented the full Web Services System.
A central CVS repository was installed at CSCS on the machine bedano. It is organised according to the PRISM standard directory tree. It contains the source code of all PRISM models that have been adapted to the Standard Compile and Running Environments (SCE and SRE). Besides, tools for model compilation and execution as well as input data tar-files are included. Within the project phase, the models and scripts have been freely available to all PRISM members. To facilitate the download, CVS modules have been defined for all coupled model combinations. The module names are formed from the coupled model name in capital letters. For each coupled model combination, three modules have been defined. The first module (extension SRC) contains the source code of the component models and libraries as well as the scripting utilities for compilation and execution. The second module (extension DATA) embraces the input data needed for a coupled model run. A third module (no extension) combines all of the above. This allows the user to download all that is needed to compile and run a coupled model experiment by typing a single command. The sources combined as a module are tested in that coupled configuration on several PRISM sites. The repository is available from several remote machines via direct connection (pserver method). Besides, a web interface was installed. Standards for the tagging of source code and scripts have been developed. The commitment of model and library source code to the bedano repository was done in close cooperation with the developing institutes.
A structure for coupling tiled land surfaces with the planetary boundary layer has been proposed; an article describes the proposal. Different models (ORCHIDEE, ISBA, ...) started joining this structure during the PRISM project, and work still continues.
The PRISM project has been established to enhance the efficiency of Earth System modelling through the development of a common coupling infrastructure for all aspects of Earth System modelling. An important part of the infrastructure is the PRISM software. It comprises the PRISM couplers (OASIS3 and OASIS4), associated libraries, as well as standardised and portable environments for model compilation and execution (SCE and SRE). Besides, tools for post-processing and visualisation of model output data have been developed. Depending on the user's preferences, it is possible to set up, run and monitor an experiment through a graphical user interface (GUI) or at the scripting level. Many state-of-the-art European component models have been adapted to the PRISM standards. The list comprises models for all components of the Earth System, i.e. atmosphere, chemistry, vegetation, ocean, cryosphere and bio-geo-chemistry models. A number of combinations of these component models have been assembled to run coupled configurations. Experiments have been performed on diverse platforms, e.g. NEC-SX, SGI-Origin, Fujitsu-VPP, IBM-Power4, CRAY X1 and different Linux platforms. Tables of the component models, of the coupled model configurations and of the platforms that have been integrated into the PRISM system are available on the web at http://prism.enes.org. The PRISM infrastructure allows for easy exchange of component models or model results between different scientists and European research centres. This was achieved through a high level of modularity, portability and flexibility. Besides, all components of the system are well documented and easy to use. Standards have been defined for the interfaces between different models and between the models and the couplers. This greatly fosters the exchange of individual components within a coupled Earth System model.
One important aspect is the definition of a common data format (netCDF/CF) for model I/O as well as for the post-processing and visualisation tools. This enables experiment and model inter-comparison with minimum effort. Besides the technical achievements, one important success of PRISM is that it has brought the different partners of the European Earth system research community to interact and work closely together. This led to invaluable trust building, naturally opening up into scientific co-operation and co-ordination. The PRISM climate research network comprises more than 20 partners from ten different European countries, among them four computer manufacturers. Strong collaboration was established with related projects in the US (Earth System Modelling Framework, ESMF), Japan (Earth Simulator Centre) and elsewhere (FLUME project of the UK Met Office, COSMOS project at MPI-MET). The PRISM infrastructure enables scientists and institutions to share the development, maintenance and support of a comprehensive Earth System modelling software environment. This not only minimises development costs but also enhances co-operation between the different research centres, as all state-of-the-art component models integrated into the PRISM system use common standards and offer a common look-and-feel. The computer manufacturers contribute to the infrastructure, as they foster the efficiency of the PRISM software (porting, optimisation) on a variety of platforms. This allows the scientists to focus on Earth System science. On the other hand, the Earth System modelling community will be able to influence new-generation computing platforms towards its needs.
This document presents the details of the PRISM TOYCLIM coupled model, using the OASIS3 coupler. TOYCLIM is a toy coupled model, i.e. a coupled model involving component models with no real physics but reproducing a realistic coupling exchange (parallel decomposition, size and number of coupling fields, interpolation and other operations, frequencies of coupling or I/O, etc.), and therefore testing the coupler functionality. TOYCLIM is a coupling between 3 different component models, two of which have the same numerical grid. It tests different OASIS3 transformations and interpolations needed to express, on the grid of a target model, the coupling fields produced by a source model on its grid. TOYCLIM also shows a practical example of how to communicate with OASIS3 or perform I/O actions via a few specific calls to the PRISM System Model Interface Library, PSMILe. OASIS3 and its toy coupled model have been compiled and successfully run on Fujitsu VPP5000, NEC SX6, SGI Octane and O3000, IBM Power4, COMPAQ Alpha clusters and Linux PC clusters, and have been used in the PRISM demonstration runs.
In the regional work package WP3h, an interface connecting a GCM and an RCM has been developed on the basis of the PRISM system. Fields are exchanged at the source grid resolution and interpolation is carried out in the traditional manner, i.e. within the receiving model. Generally, this set-up could be used to exchange initialisation and coupling fields in both directions. In practice, we start with a one-way coupling whereby the "global model" is represented by a program that reads in global (or a subset of) data fields (the "pretending GCM"). The communication is established by OASIS, whereas the actual data exchange during runtime is handled by the PSMILe interface, which is installed on both sides, i.e. in the pretending GCM and in the RCM. The pretending GCM can easily be replaced by a real GCM.
A new fully parallel coupler for Earth System Models (ESMs), OASIS4, has been developed within the European PRISM project. OASIS4's main functional parts are the Driver, the Transformer, and the PRISM System Model Interface Library, PSMILe. During the run, the OASIS4 Driver first extracts the configuration information defined by the user in the XML files and then organizes the process management of the coupled simulation. The OASIS4 Transformer performs, in a fully parallel mode, the re-gridding needed to express, on the grid of a target model, a coupling field provided by a source model on its grid. An ESM coupled by OASIS4 consists of different applications (or executables), whose execution is controlled by OASIS4. Each ESM application may host one or more climate component models (e.g. models of the ocean, sea ice, atmosphere, etc.). To interact with the rest of the ESM at run-time, the component models have to include specific calls to the OASIS4 PSMILe. Each application and component model must be provided with XML files that describe its coupling interface established through PSMILe calls. The configuration of one particular coupled ESM simulation, i.e. the coupling and I/O exchanges that will be performed at run-time between the components or between the components and disk files, is done externally by the user, also through XML files. The OASIS4 PSMILe, linked to the component models, includes the Data Exchange Library (DEL), which performs the MPI-based (Message Passing Interface) exchanges of coupling data, either directly or via additional Transformer processes, and the GFDL mpp_io library, which reads/writes the I/O data from/to files using the NetCDF format. At the beginning of the run, the OASIS4 PSMILe also performs, for the coupling fields that need re-gridding between their source and target grids, a parallel neighbourhood search based on a multi-grid algorithm. Other MPI-based parallel couplers performing field transformations exist.
The originality of OASIS4 lies in its great flexibility, as the coupling and I/O configuration is externally defined by the user in XML files; in its parallel neighbourhood search, based on the geographical description of the process-local domains; and in its common treatment of coupling and I/O exchanges, both performed by the PSMILe library. OASIS4 functionality and scalability have been demonstrated with different "toy" models. A "toy" model is an empty model in the sense that it contains no physics or dynamics; it reproduces, however, a realistic coupling in terms of the number of component models, the number, size and interpolation of the coupling or I/O fields, the coupling or I/O frequencies, etc. OASIS4 "toy" models were run on different platforms, also showing its portability: SGI Origin and ALTIX, NEC SX6, AMD Athlon PC cluster, Fujitsu AMD Opteron PC cluster, and IBM Power4. OASIS4 has also been used to realize the coupling between the MOM4 ocean model and a pseudo atmosphere model reading appropriate forcing fields. As the climate modelling community is progressively targeting higher-resolution climate simulations, run on massively parallel platforms, with coupling exchanges involving a larger number of (possibly 3D) coupling fields exchanged at a higher coupling frequency, the key design concepts of parallelism and efficiency drove the OASIS4 developments, while keeping in its design the concepts of portability and flexibility that made OASIS3 successful. Furthermore, the OASIS4 PSMILe Application Programming Interface (API) was kept as close as possible to the OASIS3 PSMILe API. This should ensure a smooth and progressive transition from OASIS3 to OASIS4 in the climate modelling community.
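The neighbourhood search described above determines, for each target grid point, which source grid points will feed the re-gridding. OASIS4 does this in parallel with a multi-grid algorithm; the brute-force sketch below only illustrates *what* is computed, not *how* OASIS4 computes it, and all names in it are invented for the example.

```python
# What a neighbourhood search computes, in the simplest possible form:
# for each target grid point, the indices of the nearest source grid
# points.  OASIS4 uses a parallel multi-grid algorithm; this O(n)
# brute-force version is illustrative only.
import math

def nearest_neighbours(target_pt, source_pts, n=1):
    """Return indices of the n source points closest to target_pt."""
    def dist(p):
        return math.hypot(p[0] - target_pt[0], p[1] - target_pt[1])
    return sorted(range(len(source_pts)),
                  key=lambda i: dist(source_pts[i]))[:n]

# A 2x2 source grid; the target point sits near its right edge:
source = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
print(nearest_neighbours((0.9, 0.2), source, n=2))  # -> [1, 3]
```

In a real coupler the expensive part is doing this search once, in parallel, over each process's local domain; the resulting neighbour lists are then reused at every coupling exchange.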
Two ocean general circulation models were adapted to the PRISM software: OPA (version 8.2, from IPSL, Paris), for the ocean-atmosphere interface to OASIS3 only, and MPI-OM (from MPI-M, Hamburg). Dissemination and use potential: available from the PRISM bedano CVS server. Current status and use:
- the ECHAM5/MPI-OM PRISM set-up is now used for the IPCC runs at MPI-M in Hamburg;
- the OPA System Team has announced its intention to release the OPA9 version within the PRISM SCE environment if possible (final decision in February 2005, if evolutions of the PRISM SCE are suitable);
- ECMWF in Reading plans an extensive use of OPA9 and the PRISM environment for its seasonal forecast activities.
Expected benefits: easier assembly of ESMs; increased scientific collaboration in ESM.
The MPI-Met PRISM Earth System Model Adaptation Guide describes how the component models developed at the Max Planck Institute for Meteorology in Hamburg, Germany (MPI-Met) are adapted to the PRISM software. This includes the implementation of the coupling software (OASIS3, PSMILe) into the models' source codes and the adaptation to the Standard Compile Environment (SCE) and the Standard Running Environment (SRE). It explains how the system can be used to run coupled model experiments and how to extend it. The Adaptation Guide addresses model users as well as developers. The 66 pages contain several tables and figures. As the MPI-Met Earth System component models as well as the PRISM coupler and the environments are evolving, the handbook will be updated regularly. The current version (1st edition) describes the PRISM system release prism_2-4. It is available on the web in HTML format or as a PDF document.
This document presents the OASIS4 coupler, a fully parallel software package allowing synchronized exchanges of 3D coupling information between numerical models representing different sub-systems of the climate system. OASIS4's main functional parts are the Driver, the Transformer, and the PRISM System Model Interface Library, PSMILe. The OASIS4 Driver and Transformer are first presented. To communicate with OASIS4 or to perform I/O actions, the component models need to include a few specific calls to the PRISM System Model Interface Library, PSMILe, which are then described. Each component model must be provided with XML files that describe its coupling interface established through PSMILe calls. The configuration of one particular coupled ESM simulation is also done externally by the user through XML files. More information on those XML files is then provided. Finally, some explanations on how to compile OASIS4 and run toy coupled models using OASIS4 are given. Results of scalability tests performed with those toy models are also presented.
The PRISM Standard Running Environment (SRE) is a central aspect of the PRISM infrastructure. It is closely related to the PRISM Standard Compiling Environment (SCE). It provides a common and user-friendly frame for the execution of climate modelling experiments and defines standards for many aspects of Earth System modelling, beginning with the organisation of initial data and ending with the post-processing and visualisation of experiment results. Because of the large number of models and platforms used within the European climate modelling community, and taking into consideration the quick development of both software and hardware, the SRE is designed in a flexible and open way. The environment is easily extendable to accommodate new models and platforms. An important aspect of the SRE is a well-defined Unix directory tree. It provides room for all that is needed to run a coupled model experiment: source code of the component models (including the coupler), in- and output data and executables of several independent experiments that might run simultaneously, as well as the compiling and runtime utilities. Scripts for model execution are specific to the component model, the coupled model constellation and the platform the model is supposed to run on. The SRE does not comprise ready-to-use scripts, but provides a comprehensive set of utilities to generate standardised scripts specific to the model and to the user's platform. The scripts are assembled from a base of small files, called header files, containing script code fragments. These fragments are specific to a model, a platform, or both, or they can be used for all models on all platforms. The method allows for easy adaptation to newly coupled models or new platforms, as model- and site-dependent sections are clearly identified. Besides, the maintenance effort is small, as there is little redundant code. The scripts generated within the SRE (i.e. scripts for model integration, data pre- or post-processing, visualisation, and archiving of output data) give a common look & feel for every model adapted to the PRISM infrastructure. This minimises the effort to set up and run coupled model experiments. The standards also help in designing and running new coupled models and facilitate porting activities to new platforms. Cooperation between different centres and scientists is facilitated. Once a model is integrated in the SRE, it profits from easy portability to all other PRISM platforms, support from the PRISM Team, and future updates of the system.
This document presents the OASIS3 coupler software, which allows synchronized exchanges of coupling information between numerical models representing different sub-systems of the climate system. To communicate with OASIS3 or to perform I/O actions, the component models need to include a few specific calls to the PRISM Model Interface Library, PSMILe, which are first described. The OASIS3 configuration file, in which the user defines all coupling and I/O parameters and the transformations required for each coupling field, is then presented. The different transformations and 2D interpolations (Gaussian weighted nearest neighbour, bilinear, bi-cubic, conservative re-mapping, etc.) available in OASIS3 to express, on the grid of a target model, the coupling fields produced by a source model on its grid, are detailed. All additional auxiliary data files required by OASIS3 are also described. Finally, some explanations on how to compile OASIS3 using the PRISM Standard Compiling Environment and run a coupled model with OASIS3 using the PRISM Standard Running Environment are given. The different annexes of the document describe the different types of grids supported by OASIS3, the changes between the different versions, and the Copyright statement, and list the coupled models realized with OASIS.
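The 2D interpolations listed above all reduce, at run-time, to the same operation: each target cell value is a weighted sum of source cell values, with the weights computed once from the two grid geometries (this is the approach of the SCRIP library that OASIS3 interfaces with). The sketch below is an illustration of applying such weights, with invented numbers, not OASIS3 code.

```python
# Re-gridding as a sparse matrix-vector product: the weights, computed
# once per grid pair (e.g. by a SCRIP-style algorithm), are applied to
# every exchanged field.  Illustrative sketch with invented values.

def apply_remap(weights, src, n_dst):
    """weights: list of (dst_index, src_index, weight) triplets."""
    dst = [0.0] * n_dst
    for i_dst, i_src, w in weights:
        dst[i_dst] += w * src[i_src]
    return dst

# Bilinear-style weights mapping a 2-cell source field onto 1 target cell:
weights = [(0, 0, 0.25), (0, 1, 0.75)]
print(apply_remap(weights, src=[4.0, 8.0], n_dst=1))  # -> [7.0]
```

Separating weight computation from weight application is what makes the per-timestep cost of coupling small: whether the scheme is bilinear, bicubic or conservative only changes the weight triplets, not the exchange loop.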
The Standard Compile Environment (SCE) is a central aspect of the PRISM infrastructure. It is closely related to the PRISM Standard Running Environment (SRE). It provides a common and user-friendly frame for source code management and model compilation and defines standards for many aspects of Earth System modelling experiments. Because of the large number of models and platforms used within the European climate modelling community, and taking into consideration the quick development of both software and hardware, the SCE is designed in a flexible and open way. The environment is easily extendable to accommodate new models and platforms. An important aspect of the SCE is a well-defined Unix directory tree for the component models (including the coupler), libraries, compiler output and the compiling and runtime utilities. The structure of the model source code directory tree is kept as simple as possible. Models adapted to this structure can be compiled with the portable tools provided. Scripts for model compilation are specific to the component model, the coupled model constellation and the platform the model is supposed to run on. The SCE does not comprise ready-to-use scripts, but provides a comprehensive set of utilities to generate standardised scripts specific to the model and to the user's platform. The scripts are assembled from a base of small files, called header files, containing script code fragments. These fragments are specific to a model, a platform, or both, or they can be used for all models on all platforms. The method allows for easy adaptation to newly coupled models or new platforms, as model- and site-dependent sections are clearly identified. Besides, the maintenance effort is small, as there is little redundant code. Compile scripts generated within the SCE give a common look & feel for every model adapted to the PRISM infrastructure. This minimises the effort to set up coupled model experiments.
The standards also help with designing new coupled models and facilitate porting activities to new platforms. Cooperation between different centres and scientists is facilitated. Once a model is integrated in the SCE, it profits from easy portability to all other PRISM platforms, support from the PRISM Team, and future updates of the system.
At the end of PRISM's main sections (definition, development, assemblage and demonstration phases), the results achieved so far have been subject to a critical evaluation. This is important to keep track of the development in such a complex project and to maintain quality standards. The coupler (workpackage 3a), the prototypes of the PRISM component models (WP3b-h), the visualisation and diagnostic system (WP4a) and the architecture (WP4b) have all been evaluated and reported upon in the initial review of system status (WP2a3). The last twelve months of the project saw developments towards the final, fully PRISM-compatible system and demonstrations on a multitude of platforms and with several components, in order to show that the system actually works under a number of different configurations. A detailed final report has been prepared and can be downloaded at http://prism.enes.org. This report describes the PRISM framework, the coupling, compiling and running environments, the assembled models including their quality assessment, the PRISM repository, the graphical user interface and web services, data analysis and visualisation and the setup and results of the demonstration runs, as well as community activities. The main parts of PRISM are the coupler and the component models. These components are available as envisaged in the Description of Work. All deliverables were fulfilled during the report period in a timely manner and with the required quality standard. It would, however, in principle be desirable to include more model combinations, to let more component models join the common PRISM effort. Such a task would clearly not have been possible within the limited time frame of the project. Further PRISM follow-up activities are therefore highly encouraged to ensure that Europe keeps its leading role in Earth System Modelling. The main outcome of WP2a is the final report.
This report will, together with other reports issued by the PRISM community, serve as a reference for development activities (via the REDOC and ARCDI documents produced earlier under this WP), and it will enable users to install, initialise and run the PRISM system on their platforms. The envisaged wide distribution throughout the community will greatly enhance the expected benefits.
COCO (CDMS overloaded for CF Objects) is a data processing library which supports the chosen PRISM data formats. It is written in Python and is based on the Climate Data Management System (CDMS). CDMS is an object-oriented data management system, specialized for organizing the multidimensional, gridded data used in climate analysis and simulation. The accompanying documentation describes the library in more detail.
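The value of such an object-oriented data model is that a variable carries its coordinate axes and CF-style metadata with it, so subsetting can be expressed in world coordinates rather than array indices. The sketch below illustrates this access pattern using only the Python standard library; the class and method names are hypothetical stand-ins, not the actual CDMS or COCO API.

```python
# Illustrative sketch of the object-oriented, grid-aware variable model
# that CDMS provides. All names here are hypothetical, not the cdms2 API.

class Axis:
    """A named coordinate axis (e.g. latitude) with coordinate values."""
    def __init__(self, name, values, units):
        self.name = name
        self.values = list(values)
        self.units = units

class GridVariable:
    """A 1-D variable carrying its axis and CF-style attributes."""
    def __init__(self, data, axis, attributes):
        self.data = list(data)
        self.axis = axis
        self.attributes = dict(attributes)  # e.g. units, standard_name

    def subset(self, lo, hi):
        """Select by coordinate value (inclusive), mimicking selection
        in world coordinates rather than by raw array index."""
        pairs = [(c, v) for c, v in zip(self.axis.values, self.data)
                 if lo <= c <= hi]
        coords = [c for c, _ in pairs]
        vals = [v for _, v in pairs]
        return GridVariable(
            vals, Axis(self.axis.name, coords, self.axis.units),
            self.attributes)

# A toy zonal-mean temperature field on a latitude axis.
lat = Axis("latitude", [-60, -30, 0, 30, 60], "degrees_north")
tas = GridVariable([250.0, 270.0, 300.0, 272.0, 251.0], lat,
                   {"standard_name": "air_temperature", "units": "K"})
tropics = tas.subset(-30, 30)
print(tropics.axis.values)   # [-30, 0, 30]
print(tropics.data)          # [270.0, 300.0, 272.0]
```

Note how the subset keeps its axis and metadata intact; this is the property that lets downstream tools label plots correctly without extra bookkeeping.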
Based on a set of short integrations, we have proved the feasibility of coupling diverse model combinations as well as some technical functionalities of the PRISM system. In some cases, the demonstrations started at the assembly phase, i.e. we demonstrated that two components that came from very different software environments can be assembled easily, with few changes to the codes themselves. These results will help researchers to exchange model components easily and to exploit the modularity of their Earth system models.
The PRISM SCE handbook gives a detailed description of the PRISM Standard Compilation Environment. It addresses SCE users as well as developers. The 66 pages contain several tables and figures. As the SCE is still evolving, the handbook will be updated regularly. The current version (1st edition) of the handbook describes the SCE release prism_2-4. It is available on the web in html format or as a pdf document.
Tools and techniques for processing and visualizing climate model data have been reviewed, extended and developed to prove that they can meet the wide range of requirements of climate modellers. Demonstration applications have been created that prove that data can be stored, read, processed and displayed in a variety of ways using the chosen tools. For processing data, a package called coco has been developed which builds on the existing CDAT package. For high quality interactive visualisation, two packages, OpenDX and VTK, have been examined. OpenDX incorporates a visual programming editor that enables visualisation applications with graphical user interfaces to be developed. VTK has a Python scripting interface that enables it to be used from within scripts or from a command line; coupled with the Qt library, graphical applications have also been developed. Finally, an architecture for the system has been designed and a prototype has been developed for monitoring model runs by automatically producing plots as the model progresses. This system uses coco and VCS to generate plots on a periodic basis. Consideration has been given to ways of unifying these different packages. The ParaGen system defines standard ways for writing scripts and describing their interfaces through use of an XML schema. The XML colour editor provides ways of defining standard colour tables that can be used by different graphics packages to help ensure that plots from different packages use similar colour scales if required.
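To make the shared-colour-table idea concrete, the sketch below shows how such a table might be expressed in XML and read into a package-neutral list of RGB triples. The element and attribute names are hypothetical; the actual schema used by the PRISM XML colour editor is not reproduced here.

```python
# Hypothetical colour-table XML and a reader that converts it into
# normalised RGB triples usable by any graphics package.
import xml.etree.ElementTree as ET

COLOUR_TABLE_XML = """
<colourtable name="temperature">
  <colour r="0"   g="0"   b="255"/>
  <colour r="255" g="255" b="255"/>
  <colour r="255" g="0"   b="0"/>
</colourtable>
"""

def load_colour_table(xml_text):
    """Parse the XML and return (name, [(r, g, b), ...]) with components
    normalised to the 0.0-1.0 range most graphics packages accept."""
    root = ET.fromstring(xml_text)
    colours = [(int(c.get("r")) / 255.0,
                int(c.get("g")) / 255.0,
                int(c.get("b")) / 255.0)
               for c in root.findall("colour")]
    return root.get("name"), colours

name, colours = load_colour_table(COLOUR_TABLE_XML)
print(name, colours[0])  # temperature (0.0, 0.0, 1.0)
```

Because every package reads the same file, a blue-white-red temperature scale rendered in OpenDX, VTK or VCS would use identical colours, which is exactly the consistency the colour editor is intended to provide.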
VTK is one of the software packages examined for its potential as a visualisation tool for PRISM model data. An application called VTK_Mapper was developed to prove that VTK could provide a high quality, interactive graphics capability. This work required a significant amount of research into VTK and into other tools such as Qt, which was used for building a graphical user interface for the application. The work has been documented in the accompanying report, which also includes pointers to VTK documentation.
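VTK's central abstraction is a demand-driven pipeline: a data source feeds one or more filters, whose output a mapper turns into renderable primitives, with data pulled through the chain only when rendering is requested. The pure-Python sketch below illustrates that pattern; the class names are hypothetical stand-ins, not the vtk module's own API.

```python
# Pure-Python sketch of the source -> filter -> mapper pipeline pattern
# on which VTK is built. Class names are illustrative, not the vtk API.

class Source:
    """Produces raw data, e.g. values read from a model output file."""
    def __init__(self, values):
        self._values = values
    def output(self):
        return list(self._values)

class ScaleFilter:
    """A filter stage: transforms its input when output is requested."""
    def __init__(self, upstream, factor):
        self._upstream = upstream
        self._factor = factor
    def output(self):
        return [v * self._factor for v in self._upstream.output()]

class Mapper:
    """Maps data values onto renderable primitives (here, just strings)."""
    def __init__(self, upstream):
        self._upstream = upstream
    def render(self):
        return ["cell=%.1f" % v for v in self._upstream.output()]

# Assemble the pipeline; nothing executes until render() pulls data
# through the chain, mirroring VTK's lazy update mechanism.
src = Source([1.0, 2.0, 3.0])
scaled = ScaleFilter(src, 10.0)
mapper = Mapper(scaled)
print(mapper.render())  # ['cell=10.0', 'cell=20.0', 'cell=30.0']
```

This separation of stages is what allowed VTK_Mapper to swap data sources and display options independently behind its Qt interface.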
Recognising the need for shared software infrastructure, the European Network for Earth System Modelling (ENES) organised the PRISM project, which was funded for 3 years (starting December 2001) by the European Union under the 5th Framework Programme. The PRISM project gathered 22 partners, including the main European climate modelling institutions and four computer manufacturers. It had an overall budget of 4.8 million euro, corresponding to a total effort close to 80 person-years. One main objective of PRISM is to provide a portable, user-friendly, flexible and standards-based infrastructure for assembling, compiling, running, monitoring and post-processing Earth System Models built on state-of-the-art component models developed in the different European modelling groups. Today, PRISM provides as standard software: - Standard coupler and I/O software, OASIS3; - A standard compiling environment (SCE) at the scripting level; - A standard running environment (SRE) at the scripting level; - Graphical User Interfaces (GUIs) to configure the SCE and SRE; - A GUI for end-to-end monitoring of climate experiments; - Standard diagnostic and visualisation tools. Although PRISM was designed as a demonstration project, its technical value is already recognised by many European research groups. First user experiences show that using the PRISM system eases the assembly, compilation and running of complex component models via the use of PRISM standards. Some of the ESM configurations described above are starting to be used both for local scientific projects and wider community undertakings (IPCC runs, ENSEMBLES, GEMS FP6 projects). Besides those technical achievements, one important success of PRISM is that it has brought the different partners of the European Earth system research community to interact and work closely together. This led to invaluable trust building, naturally opening up into scientific co-operation and co-ordination.
Today, this closely co-ordinated network of experts (IT specialists, climate scientists, computer manufacturers) is ready to go one step further by providing a routinely maintained state-of-the-art software infrastructure for the Earth System Modelling community. The PRISM software described above needs to be maintained and constantly improved to fit the evolving needs of the Earth system modelling community. The system is also intended to integrate progressively more component models and data archive infrastructures, to be implemented at additional sites, and to serve a wider community. Sustained staff and financial support is therefore needed to ensure the continued and co-ordinated maintenance of PRISM, together with an adequate level of user support, both required to guarantee a growing community buy-in and trust-building. Without sustained support, it seems inevitable that the PRISM software will diverge over time. Local support teams are unlikely to be able to develop the same level of expertise across the whole software system, and any result is likely to be more expensive and/or of lower standard than would be available from an expert co-ordinated team, as proposed here. Sustained support is also of key importance to attract highly qualified experts, draw additional EU and other temporary funding, and work towards convergence of the software infrastructures used in climate research and related fields of expertise, such as impact studies, scientific data assimilation and operational forecasting.
Several meetings resulted in the launch of the PRISM Support Initiative (PSI) in October 2004, one month before the end of the EU FP5 contract, in which about 10 institutions provide close to 10 py/y of sustained effort to: - Co-ordinate improvement, maintenance and user support of the current PRISM software; - Support adaptation of other component models to PRISM technical standards; - Install the PRISM software environment at additional computing sites; - Prepare for the future by seeking additional funding and proposing development strategies; - Ensure coordination with related international projects (ESMF, FLUME). The first meeting of the steering board is planned for April 2005. More details are available at http://www.prism.enes.org/sustained
Tools and techniques for processing and visualizing climate model data have been reviewed, extended and developed to prove that they can meet the wide range of requirements of climate modellers. This result points the way forward for how these tools need to be adopted and adapted to meet the current and future requirements of climate scientists.