last modified Apr 10, 2018 02:48 PM

Production Phase

Data request, post processing of raw model output, CMIP6 compliant formatting

The important workflow steps of the production phase are explained below. They are highlighted with red circles and numbered in the following figure. The complete figure is explained in Data Workflows.

Figure: In-depth production phase workflow

Step 1:

The CMIP6-endorsed MIPs have defined a number of experiments in order to study certain processes of the Earth system. To enable the envisaged studies, a MIP may ask its participants to provide a number of variables, both from the experiments it has defined itself and from experiments defined by other MIPs. Modelling groups wanting to engage in CMIP6 decide in which MIPs to participate, which experiments to perform, and which MIPs to support by providing the variables those MIPs ask for. The variables are requested with specific aggregations (averages in time or space, accumulated values, extrema, etc.), with a priority, and possibly for a certain time slice. The totality of these variables makes up the CMIP6 data request.

A web GUI is developed and maintained by DKRZ to ease managing the data request. It can be used to tailor the request to the plans of the modelling group, yielding the so-called customized data request, and to calculate resulting parameters such as the data volume implied by that customized request on the model's grids. The customized data request and the resulting volume can be used to adjust the group's plans to its resources and to configure the model output.
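To illustrate the kind of volume calculation the customized data request enables, the following sketch estimates the uncompressed output size of a single requested variable. The grid dimensions, time range, and value size are made-up illustrative numbers, not figures from the actual data request:

```python
# Rough data-volume estimate for one requested variable (illustrative
# numbers only; the real figures come from the customized data request
# and the model's actual grids).

def variable_volume_bytes(nlon, nlat, nlev, ntime, bytes_per_value=4):
    """Uncompressed size of one variable: grid points x time steps x value size."""
    return nlon * nlat * nlev * ntime * bytes_per_value

# Hypothetical case: 1-degree grid, 47 levels, monthly means over 150 years.
ntime = 150 * 12
size = variable_volume_bytes(nlon=360, nlat=180, nlev=47, ntime=ntime)
print(f"{size / 1e9:.1f} GB")  # prints "21.9 GB"
```

Summing such estimates over all variables of the customized request gives the total volume that the group weighs against its resources.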

Step 2:

Post-processing of model output includes up to three phases. If the variables are not saved with the required aggregation, the aggregation has to be performed after the model integration. Diagnostics may be needed to calculate a requested variable from the variables present in the model output (e.g. total precipitation as the sum of total liquid and total solid precipitation). In the last phase the variables have to be rewritten in compliance with the project standard; for this, the newly developed 'cmor' operator of the CDO toolkit can be used.
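The diagnostic phase can be as simple as an arithmetic combination of output fields, as in the precipitation example above. A minimal sketch, with plain Python lists standing in for gridded model output and made-up values (in practice this would operate on NetCDF fields):

```python
# Derive total precipitation as the sum of liquid and solid precipitation.
# Plain lists keep the sketch self-contained; real diagnostics run on
# gridded NetCDF fields.

def total_precipitation(liquid, solid):
    if len(liquid) != len(solid):
        raise ValueError("fields must have the same shape")
    return [l + s for l, s in zip(liquid, solid)]

rain = [1.0, 0.0, 2.5]    # liquid precipitation, made-up values
snow = [0.5, 0.25, 0.0]   # solid precipitation, made-up values
print(total_precipitation(rain, snow))  # prints [1.5, 0.25, 2.5]
```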

The input to the online interface mentioned in Step 1 is used to generate model-dependent recipe and mapping tables. The mapping tables contain the model-dependent information needed to apply the 'cmor' operator.
The recipe table may, besides the necessary diagnostics, also include information about the data sets holding the variables; it can therefore be used to automatically generate scripts, or script fragments, which support standard-compliant data processing. Documentation about Step 1 and Step 2 is accessible through the above-mentioned interface (tag 'Documentation&Links').
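Script generation from a mapping table can be sketched along the following lines. This is purely illustrative: the dictionary keys, file names, and the exact form of the CDO 'cmor' invocation are assumptions, not the real format of the generated tables:

```python
# Generate a processing command from a (hypothetical) mapping-table entry.
# The real tables are produced by the web interface; this dict merely
# mimics their role for illustration.

mapping = {
    "tas": {                        # CMIP6 variable name
        "model_name": "temp2",      # hypothetical name in the raw model output
        "input_file": "raw_output.nc",
    },
}

def cmorize_command(cmip_var, entry, mip_table="Amon"):
    """Assemble a shell command applying CDO's cmor operator (assumed syntax)."""
    return f"cdo cmor,{mip_table},cn={cmip_var} {entry['input_file']}"

print(cmorize_command("tas", mapping["tas"]))
```

A generator iterating over all table entries would emit one such command (or script fragment) per requested variable.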

Step 3:

Model simulations can be evaluated with the help of the ESMValTool while the integration is still running. The ESMValTool contrasts the model results with those of other models, where available, as well as with observational data.

Prior to publication on a CMIP6 ESGF data node, the data have to undergo a quality check (see Management Phase).