
Data Processing Software

Tools for file handling, data processing and analysis

Climate Data Operators (CDO)

The CDO package is developed at the Max Planck Institute for Meteorology. It provides more than 400 operators for the standard processing of geoscience data, such as simple statistical and arithmetic functions, data selection and subsampling, and spatial interpolation. The software can read, process, and convert GRIB, netCDF, HDF, ieg (REMO), SERVICE, and other specially formatted files. For more information you may download the documentation and the source code.
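A sketch of typical CDO usage, with placeholder file names (infile.nc, tas.nc, and the variable name tas are illustrative, not from the original text):

```shell
# Show a short summary of the file: variable names, grids, time steps
cdo sinfon infile.nc

# Extract the variable tas into a new file
cdo selname,tas infile.nc tas.nc

# Chain operators: compute an area-mean time series of tas in one call
cdo fldmean -selname,tas infile.nc tas_fldmean.nc
```

Operator chaining (the `-selname,tas` argument to `fldmean`) avoids writing intermediate files to disk.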


NetCDF Operators (NCO)

NCO is a suite of programs known as operators. Each operator is a standalone command-line program executed at the shell level on UNIX/Linux systems. The operators take one or more netCDF or HDF files as input, perform an operation (e.g., averaging or hyperslabbing), and produce a netCDF file as output. The operators, primarily designed for the manipulation and analysis of data, are:

  • ncap2 - arithmetic processor
  • ncatted - attribute editor
  • ncbo - binary operator (add, multiply, etc.)
  • ncdiff - differencer
  • ncea - ensemble averager
  • ncecat - ensemble concatenator
  • nces - ensemble statistics
  • ncflint - file interpolator
  • ncks - kitchen sink (extract, cut, paste, print data)
  • ncra - running averager
  • ncrcat - record concatenator
  • ncrename - renamer
  • ncwa - weighted averager
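A few illustrative invocations of these operators (file and variable names are placeholders, not from the original text):

```shell
# ncks: extract the variable tas into a new file
ncks -v tas in.nc out.nc

# ncra: average over the record (e.g. time) dimension across input files
ncra in1.nc in2.nc in3.nc avg.nc

# ncatted: overwrite the units attribute of tas with the character string "K"
ncatted -a units,tas,o,c,"K" in.nc

# ncrename: rename a variable
ncrename -v t2m,tas in.nc
```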


NetCDF Utilities: ncdump, ncgen, nccopy

ncdump - Convert a netCDF file to ASCII form (CDL) or show the format variant of a netCDF file

ncdump [-c|-h] [-v var1,...] [[-b|-f] [c|f]] [-l len] [-n name] [-p n[,n]] [-k] [-x] [-s] [-t|-i] [-g ...] [-w] file

The ncdump utility generates an ASCII representation of a specified netCDF file on standard output. The ASCII representation is in a form called CDL ("network Common Data form Language") that can be viewed, edited, or used as input to the companion program ncgen. The ncdump program may also be used as a simple browser for netCDF data files, to display the dimension names and sizes; variable names, types, and shapes; attribute names and values; and optionally, the values of data for all variables or selected variables in a netCDF file. Below are some examples of using ncdump.

For header information use

ncdump -h file

To see header and data of var1 and var2 use

ncdump -v var1,var2 file

To query the kind of a netCDF file ('classic', '64-bit offset', 'netCDF-4' or 'netCDF-4 classic model') use

ncdump -k file


ncgen - Generate a binary netCDF file from a CDL file, or a C or Fortran program that creates a netCDF file matching the specification

ncgen [-b] [-c] [-f] [-k file format] [-l output language] [-n] [-o netcdf_filename] [-x] input_file

The ncgen and ncdump utilities can be used as inverses to transform data between binary and CDL representations. For example, to generate a binary netCDF file ofile from an ASCII formatted CDL file ifile.cdl use

ncgen -o ofile ifile.cdl
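A complete round trip between the two representations might look like this (file names are illustrative):

```shell
# Dump an existing netCDF file to its CDL text representation ...
ncdump orig.nc > orig.cdl

# ... edit orig.cdl if desired, then regenerate a binary netCDF file from it
ncgen -o copy.nc orig.cdl
```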


nccopy - Copy a netCDF file, optionally changing format, compression, or chunking in the output file

nccopy [-k n] [-d n] [-s] [-c chunkspec] [-u] [-w] [-m n] [-h n] [-e n] [-r] infile outfile

The nccopy utility can be used to copy and convert an input netCDF file of any type to an output netCDF file in another (compatible) netCDF format variant, and optionally to rechunk, compress, or decompress the data. For example, to convert a compressed netCDF-4 classic format file to a netCDF-3 file use

nccopy -k 1 ifile ofile

or, equivalently,

nccopy -k classic ifile ofile
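Beyond format conversion, compression and rechunking can be sketched as follows; the deflate level and the dimension names time, lat, and lon are assumptions for illustration:

```shell
# Compress the data with deflate level 5 and enable the shuffle filter
nccopy -d 5 -s ifile ofile

# Rechunk while copying, giving an explicit chunk size per dimension
nccopy -c time/1,lat/180,lon/360 ifile ofile
```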



Afterburner

The afterburner is the standard post-processor for ECHAM data and provides the following operations:

  • Extract specified variables and levels
  • Compute derived variables
  • Transform spectral data to Gaussian grid representation
  • Perform vertical interpolation from model levels to pressure or height levels
  • Write data in GRIB, netCDF, SERVICE or EXTRA format

Afterburner has been integrated into CDO (version 1.6.9 and later).
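Within CDO, the afterburner is available as the operator after, which reads its selection namelist from standard input. A hedged usage sketch follows; the CODE, LEVEL, and TYPE values are illustrative, not taken from the original text:

```shell
# Interpolate ECHAM model-level data (code 130, temperature assumed here)
# to two pressure levels; the selection namelist comes from stdin
cdo after infile.grb outfile.grb << EOF
 TYPE=30 CODE=130 LEVEL=50000,85000
EOF
```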


2D Data Analysis and Visualization Tools

Other software that integrates capabilities for data analysis and visualization (e.g. NCL, IDL, Python) is described on a separate page.


To use the tools described above, you need to load the appropriate modulefiles into the shell environment via the environment modules system. To list all available modulefiles use:

module avail
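For example, a typical session might look like this; the actual module names are site-specific, and cdo and nco here are assumptions:

```shell
module avail         # list all available modulefiles
module load cdo nco  # load the CDO and NCO tools into the environment
module list          # verify which modules are currently loaded
```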


Quality Assurance Tool at DKRZ

The Quality Assurance (QA) tool developed at DKRZ tests the conformance of the meta-data of climate simulations given in netCDF format to the conventions and rules of projects. Additionally, the QA checks time sequences and the technical state of the data (i.e. occurrences of Inf or NaN, gaps, replicated sub-sets, etc.) for atomic data set entities, i.e. a variable and frequency, e.g. tas and mon for monthly means of near-surface air temperature. When atomic data sets are subdivided into several files, changes between these files in (auxiliary) coordinate variables are detected, as are gaps or overlapping time ranges. This may also apply to follow-up experiments.

At present, the QA checks compliance with the CF Conventions and data of the projects CMIP5 and CORDEX by consulting tables based on the requested controlled vocabulary and requirements. When such pre-defined information is given about directory structures, the format of file names, variables and attributes, the geographical domain, etc., any deviation from compliance will be annotated in the QA results.

