Compiling and Running CHIMERE on Shaheen 2

CHIMERE – Chemistry-transport model, version 2020r3 – is a Eulerian chemistry-transport model (CTM).

The following guide shows how to compile and run CHIMERE in the /scratch directory of Shaheen 2. We will be using the Intel compiler suite installed on Shaheen 2 to compile CHIMERE and its dependencies.

Please run the compilation on the cdl5 login node:

ssh cdl5

Compiling

First download the source tarball from CHIMERE's webpage. For this you should have registered and have a username and a password sent to you.

cd /scratch/$USER && mkdir -p chimere_install && cd chimere_install
wget --user=your.email --password=your.password https://www.lmd.polytechnique.fr/chimdata/chimere_v2020r3.tar.gz
tar xvf chimere_v2020r3.tar.gz
export CHIMERE_ROOT=/scratch/$USER/chimere_install/chimere_v2020r3
export CHIMERE_BASE=/scratch/$USER/chimere_install

Dependencies

CHIMERE requires the following dependencies to be installed before it can be built:

  • C, C++ and Fortran compilers

  • NETCDF

  • HDF5 parallel

  • GRIB API

  • Jasper

  • Eccodes

  • BLITZ C++ library

Some of the above are installed as modulefiles on Shaheen and can be used out of the box. The rest must be built with the same compiler suite that is used to build CHIMERE.

Setup shell environment for compiling

Before anything else, let's load the modules required for compiling.

module swap PrgEnv-cray PrgEnv-intel
module load cray-netcdf-hdf5parallel
module load cray-hdf5-parallel
module load jasper udunits2
module load blitz/1.0.2
export NETCDF=$NETCDF_DIR
export GRIB_DIR=$CHIMERE_BASE

GRIB_API

Let’s download and compile grib_api first:
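The release number and download step below are assumptions (grib_api has since been superseded by ecCodes, so use whichever 1.x tarball you obtained from ECMWF); this sketch uses the Cray compiler wrappers cc/ftn and assumes the jasper module exports JASPER_DIR:

cd $CHIMERE_BASE
# obtain grib_api-1.24.0-Source.tar.gz from the ECMWF website first (version is illustrative)
tar xf grib_api-1.24.0-Source.tar.gz
cd grib_api-1.24.0-Source
./configure --prefix=$GRIB_DIR --with-jasper=$JASPER_DIR CC=cc FC=ftn
make -j8 && make install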

Eccodes

Now compile the Eccodes source and install it in the same directory as GRIB_API:
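A minimal sketch, assuming an ecCodes 2.x source tarball built with CMake and the Cray compiler wrappers; the version number is illustrative:

cd $CHIMERE_BASE
# obtain eccodes-2.21.0-Source.tar.gz from the ECMWF website first
tar xf eccodes-2.21.0-Source.tar.gz
mkdir -p eccodes_build && cd eccodes_build
cmake ../eccodes-2.21.0-Source \
      -DCMAKE_INSTALL_PREFIX=$GRIB_DIR \
      -DCMAKE_C_COMPILER=cc -DCMAKE_Fortran_COMPILER=ftn
make -j8 && make install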

Chimere Configuration files for compilation

This step creates the files to be used by the CHIMERE builder to make use of correct compiler and libraries.

  1. Create mychimere/mychimere-shaheen.intel.

  2. Edit mychimere-shaheen.intel and copy the following content to it:

  3. Create mychimere/config_wrf/configure.wrf.shaheen.intel. This file guides the CHIMERE builder to configure WRF on Shaheen 2.

  4. Edit configure.wrf.shaheen.intel and copy the following content in it:

  5. Create mychimere/config_wps/configure.wps.shaheen.intel. This file guides the CHIMERE builder to configure WPS on Shaheen 2.

  6. Edit configure.wps.shaheen.intel and copy the following content in it:

  7. Create mychimere/makefile.hdr/Makefile.hdr.shaheen.intel.

  8. Edit Makefile.hdr.shaheen.intel and copy the following in it:

  9. CHIMERE tries to discover OpenMPI in the environment. We are not using OpenMPI on Shaheen. To disable this discovery and allow the compiler wrappers to link MPI automatically, we will comment out the following lines in the file $CHIMERE_ROOT/scripts/check_config.sh (a hypothetical illustration follows this list):
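The exact content of check_config.sh differs between releases, so the excerpt below is a hypothetical illustration only; locate the OpenMPI probe in your copy and prefix those lines with #:

# in $CHIMERE_ROOT/scripts/check_config.sh -- OpenMPI discovery disabled:
#mpi_bin=$(which mpirun)                            # hypothetical probe line, commented out
#export MPI_DIR=$(dirname $(dirname ${mpi_bin}))    # the Cray wrappers link MPI for us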


Building WRF and WPS

We are now ready to build WRF and WPS using the script provided by CHIMERE:
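The builder name and --arch flag below are assumptions based on the v2020r3 layout (the arch suffix should match the mychimere-shaheen.intel file created earlier); check $CHIMERE_ROOT for the exact script shipped with your release:

cd $CHIMERE_ROOT
./build-wrf.sh --arch shaheen.intel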

The compiler logs can be found and monitored while the build runs, for example:
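Assuming the builder writes its log under $CHIMERE_ROOT (the file name below is hypothetical; use the path the script reports):

tail -f $CHIMERE_ROOT/build-wrf.log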

Building CHIMERE

We are now ready to build CHIMERE:
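As above, the builder name and --arch value are assumptions; consult the documentation shipped in $CHIMERE_ROOT if your release differs:

cd $CHIMERE_ROOT
./build-chimere.sh --arch shaheen.intel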


Running CHIMERE

CHIMERE is built to run on the compute nodes of Shaheen; it will not work on the login nodes.

TestCase preparation

Before we can submit the job, we need to download TestCase2020.tar.gz and the related databases from https://www.lmd.polytechnique.fr/chimere/2020_getcode.php
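A sketch of fetching and unpacking the test case; the URL pattern is an assumption (the exact links are listed on the page above and require the credentials sent at registration):

cd /scratch/$USER && mkdir -p chimere_testcase && cd chimere_testcase
wget --user=your.email --password=your.password https://www.lmd.polytechnique.fr/chimdata/TestCase2020.tar.gz
tar xvf TestCase2020.tar.gz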

Keep your data and the Test directory in /scratch.

Also, after extracting data from the downloaded tarballs, you may need to rename the MEGAN dataset directory to NETCDF_150sec or NETCDF_30sec, depending on the resolution you choose.
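For example (the extracted directory name below is hypothetical; use whatever name your MEGAN tarball produced):

mv <extracted_megan_dir> NETCDF_30sec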

The demonstration here runs CHIMERE in offline mode. For this you will need to update the _FILL_. In this demo, the TestCase is in a directory in scratch (/scratch/shaima0d/tickets/42259).

Below are some changes you will need to make in the file ${chimere_root}/scripts/chimere_step2.sh. Some of the lines must be commented out so that srun, the native MPI launcher on Shaheen, can launch the parallel jobs.
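The original edits are not reproduced here, so the excerpt below is only a hypothetical illustration of the kind of change required; the actual launcher lines and variable names in chimere_step2.sh vary by release:

# in ${chimere_root}/scripts/chimere_step2.sh
#${MPIRUN} -np ${NPROC} ./chimere.e    # mpirun-style launch, commented out
srun -n ${NPROC} ./chimere.e           # let srun place the MPI ranks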


The following is a jobscript demonstrating how to run CHIMERE on a compute node of Shaheen.

Batch job
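A minimal sketch of a Shaheen 2 SLURM jobscript; the partition, task count, wall time, and launch command are assumptions to adapt to your test-case setup:

#!/bin/bash
#SBATCH --job-name=chimere
#SBATCH --partition=workq
#SBATCH --nodes=1
#SBATCH --ntasks=32
#SBATCH --time=02:00:00

# recreate the environment used for compiling
module swap PrgEnv-cray PrgEnv-intel
module load cray-netcdf-hdf5parallel cray-hdf5-parallel
module load jasper udunits2 blitz/1.0.2

export CHIMERE_ROOT=/scratch/$USER/chimere_install/chimere_v2020r3
cd $CHIMERE_ROOT

# launch the simulation described by the test-case parameter file
# (the script and file names are illustrative; point them at your setup)
./chimere.sh chimere_testcase.par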

At the end of a successful run, the root test directory should be populated with three NetCDF files.

Happy hunting