Version 13 (modified by knoop, 7 years ago)

--

PALM Tests

PALM features a test suite to ensure its reliability while development goes forward. It can be executed locally with the following options:

palmtest -h "<configuration>" -d "<list of test cases>" -X <max number of cores to execute on> -N "<test id>"

As these are all optional, the defaults are as follows:

  • configuration: default
  • list of test cases: all available
  • max number of cores to execute on: all available, but not more than 32
  • test id: the current time
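With the defaults spelled out explicitly, the invocation above can be reconstructed; a minimal sketch, assuming GNU `nproc` is available to count cores and a plausible timestamp format for the test id (the cap of 32 comes from the list above, the command is only echoed as a dry run):

```shell
#!/bin/bash
# Derive the default core count: all available cores, capped at 32.
avail=$(nproc)
max_cores=$(( avail < 32 ? avail : 32 ))

# Dry run: print the fully spelled-out palmtest call. Omitting -d means
# all available test cases are run; the timestamp format is an assumption.
echo palmtest -h "default" -X "${max_cores}" -N "$(date +%Y-%m-%d_%H-%M-%S)"
```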

The Testserver

The server responsible for testing can be found here. It is a 32-core shared-memory node running Jenkins on top of Ubuntu Server 16.04. The server runs the following test script on top of a plain PALM installation.

#!/bin/bash
# Jenkins test script: run the default build first; if it fails, repeat
# with the debug build to collect more detailed diagnostic output.

palmtest -h "default" -N "${BUILD_NUMBER}" -X 32
result_default=$?

if [[ ${result_default} -ne 0 ]]; then
   # Default run failed: re-run with debug flags for better error messages.
   palmtest -h "debug" -N "${BUILD_NUMBER}_debug" -X 32
   result_debug=$?
else
   result_debug=0
fi

# Succeed only if every executed run succeeded.
if [[ ${result_default} -eq 0 ]] && [[ ${result_debug} -eq 0 ]]; then
   exit 0
else
   exit 1
fi

How to add a test case

Any developer can add new test cases. This is as simple as adding a respective *_p3d file and *_rc file to the INSTALL directory in the PALM trunk. The palmtest script detects all *_p3d namelist files in this directory that have a matching *_rc file, executes them using palmrun, and compares the results against the respective *_rc reference file. Please make sure the *_rc reference file is valid and reproducible. It is the developer's responsibility to make sure that a newly added test succeeds at the time it is added! All methodical errors associated with a specific test case that arise later are also the responsibility of the developer who added it.
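The pairing rule described above can be sketched in shell. This is only an illustration of the detection step, not the actual palmtest implementation, and the helper name list_test_cases is made up:

```shell
#!/bin/bash
# Hypothetical helper: list test cases the way palmtest detects them,
# i.e. every *_p3d file in the given directory that has a matching *_rc.
list_test_cases() {
    local dir="$1" p3d base
    for p3d in "${dir}"/*_p3d; do
        [ -e "${p3d}" ] || continue      # glob matched nothing
        base="${p3d%_p3d}"
        if [ -f "${base}_rc" ]; then
            basename "${base}"           # a runnable test case
        fi
    done
    return 0
}
```

For example, `list_test_cases trunk/INSTALL` would print example_cbl if both example_cbl_p3d and example_cbl_rc exist there; a *_p3d file without its *_rc counterpart is silently skipped.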

Rules for test case development

Please comply with the following rules while developing new test cases:

  1. The test case must be executable on 1, 2, 4, 8, 16 and 32 cores.
  2. The execution time of the test case on a single core should not exceed 10 seconds.
  3. A test case may only be committed after the developer has made sure that it runs successfully on the server.
  4. If a test case creates problems that cannot be fixed within an acceptable timeframe, it will be removed. It may be added again once all problems are solved.
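Rule 1 can be checked locally before committing. A dry-run sketch follows; the helper name precheck_case and the case name my_new_case are assumptions, and the echoed commands would be executed directly in a real check:

```shell
#!/bin/bash
# Print (dry run) the palmtest invocation for every core count that
# rule 1 requires; drop the echo to actually execute each run.
precheck_case() {
    local case_name="$1" n
    for n in 1 2 4 8 16 32; do
        echo "palmtest -h \"default\" -d \"${case_name}\" -X ${n} -N \"precheck_${n}\""
    done
}

precheck_case "my_new_case"
```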

List of proposed test cases

Name            | Description                                                    | Status      | Responsible Developers
----------------|----------------------------------------------------------------|-------------|-----------------------
example_cbl     | Simple good old convective boundary layer setup                | Implemented |
ex_lsm_clearsky | Basic LSM setup                                                | Implemented | Katrin
test_oceanml    | Basic ocean mixed layer setup                                  | Implemented | Siggi
                | FFTW / MG setup                                                | Missing     |
                | humidity setup (is used in ex_lsm_clearsky)                    | Implemented |
                | passive scalar setup                                           | Missing     |
                | topography setup                                               | Missing     |
                | particle LCM setup - for passive tracers with SGS fluctuations | Missing     | Johannes
                | particle LCM setup - for cloud droplets including collision    | Missing     | Johannes
                | bulk-microphysics setup                                        | Missing     | Johannes
                | non-cyclic recycling setup                                     | Missing     |
                | non-cyclic turbulence generator setup                          | Missing     |
                | cyclic fill setup                                              | Missing     |
                | restart setup                                                  | Missing     |
                | all output activated setup                                     | Missing     |
                | USM + canopy setup                                             | Missing     |
                | RRTMG setup                                                    | Missing     |
                | nesting setup                                                  | Missing     |
                | chemistry setup                                                | Missing     |
                | setup for testing the wind turbine model                       | Missing     |

List of proposed build setups

Name    | Description                                                                                  | Status      | Responsible Developers
--------|----------------------------------------------------------------------------------------------|-------------|-----------------------
default | Based on gfortran, MPICH2, FFTW and NetCDF with options: -O3 -ffree-line-length-none         | Implemented | Helge
debug   | Based on default but with debug options: -O0 -Wall -Wextra -pedantic -fcheck=all -fbacktrace | Implemented | Helge
        | different compiler: intel/cray/pgi ?                                                         | Missing     |
        | non-MPI - should be serial, i.e. without -D__parallel (Siggi)                                | Missing     |
        | hybrid MPI-OpenMP                                                                            | Missing     |
        | non-NetCDF - I think this is not necessary (Siggi)                                           | Missing     |
        | nopointer                                                                                    | Missing     |
        | FFTW - only required after OpenACC is part of default code (Siggi)                           | Missing     |
        | RRTMG                                                                                        | Missing     |
        | KPP_CHEM                                                                                     | Missing     |
