5.0 Code installation

This chapter describes the installation of PALM on a Linux workstation (local host). There are two methods to install and run PALM. With the simple method, you can run PALM interactively on the local computer. This method uses the scripts palm_simple_install and palm_simple_run. The simple method cannot create batch jobs and cannot be used for restart runs. File handling and most other features of the advanced method are also not available. This method is only recommended for those who would like to test PALM quickly.

The advanced method is for running PALM in batch mode on a suitable remote computer. The installation procedure uses the script mbuild. All jobs are started on the local host using the script mrun. The PALM output is automatically sent back from the remote host to the local host. Alternatively, mrun can also be used to start PALM in interactive mode on the local host, or as a batch job on the local host (if a queueing system like NQS, PBS, or LoadLeveler is available).

Only the advanced method gives full access to all PALM features.

Requirements

The installation and operation of PALM requires at minimum (for the advanced method on both the local and the remote host, unless stated otherwise):

  1. A Korn shell (AT&T ksh or public domain ksh), which must be available under /bin/ksh.
  2. A NetCDF library, version 3.6.2 or later (for NetCDF, see www.unidata.ucar.edu).
  3. A FORTRAN90/95 compiler.
  4. The Message Passing Interface (MPI), at least on the remote host, if the parallel version of PALM is to be used.
  5. On the local host, the revision control system subversion (see subversion.tigris.org). This is already included in many Linux distributions (e.g. SuSe). subversion requires port 3690 to be open for tcp/udp. If there are firewall restrictions concerning this port, the PALM code cannot be accessed. The user needs a permit to access the PALM repository. To get a permit, please contact the PALM group (raasch@muk.uni-hannover.de) and specify a username under which you would like to access the repository. You will then receive a password which allows access under this name.

    The advanced method additionally requires:
  6. A job queueing system on the remote host. Currently, mrun can handle LoadLeveler (IBM-AIX) and NQS/PBS (Linux-Clusters, NEC-SX).
  7. ssh/scp-connections to and from the remote host must not be blocked by a firewall.
Currently, mrun is configured to be used on a limited number of selected machines. These are the SGI-ICE systems at the computing center HLRN in Hannover (lcsgih) and Berlin (lcsgib), IBM Regatta systems at Yonsei University (ibmy) and at DKRZ, Hamburg (ibmh), an NEC-SX8 system at RIAM, Kyushu University, Fukuoka (necriam), as well as the Linux cluster of IMUK (lcmuk), the Tokyo Institute of Technology (lctit), the Kyoto computing center (lckyoto), and the Cray-XT4/5 systems at the Bergen Center for Computational Science (lcxt4) and at the Finnish Meteorological Institute (lcxt5m). The strings given in brackets are the system names (host identifiers) under which mrun identifies the different hosts.

You can also use mrun/PALM on other Linux cluster, IBM-AIX, or NEC-SX machines. See below on how to configure mrun for other machines. However, these configurations currently (version 3.7a) allow running PALM in interactive mode only. Batch mode requires manual adjustments for the respective queueing system and MPI installation in the scripts mrun, mbuild, and subjob.

The examples given in this chapter refer to an installation of PALM on an IMUK Linux workstation and (for the advanced method) the SGI-ICE system of HLRN, which is used as the remote host. They are simply referred to as the local and the remote host from now on.

The installation process for the advanced method requires a valid account on both the local and the remote host.

The advanced installation method is described below. For the simple method see the end of this chapter.
 

Package Installation

The first installation step requires creating a set of directories on the local and, for the advanced method, on the remote host. These are:

~/job_queue
~/palm
~/palm/current_version
~/palm/current_version/JOBS

The names of these directories can be freely selected (except ~/job_queue); however, new users should choose them as suggested, since many examples in this documentation as well as all example files are based on these settings. The directory ~/palm/current_version on the local host will be called the working directory from now on.
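
On most systems, the directories can be created in one step, e.g. by (a sketch assuming the suggested names; adjust the paths if you chose different ones):

mkdir -p ~/job_queue ~/palm/current_version/JOBS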

In the second step, a working copy of the current PALM version, including the source code, scripts, documentation, etc., must be copied to the working directory (on the local host!) by executing the following commands. Replace <your username> with the name that you have chosen to access the repository, and <#> with any of the available PALM releases, e.g. "3.7a" (new releases will be announced to members of the PALM mailing list).

cd ~/palm/current_version
svn checkout --username <your username> svn://130.75.105.2/palm/tags/release-<#> trunk

You will then be prompted for your password. After completion, there should be a subdirectory trunk in your working directory. It contains a number of further subdirectories, which contain e.g. the PALM source code (SOURCE) and the scripts for running PALM (SCRIPTS).
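
To quickly verify the checkout, you can list the contents of the new directory; the exact set of subdirectories depends on the release, but SOURCE and SCRIPTS should be among them:

ls ~/palm/current_version/trunk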

Alternatively, executing

svn checkout --username <your username> svn://130.75.105.2/palm/tags/release-<#> abcde

will place your working copy in a directory named abcde (instead of a directory named trunk). Keep in mind, however, that you will have to adjust several of the paths given below if you do not use the default directory trunk.

Please never touch any file in your working copy of PALM, unless you know what you are doing.

You can also get a copy of the most recent developer code by executing

svn checkout --username <your username> svn://130.75.105.2/palm/trunk trunk

This version may contain new features (which might not yet be well documented), but it may also contain bugs.

Package Configuration

To use the PALM scripts, the PATH variable has to be extended and the environment variable PALM_BIN has to be set (on the local and the remote host) in the respective profile of the user's default shell (e.g. in .profile, if ksh is used):

export PATH=$HOME/palm/current_version/trunk/SCRIPTS:$PATH
export PALM_BIN=$HOME/palm/current_version/trunk/SCRIPTS

You may have to log in again in order to activate these settings.
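
To check whether the settings are active, you can, for example, ask the shell where it finds mrun; the answer should point into the SCRIPTS directory set above:

which mrun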

On the local and on the remote host, some small helper/utility programs have to be installed, which are later used by mrun, e.g. for PALM data postprocessing. The installation is done with the script mbuild. This script requires a configuration file .mrun.config, which will also be used by mrun in the following. A copy has to be put into the working directory under the name .mrun.config by

cp trunk/SCRIPTS/.mrun.config.default .mrun.config

Besides many other things, this file contains typical installation parameters such as the compiler name, compiler options, etc. for a set of different (remote) hosts. Please edit this file and uncomment lines like
#%remote_username  <replace by your ... username>   <host identifier>

by removing the leading hash (#) character and replacing the string "<replace by ...>" with your username on the respective host given by <host identifier>. You only have to uncomment lines for those hosts on which you intend to use PALM.
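
As an illustration (the username myuser is just a placeholder, not a real account), a line for the HLRN system lcsgih would be changed from

#%remote_username  <replace by your ... username>   lcsgih

to

%remote_username   myuser                            lcsgih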

Warning: When editing the configuration file, please NEVER use the TAB key. Otherwise, very confusing errors may occur when mrun is executed.

Besides the default configuration file .mrun.config.default, the directory trunk/SCRIPTS contains additional configuration files which are already adjusted for specific hosts, e.g. .mrun.config.imuk can be used at Hannover University. These files have to be edited in the same way as described above.
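
For example, at IMUK one could start from the prepared file instead of the default one (a sketch; choose the file that matches your site):

cp trunk/SCRIPTS/.mrun.config.imuk .mrun.config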

After modifying the configuration file, the respective executables are generated by executing

mbuild -u -h lcmuk
mbuild -u -h lcsgih

The second call also copies the PALM scripts (like mrun and mbuild) to the remote host.

Pre-Compilation of PALM Code


To avoid re-compilation of the complete source code for each model run, PALM will be pre-compiled once on the remote host, again using the script mbuild. Due to the use of FORTRAN modules in the source code, the subroutines must be compiled in a certain order. Therefore the so-called make mechanism is used (see the respective man page of the Unix operating system), which requires a Makefile in which the dependencies are described. This file is found in the subdirectory trunk/SOURCE, where the PALM code is also stored. The compiled sources (object files) are stored on the remote computer in the default directory ~/palm/current_version/MAKE_DEPOSITORY_<block_descriptor>, where <block_descriptor> is composed of the third (and fourth, if present) column of the respective block in the configuration file (e.g. lcsgih_parallel for HLRN).

The pre-compilation for the remote host (here the SGI-ICE system of HLRN) is done by

mbuild -h lcsgih  

mbuild will issue several queries, which must all be answered with "y" by the user. The compilation process will take some time. mbuild transfers the respective compiler calls to the remote host, where they are carried out interactively. You can follow the progress in the terminal window, where error messages are also displayed (hopefully none for this standard installation). By just entering

mbuild

PALM will be (consecutively) pre-compiled for all remote hosts listed in the configuration file. If you want to compile for the local host only, please enter

mbuild -h lcmuk

Installation Verification

As a last step, after the compilation has finished, the PALM installation has to be verified. For this purpose, a simple test run is carried out. This once again requires the mrun configuration file (described in chapter 3.2), as well as the parameter file (described in chapter 4.4.1). The parameter file must be copied from the PALM working copy by

mkdir -p JOBS/example_cbl/INPUT
cp trunk/INSTALL/example_cbl_p3d JOBS/example_cbl/INPUT/example_cbl_p3d

The test run can now be started by executing the command

mrun -d example_cbl -h lcsgih -K parallel -X 8 -T 8 -t 500 -q testq -r "d3# pr#"

This specific run will be carried out on 8 PEs and is allowed to use up to 500 seconds of CPU time. After pressing <return>, the most important settings of the job are displayed in the terminal window and the user is prompted for confirmation ("y"). Next, a message from the queueing system like "Request … submitted to queue … by …" should be displayed. The job is now queued and will be started either immediately or at a later time, depending on the current workload of the remote host. Provided that it is executed immediately and that everything works as designed, the job protocol of this run will appear under the file name ~/job_queue/lcsgih_example no more than a few minutes later. The content of this file should be carefully examined for any error messages.
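
For reference, here is the same call again with the role of each option briefly noted in comments (a short, informal summary only; the mrun documentation describes the options in full):

# -d  run identifier (base name of the input/output files)
# -h  host identifier of the execution host
# -K  selects the "parallel" branch/block of the configuration file
# -X  total number of PEs
# -T  number of MPI tasks per node
# -t  allowed CPU time in seconds
# -q  batch queue to be used
# -r  activation strings for the file connections (here "d3#" and "pr#")
mrun -d example_cbl -h lcsgih -K parallel -X 8 -T 8 -t 500 -q testq -r "d3# pr#"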

Besides the job protocol, and according to the configuration file and the arguments given for the mrun options -d and -r, further files should be found in the directories

~/palm/current_version/JOBS/example_cbl/MONITORING

and

    ~/palm/current_version/JOBS/example_cbl/OUTPUT

Please compare the contents of file

    ~/palm/current_version/JOBS/example_cbl/MONITORING/lcsgih_example_cbl_rc

with those of the example result file, which can be found under trunk/INSTALL/example_cbl_rc, e.g. by using the standard diff command:

diff  JOBS/example_cbl/MONITORING/lcsgih_example_cbl_rc trunk/INSTALL/example_cbl_rc

where it is assumed that your working directory is ~/palm/current_version.

You should not find any difference between these two files, except for the run date and time displayed at the top of the file header. If the file contents are otherwise identical, the installation has been completed successfully.

Configuration for other machines

Starting from version 3.2a, besides the default hosts (HLRN, etc.), PALM can also be installed and run on other Linux cluster, IBM-AIX, or NEC-SX systems. Configuring PALM for a non-default host only requires adding some lines to the configuration file .mrun.config.

First, you have to define the host identifier (a string of arbitrary length) under which your local host will be identified, by adding a line

%host_identifier  <hostname>  <host identifier>

to the configuration file (best done in the section where the other default host identifiers are defined). Here <hostname> must be the name of your local host as provided by the Unix command "hostname". The first characters of <host identifier> have to be "lc" if your system is (part of) a Linux cluster, or "ibm" or "nec" in the case of an IBM-AIX or NEC-SX system, respectively. For example, if you want to install on a Linux cluster, the line may read

%host_identifier  foo  lc_bar

In the second step, you have to provide all information necessary to compile and run PALM on your local host by adding an additional section to the configuration file:

%remote_username   <1>      <host identifier> parallel
%tmp_user_catalog  <2>      <host identifier> parallel
%compiler_name     <3>      <host identifier> parallel
%compiler_name_ser <4>      <host identifier> parallel
%cpp_options       <5>      <host identifier> parallel
%netcdf_inc        <6>      <host identifier> parallel
%netcdf_lib        <7>      <host identifier> parallel
%fopts             <8>      <host identifier> parallel
%lopts             <9>      <host identifier> parallel

The section consists of four columns each separated by one or more blanks. The first column gives the name of the respective environment variable used by mrun and mbuild, while the second column defines its value. The third column has to be the host identifier as defined above, and the last column in each line must contain the string "parallel". Otherwise, the respective line(s) will be interpreted as belonging to the setup for compiling and running a serial (non-parallel) version of PALM.

The bracketed placeholders <1> to <9> have to be replaced by the appropriate settings for your local host.

A typical example may be:
%remote_username   raasch                                  lc_bar parallel
%tmp_user_catalog  /tmp                                    lc_bar parallel
%compiler_name     mpif90                                  lc_bar parallel
%compiler_name_ser ifort                                   lc_bar parallel
%cpp_options       -DMPI_REAL=MPI_DOUBLE_PRECISION:-DMPI_2REAL=MPI_2DOUBLE_PRECISION:-D__netcdf  lc_bar parallel
%netcdf_inc        -I:/usr/local/netcdf/include            lc_bar parallel
%netcdf_lib        -L/usr/local/netcdf/lib:-lnetcdf        lc_bar parallel
%fopts             -axW:-cpp:-openmp:-r8:-nbs              lc_bar parallel
%lopts             -axW:-cpp:-openmp:-r8:-nbs:-Vaxlib      lc_bar parallel
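
After adding this section, the utility programs and the pre-compiled PALM code for the new host can be generated as described above for the default hosts, e.g. (using the host identifier lc_bar from the example):

mbuild -u -h lc_bar
mbuild -h lc_bar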

Currently (version 3.7a), depending on the MPI version running on your local host, the options of the execution command (which may be mpirun or mpiexec) may have to be adjusted manually in the mrun script. A future version will allow these settings to be given in the configuration file.

If you have any problems with the PALM installation, the members of the PALM working group will be pleased to help you.


Simple installation method

The simple installation method uses the scripts palm_simple_install and palm_simple_run for installing and running PALM.

Package Installation

First step: Create a directory:

mkdir -p ~/palm/current_version/

You can freely choose the directory name, but if you intend to switch to the advanced method of running PALM later, you should use ~/palm/current_version. This directory will be called the working directory from now on.

Second step: Check out a working copy of the current PALM version from the svn repository. Replace <your username> with your valid repository username, and <#> with any of the available PALM releases, e.g. "3.7a" (new releases will be announced to members of the PALM mailing list).

cd ~/palm/current_version
svn checkout --username <your username> svn://130.75.105.2/palm/tags/release-<#> trunk

You will be prompted for your password. After completion, a subdirectory trunk will appear in your working directory. It contains a number of further subdirectories, which contain e.g. the PALM source code (SOURCE) and the scripts for running PALM (SCRIPTS). For checking out the most recent PALM version (developer version), see the advanced installation method described above.

Please never touch any file in your working copy of PALM, unless you really know what you are doing.


Configuration and compilation

Third step: To use the PALM scripts, the PATH variable has to be extended and the environment variable PALM_BIN has to be set. For convenience, this should be done in the respective profile of the user's default shell (e.g. in .profile, if ksh is used):

export PALM_BIN=$HOME/palm/current_version/trunk/SCRIPTS
export PATH=$PALM_BIN:$PATH

You may have to login again in order to activate the profile settings.

Fourth step: Call the installation script:

palm_simple_install -i MAKE.inc.ifort.imuk

The script copies the PALM source code into a new subdirectory MAKE_DEPOSITORY_simple. This directory will also contain a Makefile and an include file MAKE.inc to be used for compiling the code in the next step. The include file contains the compiler name, compiler options, the library path for NetCDF, etc. Please adjust these settings as required by your system before you proceed with the next step. The default settings in MAKE.inc are for the Intel FORTRAN compiler in the IMUK environment. You may find default settings for other compilers and environments in the files MAKE.inc.* in .../trunk/INSTALL/. After adjusting MAKE.inc, compile the code by executing

cd MAKE_DEPOSITORY_simple
make
cd ..

In order to shorten the compilation time, you can run make in parallel, e.g. make -j 4 runs 4 compile jobs simultaneously.

Fifth step: Carry out a test run in order to check the installation. The test run (like every PALM run) requires a parameter file for steering PALM, which is in FORTRAN NAMELIST format. This file has already been generated by the installation script palm_simple_install under JOBS/example_cbl/INPUT/example_cbl_p3d. PALM is started with the script palm_simple_run. Before the first run, it may be necessary to change the MPI execution command (mpiexec, mpirun, etc.) and its options in this script, depending on the MPI library that you are using. The script can be found in the directory .../trunk/SCRIPTS. The default settings in this script are for the SGI mpt library installed on HLRN. You will find the execution command near the end of the script. After having adjusted the MPI execution command, start the run with:

palm_simple_run -p 4 -n 4 -c example_cbl

where option -p gives the total number of MPI tasks, -n gives the number of MPI tasks per node, and -c gives the parameter file to be used.

After the run has finished, all output files can be found in the directory OUTPUT.... Names and contents of the PALM output files are described in chapter 3.4. The directory name is composed of the parameter file name, the number of cores that have been used, and the current date and time. For every run a unique directory is created.

Sixth step: To verify the results of this example run, compare it with the default result:

diff  OUTPUT..../RUN_CONTROL   trunk/INSTALL/example_cbl_rc

You should not find any difference between these two files, except for the run date and time displayed at the top of the file header and, possibly, the number of cores that have been used. If the file contents are otherwise identical, the installation has been completed successfully.


Last change:  $Id: chapter_5.0.html 287 2009-04-09 08:59:36Z raasch $