source: palm/trunk/SOURCE/palm.f90 @ 2282

Last change on this file since 2282 was 2261, checked in by raasch, 7 years ago

changes in mrun: unified cycle numbers for output files are used, paths and filenames are allowed to contain arbitrary numbers of dots, archive feature completely removed from the script, nech related parts completely removed, OpenMP bugfix in prognostic_equations

!> @file palm.f90
!------------------------------------------------------------------------------!
! This file is part of PALM.
!
! PALM is free software: you can redistribute it and/or modify it under the
! terms of the GNU General Public License as published by the Free Software
! Foundation, either version 3 of the License, or (at your option) any later
! version.
!
! PALM is distributed in the hope that it will be useful, but WITHOUT ANY
! WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR
! A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
!
! You should have received a copy of the GNU General Public License along with
! PALM. If not, see <http://www.gnu.org/licenses/>.
!
! Copyright 1997-2017 Leibniz Universitaet Hannover
!------------------------------------------------------------------------------!
!
! Current revisions:
! -----------------
!
!
! Former revisions:
! -----------------
! $Id: palm.f90 2261 2017-06-08 14:25:57Z schwenkel $
! output of run number for mrun to create unified cycle numbers
!
! 2233 2017-05-30 18:08:54Z suehring
!
! 2232 2017-05-30 17:47:52Z suehring
! Renamed wall_flags_0 and wall_flags_00 into advc_flags_1 and advc_flags_2,
! respectively, within copyin statement. Moreover, introduced further flag
! array wall_flags_0.
! Remove unused variables from ONLY list.
!
! 2178 2017-03-17 11:07:39Z hellstea
! Calls for pmci_ensure_nest_mass_conservation and pres are added after
! the nest initialization
!
! 2118 2017-01-17 16:38:49Z raasch
! OpenACC directives and related code removed
!
! 2011 2016-09-19 17:29:57Z kanani
! Flag urban_surface is now defined in module control_parameters.
!
! 2007 2016-08-24 15:47:17Z kanani
! Temporarily added CALL for writing of restart data for urban surface model
!
! 2000 2016-08-20 18:09:15Z knoop
! Forced header and separation lines into 80 columns
!
! 1976 2016-07-27 13:28:04Z maronga
! Added call to radiation_last_actions for binary output of land surface model
! data
!
! 1972 2016-07-26 07:52:02Z maronga
! Added call to lsm_last_actions for binary output of land surface model data
!
! 1960 2016-07-12 16:34:24Z suehring
! Separate humidity and passive scalar
!
! 1834 2016-04-07 14:34:20Z raasch
! Initial version of purely vertical nesting introduced.
!
! 1833 2016-04-07 14:23:03Z raasch
! required user interface version changed
!
! 1808 2016-04-05 19:44:00Z raasch
! routine local_flush replaced by FORTRAN statement
!
! 1783 2016-03-06 18:36:17Z raasch
! required user interface version changed
!
! 1781 2016-03-03 15:12:23Z raasch
! pmc initialization moved from time_integration to here
!
! 1779 2016-03-03 08:01:28Z raasch
! setting of nest_domain and coupling_char moved to the pmci
!
! 1764 2016-02-28 12:45:19Z raasch
! cpp-statements for nesting removed, communicator settings cleaned up
!
! 1762 2016-02-25 12:31:13Z hellstea
! Introduction of nested domain feature
!
! 1747 2016-02-08 12:25:53Z raasch
! OpenACC-adjustment for new surface layer parameterization
!
! 1682 2015-10-07 23:56:08Z knoop
! Code annotations made doxygen readable
!
! 1668 2015-09-23 13:45:36Z raasch
! warning replaced by abort in case of failed user interface check
!
! 1666 2015-09-23 07:31:10Z raasch
! check for user's interface version added
!
! 1482 2014-10-18 12:34:45Z raasch
! adjustments for using CUDA-aware OpenMPI
!
! 1468 2014-09-24 14:06:57Z maronga
! Adapted for use on up to 6-digit processor cores
!
! 1402 2014-05-09 14:25:13Z raasch
! location messages added
!
! 1374 2014-04-25 12:55:07Z raasch
! bugfix: various modules added
!
! 1320 2014-03-20 08:40:49Z raasch
! ONLY-attribute added to USE-statements,
! kind-parameters added to all INTEGER and REAL declaration statements,
! kinds are defined in new module kinds,
! old module precision_kind is removed,
! revision history before 2012 removed,
! comment fields (!:) to be used for variable explanations added to
! all variable declaration statements
!
! 1318 2014-03-17 13:35:16Z raasch
! module interfaces removed
!
! 1241 2013-10-30 11:36:58Z heinze
! initialization of nudging and large scale forcing from external file
!
! 1221 2013-09-10 08:59:13Z raasch
! +wall_flags_00, rflags_invers, rflags_s_inner in copyin statement
!
! 1212 2013-08-15 08:46:27Z raasch
! +tri in copyin statement
!
! 1179 2013-06-14 05:57:58Z raasch
! ref_state added to copyin-list
!
! 1113 2013-03-10 02:48:14Z raasch
! openACC statements modified
!
! 1111 2013-03-08 23:54:10Z raasch
! openACC statements updated
!
! 1092 2013-02-02 11:24:22Z raasch
! unused variables removed
!
! 1036 2012-10-22 13:43:42Z raasch
! code put under GPL (PALM 3.9)
!
! 1015 2012-09-27 09:23:24Z raasch
! Version number changed from 3.8 to 3.8a.
! OpenACC statements added + code changes required for GPU optimization
!
! 849 2012-03-15 10:35:09Z raasch
! write_particles renamed lpm_write_restart_file
!
! Revision 1.1  1997/07/24 11:23:35  raasch
! Initial revision
!
!
! Description:
! ------------
!> Large-Eddy Simulation (LES) model for the convective boundary layer,
!> optimized for use on parallel machines (implementation realized using the
!> Message Passing Interface (MPI)). The model can also be run on vector machines
!> (less well optimized) and workstations. Versions for the different types of
!> machines are controlled via cpp-directives.
!> Model runs are only feasible using the ksh-script mrun.
!>
!> @todo create routine last_actions instead of calling lsm_last_actions etc.
!> @todo eventually move CALL usm_write_restart_data to suitable location
!------------------------------------------------------------------------------!
 PROGRAM palm


    USE arrays_3d

    USE control_parameters,                                                    &
        ONLY:  cloud_physics, constant_diffusion, coupling_char, coupling_mode,&
               do2d_at_begin, do3d_at_begin, humidity, initializing_actions,   &
               io_blocks, io_group, land_surface, large_scale_forcing,         &
               message_string, microphysics_seifert, nest_domain, neutral,     &
               nudging, passive_scalar, runnr, simulated_time,                 &
               simulated_time_chr, urban_surface,                              &
               user_interface_current_revision,                                &
               user_interface_required_revision, version, wall_heatflux,       &
               write_binary

    USE cpulog,                                                                &
        ONLY:  cpu_log, log_point, cpu_statistics

    USE indices,                                                               &
        ONLY:  nbgp

    USE kinds

    USE land_surface_model_mod,                                                &
        ONLY:  lsm_last_actions

    USE ls_forcing_mod,                                                        &
        ONLY:  init_ls_forcing

    USE nudge_mod,                                                             &
        ONLY:  init_nudge

    USE particle_attributes,                                                   &
        ONLY:  particle_advection

    USE pegrid

    USE pmc_interface,                                                         &
        ONLY:  cpl_id, nested_run, pmci_child_initialize, pmci_init,           &
               pmci_modelconfiguration, pmci_parent_initialize,                &
               pmci_ensure_nest_mass_conservation

    USE radiation_model_mod,                                                   &
        ONLY:  radiation, radiation_last_actions

    USE urban_surface_mod,                                                     &
        ONLY:  usm_write_restart_data

    IMPLICIT NONE

!
!-- Local variables
    CHARACTER(LEN=9)  ::  time_to_string  !< function returning time as hh:mm:ss
    CHARACTER(LEN=10) ::  env_string      !< to store string of environment var
    INTEGER(iwp)      ::  env_stat        !< to hold status of GET_ENV
    INTEGER(iwp)      ::  i               !< loop variable
    INTEGER(iwp)      ::  myid_openmpi    !< OpenMPI local rank for CUDA aware MPI

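!
!-- Set the model version and the required revision of the user interface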
    version = 'PALM 4.0'
    user_interface_required_revision = 'r1819'

#if defined( __parallel )
!
!-- MPI initialisation. comm2d is preliminarily set, because it will be
!-- defined in init_pegrid but is used beforehand in cpu_log.
    CALL MPI_INIT( ierr )

!
!-- Initialize the coupling for nested-domain runs
!-- comm_palm is the communicator which includes all PEs (MPI processes)
!-- available for this (nested) model. If it is not a nested run, comm_palm
!-- is returned as MPI_COMM_WORLD
    CALL pmci_init( comm_palm )
    comm2d = comm_palm
!
!-- Get the (preliminary) number of MPI processes and the local PE-id (in case
!-- of a further communicator splitting in init_coupling, these numbers will
!-- be changed in init_pegrid).
    IF ( nested_run )  THEN

       CALL MPI_COMM_SIZE( comm_palm, numprocs, ierr )
       CALL MPI_COMM_RANK( comm_palm, myid, ierr )

    ELSE

       CALL MPI_COMM_SIZE( MPI_COMM_WORLD, numprocs, ierr )
       CALL MPI_COMM_RANK( MPI_COMM_WORLD, myid, ierr )
!
!--    Initialize PE topology in case of coupled atmosphere-ocean runs (comm_palm
!--    will be split in init_coupling)
       CALL init_coupling
    ENDIF
#endif

!
!-- Initialize measuring of the CPU-time remaining to the run
    CALL local_tremain_ini

!
!-- Start of total CPU time measuring.
    CALL cpu_log( log_point(1), 'total', 'start' )
    CALL cpu_log( log_point(2), 'initialisation', 'start' )

!
!-- Open a file for debug output
    WRITE (myid_char,'(''_'',I6.6)')  myid
    OPEN( 9, FILE='DEBUG'//TRIM( coupling_char )//myid_char, FORM='FORMATTED' )

!
!-- Initialize dvrp logging. Also, one PE may be split from the global
!-- communicator for doing the dvrp output. In that case, the number of
!-- PEs available for PALM is reduced by one and communicator comm_palm
!-- is changed accordingly.
#if defined( __parallel )
    CALL MPI_COMM_RANK( comm_palm, myid, ierr )
!
!-- TEST OUTPUT (TO BE REMOVED)
    WRITE(9,*) '*** coupling_mode = "', TRIM( coupling_mode ), '"'
    FLUSH( 9 )
    IF ( TRIM( coupling_mode ) /= 'uncoupled' )  THEN
       PRINT*, '*** PE', myid, ' Global target PE:', target_id, &
               TRIM( coupling_mode )
    ENDIF
#endif

    CALL init_dvrp_logging

!
!-- Read control parameters from NAMELIST files and read environment-variables
    CALL parin

!
!-- Check for the user's interface version
    IF ( user_interface_current_revision /= user_interface_required_revision )  &
    THEN
       message_string = 'current user-interface revision "' //                  &
                        TRIM( user_interface_current_revision ) // '" does ' // &
                        'not match the required revision ' //                   &
                        TRIM( user_interface_required_revision )
        CALL message( 'palm', 'PA0169', 1, 2, 0, 6, 0 )
    ENDIF

!
!-- Determine processor topology and local array indices
    CALL init_pegrid

!
!-- Generate grid parameters
    CALL init_grid

!
!-- Initialize nudging if required
    IF ( nudging )  THEN
       CALL init_nudge
    ENDIF

!
!-- Initialize reading of large scale forcing from external file - if required
    IF ( large_scale_forcing )  THEN
       CALL init_ls_forcing
    ENDIF

!
!-- Check control parameters and deduce further quantities
    CALL check_parameters

!
!-- Initialize all necessary variables
    CALL init_3d_model

!
!-- Coupling protocol setup for nested-domain runs
    IF ( nested_run )  THEN
       CALL pmci_modelconfiguration
!
!--    Receive and interpolate initial data on children.
!--    Child initialization must be made first if the model is both child and
!--    parent.
       IF ( TRIM( initializing_actions ) /= 'read_restart_data' )  THEN
          CALL pmci_child_initialize
!
!--       Send initial condition data from parent to children
          CALL pmci_parent_initialize
!
!--       Exchange_horiz is needed after the nest initialization
          IF ( nest_domain )  THEN
             CALL exchange_horiz( u, nbgp )
             CALL exchange_horiz( v, nbgp )
             CALL exchange_horiz( w, nbgp )
             IF ( .NOT. neutral )  THEN
                CALL exchange_horiz( pt, nbgp )
             ENDIF
             IF ( .NOT. constant_diffusion )  CALL exchange_horiz( e, nbgp )
             IF ( humidity )  THEN
                CALL exchange_horiz( q, nbgp )
                IF ( cloud_physics  .AND.  microphysics_seifert )  THEN
!                   CALL exchange_horiz( qc, nbgp )
                   CALL exchange_horiz( qr, nbgp )
!                   CALL exchange_horiz( nc, nbgp )
                   CALL exchange_horiz( nr, nbgp )
                ENDIF
             ENDIF
             IF ( passive_scalar )  CALL exchange_horiz( s, nbgp )

             CALL pmci_ensure_nest_mass_conservation
             CALL pres
          ENDIF
       ENDIF

    ENDIF

!
!-- Output of program header
    IF ( myid == 0 )  CALL header

    CALL cpu_log( log_point(2), 'initialisation', 'stop' )

!
!-- Set start time in format hh:mm:ss
    simulated_time_chr = time_to_string( simulated_time )

!
!-- If required, output of initial arrays
    IF ( do2d_at_begin )  THEN
       CALL data_output_2d( 'xy', 0 )
       CALL data_output_2d( 'xz', 0 )
       CALL data_output_2d( 'yz', 0 )
    ENDIF

    IF ( do3d_at_begin )  THEN
       CALL data_output_3d( 0 )
    ENDIF

!
!-- Integration of the model equations using the timestep scheme
    CALL time_integration

!
!-- If required, write binary data for restart runs
    IF ( write_binary(1:4) == 'true' )  THEN

       CALL cpu_log( log_point(22), 'write_3d_binary', 'start' )

       CALL location_message( 'writing restart data', .FALSE. )

       CALL check_open( 14 )

       DO  i = 0, io_blocks-1
          IF ( i == io_group )  THEN
!
!--          Write flow field data
             CALL write_3d_binary
          ENDIF
#if defined( __parallel )
          CALL MPI_BARRIER( comm2d, ierr )
#endif
       ENDDO

       CALL location_message( 'finished', .TRUE. )

       CALL cpu_log( log_point(22), 'write_3d_binary', 'stop' )

!
!--    If required, write particle data
       IF ( particle_advection )  CALL lpm_write_restart_file
!
!--    If required, write urban surface data
       IF (urban_surface)  CALL usm_write_restart_data

    ENDIF

!
!-- If required, repeat output of header including the required CPU-time
    IF ( myid == 0 )  CALL header
!
!-- If required, carry out final land surface and user-defined actions, and
!-- last actions on the open files, before the files are closed. Unit 14 was
!-- opened in write_3d_binary but it is closed here, to allow writing on this
!-- unit in routine user_last_actions.
    CALL cpu_log( log_point(4), 'last actions', 'start' )
    DO  i = 0, io_blocks-1
       IF ( i == io_group )  THEN
          IF ( land_surface )  THEN
             CALL lsm_last_actions
          ENDIF
          IF ( radiation )  THEN
             CALL radiation_last_actions
          ENDIF
          CALL user_last_actions
          IF ( write_binary(1:4) == 'true' )  CALL close_file( 14 )
       ENDIF
#if defined( __parallel )
       CALL MPI_BARRIER( comm2d, ierr )
#endif
    ENDDO
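!
!-- Close all remaining open files and terminate the dvrp logging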
    CALL close_file( 0 )
    CALL close_dvrp
    CALL cpu_log( log_point(4), 'last actions', 'stop' )

#if defined( __mpi2 )
!
!-- Test exchange via intercommunicator in case of an MPI-2 coupling
    IF ( coupling_mode == 'atmosphere_to_ocean' )  THEN
       i = 12345 + myid
       CALL MPI_SEND( i, 1, MPI_INTEGER, myid, 11, comm_inter, ierr )
    ELSEIF ( coupling_mode == 'ocean_to_atmosphere' )  THEN
       CALL MPI_RECV( i, 1, MPI_INTEGER, myid, 11, comm_inter, status, ierr )
       PRINT*, '### myid: ', myid, '   received from atmosphere:  i = ', i
    ENDIF
#endif

!
!-- Write run number to file (used by mrun to create unified cycle numbers for
!-- output files)
    IF ( myid == 0  .AND.  runnr > 0 )  THEN
       OPEN( 90, FILE='RUN_NUMBER', FORM='FORMATTED' )
       WRITE( 90, '(I4)' )  runnr
       CLOSE( 90 )
    ENDIF

!
!-- Take final CPU-time for CPU-time analysis
    CALL cpu_log( log_point(1), 'total', 'stop' )
    CALL cpu_statistics

#if defined( __parallel )
    CALL MPI_FINALIZE( ierr )
#endif

 END PROGRAM palm