source: palm/trunk/SOURCE/palm.f90 @ 3924

!> @file palm.f90
!------------------------------------------------------------------------------!
! This file is part of the PALM model system.
!
! PALM is free software: you can redistribute it and/or modify it under the
! terms of the GNU General Public License as published by the Free Software
! Foundation, either version 3 of the License, or (at your option) any later
! version.
!
! PALM is distributed in the hope that it will be useful, but WITHOUT ANY
! WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR
! A PARTICULAR PURPOSE.  See the GNU General Public License for more details.
!
! You should have received a copy of the GNU General Public License along with
! PALM. If not, see <http://www.gnu.org/licenses/>.
!
! Copyright 1997-2019 Leibniz Universitaet Hannover
!------------------------------------------------------------------------------!
!
! Current revisions:
! -----------------
!
!
! Former revisions:
! -----------------
! $Id: palm.f90 3885 2019-04-11 11:29:34Z monakurppa $
! Changes related to global restructuring of location messages and introduction
! of additional debug messages
!
! 3761 2019-02-25 15:31:42Z raasch
! unused variable removed
!
! 3719 2019-02-06 13:10:18Z kanani
! Included cpu measurement for wall/soil spinup
!
! 3703 2019-01-29 16:43:53Z knoop
! Some interface calls moved to module_interface + cleanup
!
! 3648 2019-01-02 16:35:46Z suehring
! Rename subroutines for surface-data output
!
! 3524 2018-11-14 13:36:44Z raasch
! unused variable removed
!
! 3494 2018-11-06 14:51:27Z suehring
! Last actions for surface output added
!
! 3487 2018-11-05 07:18:02Z maronga
! Updates version to 6.0
!
! 3484 2018-11-02 14:41:25Z hellstea
! pmci_ensure_nest_mass_conservation removed permanently
!
! 3458 2018-10-30 14:51:23Z kanani
! from chemistry branch r3443, forkel:
! removed double do_emis check around CALL chem_init
! replaced call to calc_date_and_time to init_date_and_time
!
! 3337 2018-10-12 15:17:09Z kanani
! (from branch resler)
! Fix chemistry call
!
! 3298 2018-10-02 12:21:11Z kanani
! - Minor formatting (kanani)
! - Added Call of date_and_time_init (Russo)
! - Added Call of calc_date_and_time before call of init_3d where emissions
!   are initialized:
!   we have to know the time indices to initialize emission values (Russo)
! - Added Call of netcdf_data_input_chemistry_data (Russo)
!
! 3274 2018-09-24 15:42:55Z knoop
! Modularization of all bulk cloud physics code components
!
! 3258 2018-09-18 07:23:31Z Giersch
! current revision for user interface has been changed to 3240
!
! 3241 2018-09-12 15:02:00Z raasch
! unused variables removed
!
! 3235 2018-09-07 14:06:15Z sward
! Added mas_last_actions call and multi_agent_system_mod dependency
!
! 3232 2018-09-07 12:21:44Z raasch
! references to mrun replaced by palmrun, and updated
!
! 3182 2018-07-27 13:36:03Z suehring
! Deduct spinup_time from RUN_CONTROL output of main 3d run
! (use time_since_reference_point instead of simulated_time)
!
! 2951 2018-04-06 09:05:08Z kanani
! Add log_point_s for pmci_init
!
! 2903 2018-03-16 08:17:06Z hellstea
! Nesting-related calls to pmci_ensure_nest_mass_conservation and pres after
! the nest initialization are removed as they may create unwanted initial
! perturbation in some cases.
!
! 2894 2018-03-15 09:17:58Z Giersch
! Modified todo list, _write_restart_data/_last_actions has been renamed to
! _wrd_local, unit 14 will be opened now for each io_group
! write_3d_binary is called wrd_local now, wrd_global moved from wrd_local to
! palm.f90, unit 14 is closed directly after the wrd_local call, Module related
! routines for writing restart data have been moved to wrd_local
!
! 2801 2018-02-14 16:01:55Z suehring
! Changed lpm from subroutine to module.
! Introduce particle transfer in nested models.
!
! 2766 2018-01-22 17:17:47Z kanani
! Removed preprocessor directive __chem
!
! 2720 2018-01-02 16:27:15Z kanani
! Version update to 5.0
!
! 2718 2018-01-02 08:49:38Z maronga
! Corrected "Former revisions" section
!
! 2696 2017-12-14 17:12:51Z kanani
! Change in file header (GPL part)
! Implementation of chemistry module (FK)
! Introduce input-data standard
! Rename lsm_last_actions into lsm_write_restart_data
! Move usm_write_restart_data into io_blocks loop (MS)
!
! 2512 2017-10-04 08:26:59Z raasch
! user interface required revision updated
!
! 2320 2017-07-21 12:47:43Z suehring
! Modularize large-scale forcing and nudging
!
! 2298 2017-06-29 09:28:18Z raasch
! type of write_binary changed from CHARACTER to LOGICAL,
! user interface required revision updated, MPI2 related part removed
!
! 2296 2017-06-28 07:53:56Z maronga
! Added call to new spinup routine
!
! 2292 2017-06-20 09:51:42Z schwenkel
! Implementation of new microphysic scheme: cloud_scheme = 'morrison'
! includes two more prognostic equations for cloud drop concentration (nc)
! and cloud water content (qc).
!
! 2261 2017-06-08 14:25:57Z raasch
! output of run number for mrun to create unified cycle numbers
!
! 2233 2017-05-30 18:08:54Z suehring
!
! 2232 2017-05-30 17:47:52Z suehring
! Renamed wall_flags_0 and wall_flags_00 into advc_flags_1 and advc_flags_2,
! respectively, within copyin statement. Moreover, introduced further flag
! array wall_flags_0.
! Remove unused variables from ONLY list.
!
! 2178 2017-03-17 11:07:39Z hellstea
! Calls for pmci_ensure_nest_mass_conservation and pres are added after
! the nest initialization
!
! 2118 2017-01-17 16:38:49Z raasch
! OpenACC directives and related code removed
!
! 2011 2016-09-19 17:29:57Z kanani
! Flag urban_surface is now defined in module control_parameters.
!
! 2007 2016-08-24 15:47:17Z kanani
! Temporarily added CALL for writing of restart data for urban surface model
!
! 2000 2016-08-20 18:09:15Z knoop
! Forced header and separation lines into 80 columns
!
! 1976 2016-07-27 13:28:04Z maronga
! Added call to radiation_last_actions for binary output of land surface model
! data
!
! 1972 2016-07-26 07:52:02Z maronga
! Added call to lsm_last_actions for binary output of land surface model data
!
! 1960 2016-07-12 16:34:24Z suehring
! Separate humidity and passive scalar
!
! 1834 2016-04-07 14:34:20Z raasch
! Initial version of purely vertical nesting introduced.
!
! 1833 2016-04-07 14:23:03Z raasch
! required user interface version changed
!
! 1808 2016-04-05 19:44:00Z raasch
! routine local_flush replaced by FORTRAN statement
!
! 1783 2016-03-06 18:36:17Z raasch
! required user interface version changed
!
! 1781 2016-03-03 15:12:23Z raasch
! pmc initialization moved from time_integration to here
!
! 1779 2016-03-03 08:01:28Z raasch
! setting of nest_domain and coupling_char moved to the pmci
!
! 1764 2016-02-28 12:45:19Z raasch
! cpp-statements for nesting removed, communicator settings cleaned up
!
! 1762 2016-02-25 12:31:13Z hellstea
! Introduction of nested domain feature
!
! 1747 2016-02-08 12:25:53Z raasch
! OpenACC-adjustment for new surface layer parameterization
!
! 1682 2015-10-07 23:56:08Z knoop
! Code annotations made doxygen readable
!
! 1668 2015-09-23 13:45:36Z raasch
! warning replaced by abort in case of failed user interface check
!
! 1666 2015-09-23 07:31:10Z raasch
! check for user's interface version added
!
! 1482 2014-10-18 12:34:45Z raasch
! adjustments for using CUDA-aware OpenMPI
!
! 1468 2014-09-24 14:06:57Z maronga
! Adapted for use on up to 6-digit processor cores
!
! 1402 2014-05-09 14:25:13Z raasch
! location messages added
!
! 1374 2014-04-25 12:55:07Z raasch
! bugfix: various modules added
!
! 1320 2014-03-20 08:40:49Z raasch
! ONLY-attribute added to USE-statements,
! kind-parameters added to all INTEGER and REAL declaration statements,
! kinds are defined in new module kinds,
! old module precision_kind is removed,
! revision history before 2012 removed,
! comment fields (!:) to be used for variable explanations added to
! all variable declaration statements
!
! 1318 2014-03-17 13:35:16Z raasch
! module interfaces removed
!
! 1241 2013-10-30 11:36:58Z heinze
! initialization of nudging and large scale forcing from external file
!
! 1221 2013-09-10 08:59:13Z raasch
! +wall_flags_00, rflags_invers, rflags_s_inner in copyin statement
!
! 1212 2013-08-15 08:46:27Z raasch
! +tri in copyin statement
!
! 1179 2013-06-14 05:57:58Z raasch
! ref_state added to copyin-list
!
! 1113 2013-03-10 02:48:14Z raasch
! openACC statements modified
!
! 1111 2013-03-08 23:54:10Z raasch
! openACC statements updated
!
! 1092 2013-02-02 11:24:22Z raasch
! unused variables removed
!
! 1036 2012-10-22 13:43:42Z raasch
! code put under GPL (PALM 3.9)
!
! 1015 2012-09-27 09:23:24Z raasch
! Version number changed from 3.8 to 3.8a.
! OpenACC statements added + code changes required for GPU optimization
!
! 849 2012-03-15 10:35:09Z raasch
! write_particles renamed lpm_write_restart_file
!
! Revision 1.1  1997/07/24 11:23:35  raasch
! Initial revision
!
!
! Description:
! ------------
!> Large-Eddy Simulation (LES) model for atmospheric and oceanic boundary-layer
!> flows
!> see the PALM homepage https://palm-model.org for further information
!------------------------------------------------------------------------------!
 PROGRAM palm


    USE arrays_3d

    USE bulk_cloud_model_mod,                                                  &
        ONLY:  bulk_cloud_model, microphysics_morrison, microphysics_seifert

    USE control_parameters,                                                    &
        ONLY:  constant_diffusion, child_domain,                               &
               coupling_char, do2d_at_begin, do3d_at_begin, humidity,          &
               initializing_actions, io_blocks, io_group, message_string,      &
               neutral, passive_scalar, runnr, simulated_time_chr, spinup,     &
               time_since_reference_point, user_interface_current_revision,    &
               user_interface_required_revision, version, write_binary

    USE cpulog,                                                                &
        ONLY:  cpu_log, log_point, log_point_s, cpu_statistics

    USE date_and_time_mod,                                                     &
        ONLY:  calc_date_and_time, init_date_and_time

    USE indices,                                                               &
        ONLY:  nbgp

    USE kinds

    USE module_interface,                                                      &
        ONLY:  module_interface_last_actions

    USE multi_agent_system_mod,                                                &
        ONLY:  agents_active, mas_last_actions

    USE netcdf_data_input_mod,                                                 &
        ONLY:  netcdf_data_input_inquire_file, netcdf_data_input_init,         &
               netcdf_data_input_surface_data, netcdf_data_input_topo

    USE particle_attributes,                                                   &
        ONLY:  particle_advection

    USE pegrid

    USE pmc_particle_interface,                                                &
        ONLY:  pmcp_g_alloc_win

    USE pmc_interface,                                                         &
        ONLY:  nested_run, pmci_child_initialize, pmci_init,                   &
               pmci_modelconfiguration, pmci_parent_initialize

    USE surface_data_output_mod,                                               &
        ONLY:  surface_data_output_last_action

    USE write_restart_data_mod,                                                &
        ONLY:  wrd_global, wrd_local

#if defined( __parallel ) && defined( _OPENACC )
    USE openacc
#endif


    IMPLICIT NONE

!
!-- Local variables
    CHARACTER(LEN=9) ::  time_to_string  !< function returning the time as a hh:mm:ss string
    INTEGER(iwp)     ::  i               !< loop counter for blocked I/O
#if defined( __parallel ) && defined( _OPENACC )
    INTEGER(iwp)     ::  local_comm      !< local communicator (shared memory)
    INTEGER(iwp)     ::  local_num_procs !< local number of processes
    INTEGER(iwp)     ::  local_id        !< local id
    INTEGER(acc_device_kind) ::  device_type !< device type for OpenACC
    INTEGER(iwp)     ::  num_devices     !< number of devices visible to OpenACC
    INTEGER(iwp)     ::  my_device       !< device used by this process
#endif

    version = 'PALM 6.0'
    user_interface_required_revision = 'r3703'

#if defined( __parallel )
!
!-- MPI initialisation. comm2d is preliminarily set here, because it will be
!-- defined in init_pegrid but is already used before that in cpu_log.
    CALL MPI_INIT( ierr )

!
!-- Initialize the coupling for nested-domain runs
!-- comm_palm is the communicator which includes all PEs (MPI processes)
!-- available for this (nested) model. If it is not a nested run, comm_palm
!-- is returned as MPI_COMM_WORLD
    CALL cpu_log( log_point_s(70), 'pmci_init', 'start' )
    CALL pmci_init( comm_palm )
    CALL cpu_log( log_point_s(70), 'pmci_init', 'stop' )
    comm2d = comm_palm
!
!-- Get the (preliminary) number of MPI processes and the local PE-id (in case
!-- of a further communicator splitting in init_coupling, these numbers will
!-- be changed in init_pegrid).
    IF ( nested_run )  THEN

       CALL MPI_COMM_SIZE( comm_palm, numprocs, ierr )
       CALL MPI_COMM_RANK( comm_palm, myid, ierr )

    ELSE

       CALL MPI_COMM_SIZE( MPI_COMM_WORLD, numprocs, ierr )
       CALL MPI_COMM_RANK( MPI_COMM_WORLD, myid, ierr )
!
!--    Initialize PE topology in case of coupled atmosphere-ocean runs (comm_palm
!--    will be split in init_coupling)
       CALL init_coupling
    ENDIF

#ifdef _OPENACC
!
!-- Select the OpenACC device to be used by this process. To do so, determine
!-- how many processes run on the same node and which local id this process has.
    IF ( nested_run )  THEN
       CALL MPI_COMM_SPLIT_TYPE( comm_palm, MPI_COMM_TYPE_SHARED, 0,           &
                                 MPI_INFO_NULL, local_comm, ierr )
    ELSE
       CALL MPI_COMM_SPLIT_TYPE( MPI_COMM_WORLD, MPI_COMM_TYPE_SHARED, 0,      &
                                 MPI_INFO_NULL, local_comm, ierr )
    ENDIF
    CALL MPI_COMM_SIZE( local_comm, local_num_procs, ierr )
    CALL MPI_COMM_RANK( local_comm, local_id, ierr )

!
!-- This loop including the barrier is a workaround for PGI compiler versions
!-- up to and including 18.4. Later releases are able to select their GPUs in
!-- parallel, without running into spurious errors.
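!-- The processes of a node pick their device round-robin, i.e. process
!-- local_id uses device MOD( local_id, num_devices ) of the devices visible
!-- on that node.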
    DO i = 0, local_num_procs-1
       CALL MPI_BARRIER( local_comm, ierr )

       IF ( i == local_id )  THEN
          device_type = acc_get_device_type()
          num_devices = acc_get_num_devices( device_type )
          my_device = MOD( local_id, num_devices )
          CALL acc_set_device_num( my_device, device_type )
       ENDIF
    ENDDO

    CALL MPI_COMM_FREE( local_comm, ierr )
#endif
#endif

!
!-- Initialize measuring of the CPU-time remaining for the run
    CALL local_tremain_ini

!
!-- Start of total CPU time measuring.
    CALL cpu_log( log_point(1), 'total', 'start' )
    CALL cpu_log( log_point(2), 'initialisation', 'start' )

!
!-- Open a file for debug output
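!-- (one separate file per MPI process, named DEBUG<coupling_char>_<myid>,
!-- e.g. DEBUG_000000 for process 0 if no coupling character is set)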
    WRITE (myid_char,'(''_'',I6.6)')  myid
    OPEN( 9, FILE='DEBUG'//TRIM( coupling_char )//myid_char, FORM='FORMATTED' )

!
!-- Initialize dvrp logging. Also, one PE may be split from the global
!-- communicator for doing the dvrp output. In that case, the number of
!-- PEs available for PALM is reduced by one and communicator comm_palm
!-- is changed accordingly.
#if defined( __parallel )
    CALL MPI_COMM_RANK( comm_palm, myid, ierr )
#endif

    CALL init_dvrp_logging

!
!-- Read control parameters from NAMELIST files and read environment variables
    CALL parin

!
!-- Check for the user's interface version
    IF ( user_interface_current_revision /= user_interface_required_revision )  &
    THEN
       message_string = 'current user-interface revision "' //                  &
                        TRIM( user_interface_current_revision ) // '" does ' // &
                        'not match the required revision ' //                   &
                        TRIM( user_interface_required_revision )
       CALL message( 'palm', 'PA0169', 1, 2, 0, 6, 0 )
    ENDIF

!
!-- Determine processor topology and local array indices
    CALL init_pegrid
!
!-- Check if an input file according to the input-data standard exists
    CALL netcdf_data_input_inquire_file
!
!-- Read topography input data if required. This has to be done before the
!-- numerical grid is finally created in init_grid
    CALL netcdf_data_input_topo
!
!-- Generate grid parameters, initialize generic topography and further process
!-- topography information if required
    CALL init_grid
!
!-- Read global attributes if available.
    CALL netcdf_data_input_init
!
!-- Read surface classification data, e.g. vegetation and soil types, water
!-- surfaces, etc., if available. Some of these data are required before
!-- check_parameters is invoked.
    CALL netcdf_data_input_surface_data
!
!-- Check control parameters and deduce further quantities
    CALL check_parameters
!
!-- Set the initial time for chem_emissions_mod
    CALL init_date_and_time

    CALL init_3d_model

!
!-- Coupling protocol setup for nested-domain runs
    IF ( nested_run )  THEN
       CALL pmci_modelconfiguration
!
!--    Receive and interpolate initial data in the child domains.
!--    If the model is both child and parent, child initialization must be
!--    done first.
       IF ( TRIM( initializing_actions ) /= 'read_restart_data' )  THEN
          CALL pmci_child_initialize
!
!--       Send initial condition data from parent to children
          CALL pmci_parent_initialize
!
!--       Exchange_horiz is needed after the nest initialization
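!--       in order to update the nbgp ghost layers of the newly initialized
!--       prognostic arrays on each subdomain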
          IF ( child_domain )  THEN
             CALL exchange_horiz( u, nbgp )
             CALL exchange_horiz( v, nbgp )
             CALL exchange_horiz( w, nbgp )
             IF ( .NOT. neutral )  THEN
                CALL exchange_horiz( pt, nbgp )
             ENDIF
             IF ( .NOT. constant_diffusion )  CALL exchange_horiz( e, nbgp )
             IF ( humidity )  THEN
                CALL exchange_horiz( q, nbgp )
                IF ( bulk_cloud_model  .AND.  microphysics_morrison )  THEN
                   CALL exchange_horiz( qc, nbgp )
                   CALL exchange_horiz( nc, nbgp )
                ENDIF
                IF ( bulk_cloud_model  .AND.  microphysics_seifert )  THEN
                   CALL exchange_horiz( qr, nbgp )
                   CALL exchange_horiz( nr, nbgp )
                ENDIF
             ENDIF
             IF ( passive_scalar )  CALL exchange_horiz( s, nbgp )
          ENDIF
       ENDIF

       CALL pmcp_g_alloc_win       ! Must be called after pmci_child_initialize and pmci_parent_initialize
    ENDIF

!
!-- Output of program header
    IF ( myid == 0 )  CALL header

    CALL cpu_log( log_point(2), 'initialisation', 'stop' )

!
!-- Integration of the non-atmospheric equations (land surface model, urban
!-- surface model)
    IF ( spinup )  THEN
       CALL cpu_log( log_point(41), 'wall/soil spinup', 'start' )
       CALL time_integration_spinup
       CALL cpu_log( log_point(41), 'wall/soil spinup', 'stop' )
    ENDIF

!
!-- Set start time in format hh:mm:ss
    simulated_time_chr = time_to_string( time_since_reference_point )

!
!-- If required, output of initial arrays
    IF ( do2d_at_begin )  THEN
       CALL data_output_2d( 'xy', 0 )
       CALL data_output_2d( 'xz', 0 )
       CALL data_output_2d( 'yz', 0 )
    ENDIF

    IF ( do3d_at_begin )  THEN
       CALL data_output_3d( 0 )
    ENDIF

!
!-- Integration of the model equations using the timestep scheme
    CALL time_integration

!
!-- If required, write binary data for restart runs
    IF ( write_binary )  THEN

       CALL cpu_log( log_point(22), 'wrd_local', 'start' )

       CALL location_message( 'writing restart data', 'start' )

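!
!--    Restart data are written in blocks: in each pass of the following loop
!--    only the processes of one io_group write, and the barrier at the end of
!--    the loop separates the groups, so that the number of processes doing
!--    restart I/O at the same time stays limited.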
       DO  i = 0, io_blocks-1
          IF ( i == io_group )  THEN

!
!--          Open binary file
             CALL check_open( 14 )
!
!--          Write control parameters and other global variables for restart.
             IF ( myid == 0 )  CALL wrd_global
!
!--          Write processor-specific flow field data for restart runs
             CALL wrd_local
!
!--          Close binary file
             CALL close_file( 14 )

          ENDIF
#if defined( __parallel )
          CALL MPI_BARRIER( comm2d, ierr )
#endif
       ENDDO

       CALL location_message( 'writing restart data', 'finished' )

       CALL cpu_log( log_point(22), 'wrd_local', 'stop' )

!
!--    If required, write particle data in own restart files
       IF ( particle_advection )  CALL lpm_write_restart_file

    ENDIF
!
!-- Last actions for surface output, for instantaneous and time-averaged data
    CALL surface_data_output_last_action( 0 )
    CALL surface_data_output_last_action( 1 )

!
!-- If required, repeat output of header including the required CPU-time
    IF ( myid == 0 )  CALL header
!
!-- Perform module-specific last actions
    CALL cpu_log( log_point(4), 'last actions', 'start' )

    IF ( myid == 0  .AND.  agents_active )  CALL mas_last_actions  ! ToDo: move to module_interface

    CALL module_interface_last_actions

    CALL cpu_log( log_point(4), 'last actions', 'stop' )

!
!-- Close files
    CALL close_file( 0 )
    CALL close_dvrp

!
!-- Write the run number to a file (used by palmrun to create unified cycle
!-- numbers for output files)
    IF ( myid == 0  .AND.  runnr > 0 )  THEN
       OPEN( 90, FILE='RUN_NUMBER', FORM='FORMATTED' )
       WRITE( 90, '(I4)' )  runnr
       CLOSE( 90 )
    ENDIF

!
!-- Take final CPU-time for CPU-time analysis
    CALL cpu_log( log_point(1), 'total', 'stop' )
    CALL cpu_statistics

#if defined( __parallel )
    CALL MPI_FINALIZE( ierr )
#endif

 END PROGRAM palm