Timestamp:
Dec 23, 2010 12:06:00 PM
Author:
suehring
Message:

summary:


Gryschka:

  • Coupling with different resolutions and different numbers of PEs in ocean and atmosphere is available
  • Exchange of u and v from the ocean surface to the atmosphere surface
  • The mirror boundary conditions for u and v at the bottom are replaced by Dirichlet boundary conditions
  • Inflow turbulence is now defined by fluctuations around the spanwise mean
  • Bugfixes for cyclic_fill and constant_volume_flow

Suehring:

  • New advection scheme added (Wicker and Skamarock, 5th order); therefore:
    • New module advec_ws.f90
    • Modified exchange of ghost boundaries
    • Modified evaluation of turbulent fluxes
    • New index bounds nxlg, nxrg, nysg, nyng

advec_ws.f90


Advection scheme for scalars and momentum using the flux formulation of
Wicker and Skamarock, 5th order. In addition, the module contains a routine
for the initialisation and steering of the statistical evaluation. The
computation of the turbulent fluxes takes place inside the advection
routines. On vector architectures, Dirichlet and radiation boundary
conditions are still outstanding and not available, and simulations with
topography are not possible so far. A further routine, local_diss_ij, is
available and is used if control of the dissipative fluxes is desired.
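For orientation, a minimal sketch of the underlying face flux (names and
interface are illustrative assumptions, not the PALM routine itself): the
5th-order Wicker/Skamarock flux splits into a 6th-order centered part and an
upwind-biased dissipative part, which is what the separate flux/dissipation
buffers and a dissipation control such as local_diss_ij act on.

      SUBROUTINE ws5_face_flux( u_face, phi, i, flux, diss )
!
!--      Sketch only: WS5 flux at cell face i+1/2 for a quantity phi,
!--      after Wicker & Skamarock (2002). phi must be usable at i-2 .. i+3.
         IMPLICIT NONE
         REAL, INTENT(IN)    :: u_face     ! advecting velocity at face i+1/2
         REAL, INTENT(IN)    :: phi(:)
         INTEGER, INTENT(IN) :: i
         REAL, INTENT(OUT)   :: flux, diss
!
!--      6th-order centered part
         flux = u_face / 60.0 * (  37.0 * ( phi(i+1) + phi(i)   )          &
                                 -  8.0 * ( phi(i+2) + phi(i-1) )          &
                                 +        ( phi(i+3) + phi(i-2) ) )
!
!--      dissipative part, proportional to |u|; it makes the scheme 5th
!--      order and upwind-biased and is buffered separately
         diss = - ABS( u_face ) / 60.0 * (  10.0 * ( phi(i+1) - phi(i)   ) &
                                          -  5.0 * ( phi(i+2) - phi(i-1) ) &
                                          +        ( phi(i+3) - phi(i-2) ) )
      END SUBROUTINE ws5_face_flux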

check_parameters.f90


Exchange of parameters between ocean and atmosphere via PE0.
Check for illegal combinations of the ws-scheme and the timestep scheme.
Check for topography together with the ws-scheme.
Check for non-cyclic boundary conditions in combination with the ws-scheme
and loop_optimization = 'vector'.
Check for call_psolver_at_all_substeps together with the ws-scheme for
momentum_advec.

Different processor/grid topologies in atmosphere and ocean are now allowed!
Bugfixes in the checks for conserve_volume_flow_mode.

exchange_horiz.f90


Dynamic exchange of ghost points with nbgp_local, to ensure that no useless
ghost points are exchanged in case of multigrid. type_yz(0) and type_xz(0)
are used for the normal grid, the remaining types for the several grid
levels. The exchange is done via MPI vectors with a dynamic number of ghost
points, which depends on the advection scheme. The exchange between left
and right PEs is 10% faster with MPI vectors than without.
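The mechanism can be pictured as follows. This is a hedged sketch with
assumed array shapes, neighbour ids and communicator names, not the PALM
source: an MPI derived type strides over the nbgp ghost layers, so a single
send/receive moves all of them at once.

      SUBROUTINE ghost_exchange_sketch( a, nzb, nzt, nys, nyn, nysg, nyng, &
                                        nxlg, nxrg, nbgp, psouth, pnorth,  &
                                        comm2d )
         IMPLICIT NONE
         INCLUDE 'mpif.h'
         INTEGER, INTENT(IN) :: nzb, nzt, nys, nyn, nysg, nyng, nxlg, nxrg
         INTEGER, INTENT(IN) :: nbgp, psouth, pnorth, comm2d
         REAL, INTENT(INOUT) :: a(nzb:nzt+1,nysg:nyng,nxlg:nxrg)
         INTEGER :: type_xz, ierr, status(MPI_STATUS_SIZE)
         INTEGER :: nnz, nny
         nnz = nzt + 1 - nzb + 1     ! points in one (j,i) column
         nny = nyng - nysg + 1
!
!--      For fixed i the nbgp adjacent j-layers are contiguous (k runs
!--      fastest); from one i to the next they are one full y-z plane apart
         CALL MPI_TYPE_VECTOR( nxrg-nxlg+1, nbgp*nnz, nny*nnz, MPI_REAL,   &
                               type_xz, ierr )
         CALL MPI_TYPE_COMMIT( type_xz, ierr )
!
!--      One call moves all nbgp layers: send the inner southern rows,
!--      receive into the northern ghost rows
         CALL MPI_SENDRECV( a(nzb,nys,nxlg),   1, type_xz, psouth, 0,      &
                            a(nzb,nyn+1,nxlg), 1, type_xz, pnorth, 0,      &
                            comm2d, status, ierr )
         CALL MPI_TYPE_FREE( type_xz, ierr )
      END SUBROUTINE ghost_exchange_sketch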

flow_statistics.f90


When advection is computed with the ws-scheme, the turbulent fluxes are
already computed in the respective advection routines and buffered in the
arrays sums_xxxx_ws_l(). This keeps the statistics consistent with the
numerics and avoids unphysical kinks near the surface. Hence, some IF
queries are required to discern between fluxes from the ws-scheme and from
other advection schemes. Furthermore, the computation of z_i is only done
if the heat flux exceeds a minimum value. This affects only simulations of
a neutral boundary layer and is owed to the way the fluxes are computed in
the advection scheme.
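Schematically, the branching looks like the following sketch. All names,
the profile index, and in particular the fallback product in the ELSE
branch are assumptions for illustration; the actual routine interpolates to
the flux level and subtracts the horizontal means.

      SUBROUTINE ws_flux_branch_sketch( ws_scheme_mom, sums_l,             &
                                        sums_wsus_ws_l, w, u, nzb, nzt,    &
                                        nys, nyn, nxl, nxr, tn )
         IMPLICIT NONE
         LOGICAL, INTENT(IN) :: ws_scheme_mom
         INTEGER, INTENT(IN) :: nzb, nzt, nys, nyn, nxl, nxr, tn
         REAL, INTENT(IN)    :: w(nzb:,nys:,nxl:), u(nzb:,nys:,nxl:)
         REAL, INTENT(IN)    :: sums_wsus_ws_l(nzb:,0:)
         REAL, INTENT(INOUT) :: sums_l(nzb:,1:,0:)
         INTEGER :: i, j, k
         IF ( ws_scheme_mom )  THEN
!
!--         w"u" has already been accumulated inside the advection routine
            sums_l(:,13,tn) = sums_wsus_ws_l(:,tn)
         ELSE
!
!--         fall back to a direct (purely schematic) evaluation
            DO  k = nzb, nzt
               DO  j = nys, nyn
                  DO  i = nxl, nxr
                     sums_l(k,13,tn) = sums_l(k,13,tn) + w(k,j,i) * u(k,j,i)
                  ENDDO
               ENDDO
            ENDDO
         ENDIF
      END SUBROUTINE ws_flux_branch_sketch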

inflow_turbulence.f90


Using nbgp recycling planes for a better resolution of the turbulent flow near
the inflow.
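Together with the change noted above (inflow turbulence defined by
fluctuations around the spanwise mean), the principle can be sketched as
follows. Names, the recycling/inflow plane indices, and the single-PE mean
are assumptions; across several PEs the y-mean would additionally need an
MPI reduction.

      SUBROUTINE inflow_disturbance_sketch( u, mean_inflow_profile,        &
                                            nzb, nzt, nys, nyn,            &
                                            i_recycle, i_inflow )
         IMPLICIT NONE
         INTEGER, INTENT(IN) :: nzb, nzt, nys, nyn, i_recycle, i_inflow
         REAL, INTENT(IN)    :: mean_inflow_profile(nzb:nzt+1)
         REAL, INTENT(INOUT) :: u(nzb:,nys:,0:)
         REAL    :: mean_y(nzb:nzt+1)
         INTEGER :: j, k
!
!--      spanwise (y) mean at the recycling plane
         DO  k = nzb, nzt+1
            mean_y(k) = SUM( u(k,nys:nyn,i_recycle) ) / REAL( nyn-nys+1 )
         ENDDO
!
!--      impose mean inflow profile plus the recycled fluctuation
         DO  j = nys, nyn
            DO  k = nzb, nzt+1
               u(k,j,i_inflow) = mean_inflow_profile(k) +                  &
                                 ( u(k,j,i_recycle) - mean_y(k) )
            ENDDO
         ENDDO
      END SUBROUTINE inflow_disturbance_sketch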

init_grid.f90


Definition of the new array bounds nxlg, nxrg, nysg, nyng on each PE.
Furthermore, the allocation of arrays and the steering of loops are done
with these parameters. Calls of exchange_horiz are modified.
In case of a Dirichlet boundary condition at the bottom, zu(0) = 0.0.
dzu_mg has to be set explicitly for an equally spaced grid near the bottom.
ddzu_pres added to use an equally spaced grid near the bottom.

init_pegrid.f90


Moved the determination of the target_id's from init_coupling.
Determination of the parameters needed for coupling (coupling_topology,
ngp_a, ngp_o) with different grid/processor topologies in ocean and
atmosphere.

Adaptation of ngp_xy, ngp_y to a dynamic number of ghost points.
The counting of grid levels changed: 0 is now the normal grid, and 1 to
maximum_grid_level are the grids for multigrid, in which 0 and 1 are normal
grids. This distinction is made for reasons of data exchange and
performance on the normal grid and on the grids in poismg.
The definition of the MPI vectors is adapted to a dynamic number of ghost
points. New MPI vectors for the data exchange between the left and right
boundaries are added, for reasons of performance (10% faster).

ATTENTION: the problem of the undefined nnz_x still has to be solved!
TEST OUTPUT (TO BE REMOVED): logging of MPI2 ierr values.

parin.f90


Steering parameter dissipation_control added in inipar.

Makefile


Module advec_ws added.

Modules


Removed u_nzb_p1_for_vfc and v_nzb_p1_for_vfc.

For coupling with different resolutions in ocean and atmosphere:
+nx_a, +nx_o, +ny_a, +ny_o, ngp_a, ngp_o, +total_2d_o, +total_2d_a,
+coupling_topology

Buffer arrays for the left-sided advective fluxes added in arrays_3d:
+flux_s_u, +flux_s_v, +flux_s_w, +diss_s_u, +diss_s_v, +diss_s_w,
+flux_s_pt, +diss_s_pt, +flux_s_e, +diss_s_e, +flux_s_q, +diss_s_q,
+flux_s_sa, +diss_s_sa
3D arrays for dissipation control added (only necessary for vector
architectures):
+var_x, +var_y, +var_z, +gamma_x, +gamma_y, +gamma_z
Default of momentum_advec and scalar_advec changed to 'ws-scheme'.
+exchange_mg added in control_parameters to steer the data exchange.
Parameters +nbgp, +nxlg, +nxrg, +nysg, +nyng added in indices.
Flag array +boundary_flags added in indices to steer the degradation of the
order of the advective fluxes when non-cyclic boundaries are used.
MPI datatypes +type_y, +type_y_int and +type_yz for the data exchange added
in pegrid.
+sums_wsus_ws_l, +sums_wsvs_ws_l, +sums_us2_ws_l, +sums_vs2_ws_l,
+sums_ws2_ws_l, +sums_wspts_ws_l, +sums_wssas_ws_l, +sums_wsqs_ws_l
and +weight_substep added in statistics to steer the statistical evaluation
of turbulent fluxes in the advection routines (see the sketch below).
LOGICALs +ws_scheme_sca and +ws_scheme_mom added for better performance in
prognostic_equations.
LOGICAL +dissipation_control added to steer the numerical dissipation in
the ws-scheme.

Changed the length of the string run_description_header.
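How weight_substep is meant to enter the buffered statistics can be
sketched as follows (assumed names; the real accumulation happens inside
the advection routines, once per Runge-Kutta substep):

      SUBROUTINE substep_weighting_sketch( sums_wsus_ws_l, flux_plus_diss, &
                                           weight_substep, k, tn,          &
                                           intermediate_timestep_count )
         IMPLICIT NONE
         INTEGER, INTENT(IN) :: k, tn, intermediate_timestep_count
         REAL, INTENT(IN)    :: flux_plus_diss, weight_substep(:)
         REAL, INTENT(INOUT) :: sums_wsus_ws_l(0:,0:)
!
!--      each substep contributes with its weight; summed over all substeps
!--      the weights represent one full timestep
         sums_wsus_ws_l(k,tn) = sums_wsus_ws_l(k,tn) + flux_plus_diss *    &
                                weight_substep(intermediate_timestep_count)
      END SUBROUTINE substep_weighting_sketch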

pres.f90


New allocation of tend when the ws-scheme and multigrid are used, for
reasons of performance of the data exchange. The same is done with p after
poismg is called.
nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng when no
multigrid is used. Calls of exchange_horiz are modified.

Bugfix: after the pressure correction, no volume flow correction is applied
in case of non-cyclic boundary conditions (it has to be done only before
the pressure correction).

The SOR routine is now called with ddzu_pres.

prognostic_equations.f90


Calls of the advection routines with WS5 added.
Calls of ws_statistics added to set the statistical arrays to zero after each
time step.

advec_particles.f90


Declaration of de_dx, de_dy, de_dz adapted to additional ghost points.
Furthermore the calls of exchange_horiz were modified.

asselin_filter.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng

average_3d_data.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng

boundary_conds.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng
Removed the mirror boundary conditions for u and v at the bottom in case of
ibc_uv_b == 0. Instead, Dirichlet boundary conditions (u = v = 0) are set
in init_3d_model; see the sketch below.
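In code terms the change amounts to roughly the following (a schematic
sketch; the actual assignments live in init_3d_model):

!--   former mirror condition for ibc_uv_b == 0, schematically:
!--      u(nzb,:,:) = -u(nzb+1,:,:)
!--      v(nzb,:,:) = -v(nzb+1,:,:)
!--   replaced by a Dirichlet condition set once during initialisation:
      u(nzb,:,:) = 0.0
      v(nzb,:,:) = 0.0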

calc_liquid_water_content.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng

calc_spectra.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng for
allocation of tend.

check_open.f90


Output of total array size was adapted to nbgp.

data_output_2d.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng in loops and
allocation of arrays local_2d and total_2d.
Calls of exchange_horiz are modified.

data_output_3d.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng in loops and
allocation of arrays. Calls of exchange_horiz are modified.
Skip-value skip_do_avs changed to a dynamic adaptation of the ghost points.

data_output_mask.f90


Calls of exchange_horiz are modified.

diffusion_e.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng

diffusion_s.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng

diffusion_u.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng

diffusion_v.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng

diffusion_w.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng

diffusivities.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng.
Calls of exchange_horiz are modified.

exchange_horiz_2d.f90


Dynamic exchange of ghost points with nbgp, which depends on the advection
scheme. The exchange between the left and right PEs is now done with MPI
vectors.

global_min_max.f90


Adaptation of the index arrays, because MINLOC assumes a lower bound of 1
rather than the declared bound including the nbgp ghost points; see the
sketch below.
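A short illustration of the point (array bounds assumed):

      SUBROUTINE minloc_offset_sketch( a, nys, nyn, nxl, nxr, nbgp, j, i )
         IMPLICIT NONE
         INTEGER, INTENT(IN)  :: nys, nyn, nxl, nxr, nbgp
         REAL,    INTENT(IN)  :: a(nys-nbgp:nyn+nbgp,nxl-nbgp:nxr+nbgp)
         INTEGER, INTENT(OUT) :: j, i
         INTEGER :: loc(2)
!
!--      MINLOC counts from 1 regardless of the declared lower bounds,
!--      so the ghost-point offset has to be added back
         loc = MINLOC( a )
         j = loc(1) + ( nys - nbgp ) - 1
         i = loc(2) + ( nxl - nbgp ) - 1
      END SUBROUTINE minloc_offset_sketch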

init_3d_model.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng in loops and
in the allocation of arrays. Calls of exchange_horiz are modified.
Call of ws_init to initialize the arrays needed for the statistical
evaluation and for optimization when the ws-scheme is used.
The initial volume flow is now calculated using the variable hom_sum.
Therefore, the correction of the initial volume flow for non-flat
topography is removed (u_nzb_p1_for_vfc and v_nzb_p1_for_vfc removed).
Changed the surface boundary conditions for u and v in case of
ibc_uv_b == 0 from mirror bc to Dirichlet boundary conditions (u = v = 0),
so that k = nzb is representative for the height z0.

Bugfix: type conversion of '1' to 64 bit for the MAX function
(ngp_3d_inner); see the sketch below.
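The bugfix concerns the kind rules of MAX; mixing a default-kind literal
with a 64-bit integer may be rejected or silently narrowed, hence the
explicit conversion. A minimal illustration (kind parameter name assumed):

      PROGRAM max_kind_sketch
         IMPLICIT NONE
         INTEGER, PARAMETER :: idp = SELECTED_INT_KIND( 12 )  ! 64-bit range
         INTEGER(idp) :: ngp_3d_inner
         ngp_3d_inner = 0_idp
!
!--      MAX wants arguments of the same kind, so the literal 1 is written
!--      as a 64-bit constant instead of the default-kind 1
         ngp_3d_inner = MAX( 1_idp, ngp_3d_inner )
         PRINT *, ngp_3d_inner
      END PROGRAM max_kind_sketch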

init_coupling.f90


Determination of the target_id's moved to init_pegrid.

init_pt_anomaly.f90


Calls of exchange_horiz are modified.

init_rankine.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng.
Calls of exchange_horiz are modified.

init_slope.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng.

header.f90


Output of advection scheme.

poismg.f90


Calls of exchange_horiz are modified.

prandtl_fluxes.f90


Changed the surface boundary conditions for u and v from mirror bc to
Dirichlet bc; therefore u(nzb,:,:) and v(nzb,:,:) are now representative
for the height z0.
nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng

production_e.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng

read_3d_binary.f90


+/- 1 replaced with +/- nbgp when swapping and allocating variables.

sor.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng.
Calls of exchange_horiz are modified.
Bug removed in the declaration of ddzw(): nz replaced by nzt+1.

subsidence.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng.

sum_up_3d_data.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng.

surface_coupler.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng in
MPI_SEND() and MPI_RECV().
An additional case for non-equivalent processor and grid topologies in
ocean and atmosphere added (coupling_topology = 1).

Added exchange of u and v from the ocean to the atmosphere.

time_integration.f90


Calls of exchange_horiz are modified.
Adaptation to sloping surface.

timestep.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng.

user_3d_data_averaging.f90, user_data_output_2d.f90, user_data_output_3d.f90,
user_actions.f90, user_init.f90, user_init_plant_canopy.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng.

user_read_restart_data.f90


Allocation with nbgp.

wall_fluxes.f90


nxl-1, nxr+1, nys-1, nyn+1 replaced by nxlg, nxrg, nysg, nyng.

write_compressed.f90


Array bounds and nx, ny adapted with nbgp.

Location:
palm/trunk/SOURCE
Files:
2 edited

  • palm/trunk/SOURCE

    • Property svn:mergeinfo set to:
      /palm/branches/suehring:423-666
      /palm/branches/letzel/masked_output/SOURCE:296-409
  • palm/trunk/SOURCE/check_parameters.f90 (r601 → r667)

@@ -4 +4 @@
 ! Current revisions:
 ! -----------------
+!
 !
+! Exchange of parameters between ocean and atmosphere via PE0
+! Check for illegal combination of ws-scheme and timestep scheme.
+! Check for topography and ws-scheme.
+! Check for not cyclic boundary conditions in combination with ws-scheme and
+! loop_optimization = 'vector'.
+! Check for call_psolver_at_all_substeps and ws-scheme for momentum_advec.
+!
+! Different processor/grid topology in atmosphere and ocean is now allowed!
+!
+! Bugfixes in checking for conserve_volume_flow_mode
+!
+! Adapt error messages.
 !
 ! Former revisions:
     
@@ -180 +193 @@
 !
 !-- Check dt_coupling, restart_time, dt_restart, end_time, dx, dy, nx and ny
-    IF ( coupling_mode /= 'uncoupled' )  THEN
+    IF ( coupling_mode /= 'uncoupled')  THEN
 
       IF ( dt_coupling == 9999999.9 )  THEN
     
@@ -189 +202 @@
 
 #if defined( __parallel )
-       CALL MPI_SEND( dt_coupling, 1, MPI_REAL, target_id, 11, comm_inter, &
-                      ierr )
-       CALL MPI_RECV( remote, 1, MPI_REAL, target_id, 11, comm_inter, &
-                      status, ierr )
+       IF ( myid == 0 ) THEN
+          CALL MPI_SEND( dt_coupling, 1, MPI_REAL, target_id, 11, comm_inter, &
+                         ierr )
+          CALL MPI_RECV( remote, 1, MPI_REAL, target_id, 11, comm_inter, &
+                         status, ierr )
+       ENDIF
+       CALL MPI_BCAST( remote, 1, MPI_REAL, 0, comm2d, ierr)
+
       IF ( dt_coupling /= remote )  THEN
          WRITE( message_string, * ) 'coupling mode "', TRIM( coupling_mode ), &
     
@@ -200 +217 @@
       ENDIF
       IF ( dt_coupling <= 0.0 )  THEN
-          CALL MPI_SEND( dt_max, 1, MPI_REAL, target_id, 19, comm_inter, ierr )
-          CALL MPI_RECV( remote, 1, MPI_REAL, target_id, 19, comm_inter, &
-                         status, ierr )
+          IF ( myid == 0 ) THEN
+             CALL MPI_SEND( dt_max, 1, MPI_REAL, target_id, 19, comm_inter, ierr )
+             CALL MPI_RECV( remote, 1, MPI_REAL, target_id, 19, comm_inter, &
+                            status, ierr )
+          ENDIF
+          CALL MPI_BCAST( remote, 1, MPI_REAL, 0, comm2d, ierr)
+
          dt_coupling = MAX( dt_max, remote )
          WRITE( message_string, * ) 'coupling mode "', TRIM( coupling_mode ), &
     
@@ -209 +230 @@
          CALL message( 'check_parameters', 'PA0005', 0, 1, 0, 6, 0 )
       ENDIF
-
-       CALL MPI_SEND( restart_time, 1, MPI_REAL, target_id, 12, comm_inter, &
-                      ierr )
-       CALL MPI_RECV( remote, 1, MPI_REAL, target_id, 12, comm_inter, &
-                      status, ierr )
+       IF ( myid == 0 ) THEN
+          CALL MPI_SEND( restart_time, 1, MPI_REAL, target_id, 12, comm_inter, &
+                         ierr )
+          CALL MPI_RECV( remote, 1, MPI_REAL, target_id, 12, comm_inter, &
+                         status, ierr )
+       ENDIF
+       CALL MPI_BCAST( remote, 1, MPI_REAL, 0, comm2d, ierr)
+
       IF ( restart_time /= remote )  THEN
          WRITE( message_string, * ) 'coupling mode "', TRIM( coupling_mode ), &
     
@@ -220 +244 @@
          CALL message( 'check_parameters', 'PA0006', 1, 2, 0, 6, 0 )
       ENDIF
-
-       CALL MPI_SEND( dt_restart, 1, MPI_REAL, target_id, 13, comm_inter, &
-                      ierr )
-       CALL MPI_RECV( remote, 1, MPI_REAL, target_id, 13, comm_inter, &
-                      status, ierr )
+       IF ( myid == 0 ) THEN
+          CALL MPI_SEND( dt_restart, 1, MPI_REAL, target_id, 13, comm_inter, &
+                         ierr )
+          CALL MPI_RECV( remote, 1, MPI_REAL, target_id, 13, comm_inter, &
+                         status, ierr )
+       ENDIF
+       CALL MPI_BCAST( remote, 1, MPI_REAL, 0, comm2d, ierr)
+
       IF ( dt_restart /= remote )  THEN
          WRITE( message_string, * ) 'coupling mode "', TRIM( coupling_mode ), &
     
@@ -233 +260 @@
 
       simulation_time_since_reference = end_time - coupling_start_time
-       CALL MPI_SEND( simulation_time_since_reference, 1, MPI_REAL, target_id, &
-                      14, comm_inter, ierr )
-       CALL MPI_RECV( remote, 1, MPI_REAL, target_id, 14, comm_inter, &
-                      status, ierr )
+       IF ( myid == 0 ) THEN
+          CALL MPI_SEND( simulation_time_since_reference, 1, MPI_REAL, target_id, &
+                         14, comm_inter, ierr )
+          CALL MPI_RECV( remote, 1, MPI_REAL, target_id, 14, comm_inter, &
+                         status, ierr )
+       ENDIF
+       CALL MPI_BCAST( remote, 1, MPI_REAL, 0, comm2d, ierr)
+
       IF ( simulation_time_since_reference /= remote )  THEN
          WRITE( message_string, * ) 'coupling mode "', TRIM( coupling_mode ), &
     
@@ -245 +276 @@
       ENDIF
 
-       CALL MPI_SEND( dx, 1, MPI_REAL, target_id, 15, comm_inter, ierr )
-       CALL MPI_RECV( remote, 1, MPI_REAL, target_id, 15, comm_inter, &
-                      status, ierr )
-       IF ( dx /= remote )  THEN
-          WRITE( message_string, * ) 'coupling mode "', TRIM( coupling_mode ), &
-                 '":  dx = ', dx, '& is not equal to dx_remote = ', remote
-          CALL message( 'check_parameters', 'PA0009', 1, 2, 0, 6, 0 )
-       ENDIF
-
-       CALL MPI_SEND( dy, 1, MPI_REAL, target_id, 16, comm_inter, ierr )
-       CALL MPI_RECV( remote, 1, MPI_REAL, target_id, 16, comm_inter, &
-                      status, ierr )
-       IF ( dy /= remote )  THEN
-          WRITE( message_string, * ) 'coupling mode "', TRIM( coupling_mode ), &
-                 '":  dy = ', dy, '& is not equal to dy_remote = ', remote
-          CALL message( 'check_parameters', 'PA0010', 1, 2, 0, 6, 0 )
-       ENDIF
-
-       CALL MPI_SEND( nx, 1, MPI_INTEGER, target_id, 17, comm_inter, ierr )
-       CALL MPI_RECV( iremote, 1, MPI_INTEGER, target_id, 17, comm_inter, &
-                      status, ierr )
-       IF ( nx /= iremote )  THEN
-          WRITE( message_string, * ) 'coupling mode "', TRIM( coupling_mode ), &
-                 '": nx = ', nx, '& is not equal to nx_remote = ', iremote
-          CALL message( 'check_parameters', 'PA0011', 1, 2, 0, 6, 0 )
-       ENDIF
-
-       CALL MPI_SEND( ny, 1, MPI_INTEGER, target_id, 18, comm_inter, ierr )
-       CALL MPI_RECV( iremote, 1, MPI_INTEGER, target_id, 18, comm_inter, &
-                      status, ierr )
-       IF ( ny /= iremote )  THEN
-          WRITE( message_string, * ) 'coupling mode "', TRIM( coupling_mode ), &
-                 '": ny = ', ny, '& is not equal to ny_remote = ', iremote
-          CALL message( 'check_parameters', 'PA0012', 1, 2, 0, 6, 0 )
+
+       IF ( myid == 0 ) THEN
+          CALL MPI_SEND( dx, 1, MPI_REAL, target_id, 15, comm_inter, ierr )
+          CALL MPI_RECV( remote, 1, MPI_REAL, target_id, 15, comm_inter, &
+                                                             status, ierr )
+       ENDIF
+       CALL MPI_BCAST( remote, 1, MPI_REAL, 0, comm2d, ierr)
+
+
+       IF ( coupling_mode == 'atmosphere_to_ocean') THEN
+
+          IF ( dx < remote ) THEN
+             WRITE( message_string, * ) 'coupling mode "', &
+                   TRIM( coupling_mode ),                  &
+           '": dx in Atmosphere is not equal to or not larger then dx in ocean'
+             CALL message( 'check_parameters', 'PA0009', 1, 2, 0, 6, 0 )
+          ENDIF
+
+          IF ( (nx_a+1)*dx /= (nx_o+1)*remote )  THEN
+             WRITE( message_string, * ) 'coupling mode "', &
+                    TRIM( coupling_mode ), &
+             '": Domain size in x-direction is not equal in ocean and atmosphere'
+             CALL message( 'check_parameters', 'PA0010', 1, 2, 0, 6, 0 )
+          ENDIF
+
+       ENDIF
+
+       IF ( myid == 0) THEN
+          CALL MPI_SEND( dy, 1, MPI_REAL, target_id, 16, comm_inter, ierr )
+          CALL MPI_RECV( remote, 1, MPI_REAL, target_id, 16, comm_inter, &
+                         status, ierr )
+       ENDIF
+       CALL MPI_BCAST( remote, 1, MPI_REAL, 0, comm2d, ierr)
+
+       IF ( coupling_mode == 'atmosphere_to_ocean') THEN
+
+          IF ( dy < remote )  THEN
+             WRITE( message_string, * ) 'coupling mode "', &
+                    TRIM( coupling_mode ), &
+                '": dy in Atmosphere is not equal to or not larger then dy in ocean'
+             CALL message( 'check_parameters', 'PA0011', 1, 2, 0, 6, 0 )
+          ENDIF
+
+          IF ( (ny_a+1)*dy /= (ny_o+1)*remote )  THEN
+             WRITE( message_string, * ) 'coupling mode "', &
+                   TRIM( coupling_mode ), &
+             '": Domain size in y-direction is not equal in ocean and atmosphere'
+             CALL message( 'check_parameters', 'PA0012', 1, 2, 0, 6, 0 )
+          ENDIF
+
+          IF ( MOD(nx_o+1,nx_a+1) /= 0 )  THEN
+             WRITE( message_string, * ) 'coupling mode "', &
+                   TRIM( coupling_mode ), &
+             '": nx+1 in ocean is not divisible without remainder with nx+1 in', &
+             ' atmosphere'
+             CALL message( 'check_parameters', 'PA0339', 1, 2, 0, 6, 0 )
+          ENDIF
+
+          IF ( MOD(ny_o+1,ny_a+1) /= 0 )  THEN
+             WRITE( message_string, * ) 'coupling mode "', &
+                   TRIM( coupling_mode ), &
+             '": ny+1 in ocean is not divisible without remainder with ny+1 in', &
+             ' atmosphere'
+             CALL message( 'check_parameters', 'PA0340', 1, 2, 0, 6, 0 )
+          ENDIF
+
       ENDIF
 #else
     
@@ -290 +353 @@
 !
 !-- Exchange via intercommunicator
-    IF ( coupling_mode == 'atmosphere_to_ocean' )  THEN
+    IF ( coupling_mode == 'atmosphere_to_ocean' .AND. myid == 0 )  THEN
        CALL MPI_SEND( humidity, 1, MPI_LOGICAL, target_id, 19, comm_inter, &
                       ierr )
-    ELSEIF ( coupling_mode == 'ocean_to_atmosphere' )  THEN
+    ELSEIF ( coupling_mode == 'ocean_to_atmosphere' .AND. myid == 0)  THEN
        CALL MPI_RECV( humidity_remote, 1, MPI_LOGICAL, target_id, 19, &
                       comm_inter, status, ierr )
     ENDIF
+    CALL MPI_BCAST( humidity_remote, 1, MPI_LOGICAL, 0, comm2d, ierr)
+
 #endif
 
     
@@ -372 +437 @@
          CALL message( 'check_parameters', 'PA0014', 1, 2, 0, 6, 0 )
       ENDIF
+       IF ( momentum_advec == 'ws-scheme' .OR. scalar_advec == 'ws-scheme' ) &
+       THEN
+          message_string = 'topography is still not allowed with ' // &
+                           'momentum_advec = "' // TRIM( momentum_advec ) //  &
+                           '"or scalar_advec = "' // TRIM( scalar_advec ) //'"'
+!         message number still needs modification
+          CALL message( 'check_parameters', 'PA0341', 1, 2, 0, 6, 0 )
+       END IF
+
 !
 !--    In case of non-flat topography, check whether the convention how to
     
@@ -492 +566 @@
       CALL message( 'check_parameters', 'PA0021', 1, 2, 0, 6, 0 )
    ENDIF
-
+
+    IF ( momentum_advec == 'ws-scheme' .AND. &
+         call_psolver_at_all_substeps == .FALSE. ) THEN
+       message_string = 'psolver must be called at each RK3 substep when "'//&
+                        TRIM(momentum_advec) // ' "is used for momentum_advec'
+       CALL message( 'check_parameters', 'PA0343', 1, 2, 0, 6, 0 )
+    END IF
 !
 !-- Advection schemes:
-    IF ( momentum_advec /= 'pw-scheme' .AND. momentum_advec /= 'ups-scheme' ) &
-    THEN
+    IF ( momentum_advec /= 'pw-scheme' .AND. momentum_advec /= 'ws-scheme' .AND. &
+         momentum_advec /= 'ups-scheme' ) THEN
       message_string = 'unknown advection scheme: momentum_advec = "' // &
                        TRIM( momentum_advec ) // '"'
       CALL message( 'check_parameters', 'PA0022', 1, 2, 0, 6, 0 )
    ENDIF
-    IF ( ( momentum_advec == 'ups-scheme'  .OR.  scalar_advec == 'ups-scheme' )&
-                                      .AND.  timestep_scheme /= 'euler' )  THEN
-       message_string = 'momentum_advec = "' // TRIM( momentum_advec ) // &
-                        '" is not allowed with timestep_scheme = "' //    &
-                        TRIM( timestep_scheme ) // '"'
+    IF ( (( momentum_advec == 'ups-scheme'  .OR.  scalar_advec == 'ups-scheme' )&
+         .AND.  timestep_scheme /= 'euler' ) .OR. (( momentum_advec == 'ws-scheme'&
+         .OR.  scalar_advec == 'ws-scheme') .AND. (timestep_scheme == 'euler' .OR. &
+         timestep_scheme == 'leapfrog+euler' .OR. timestep_scheme == 'leapfrog'    &
+         .OR. timestep_scheme == 'runge-kutta-2'))) THEN
+       message_string = 'momentum_advec or scalar_advec = "' &
+         // TRIM( momentum_advec ) // '" is not allowed with timestep_scheme = "' // &
+         TRIM( timestep_scheme ) // '"'
       CALL message( 'check_parameters', 'PA0023', 1, 2, 0, 6, 0 )
    ENDIF
-
-    IF ( scalar_advec /= 'pw-scheme'  .AND.  scalar_advec /= 'bc-scheme'  .AND.&
-         scalar_advec /= 'ups-scheme' )  THEN
+    IF ( scalar_advec /= 'pw-scheme'  .AND.  scalar_advec /= 'ws-scheme' .AND. &
+         scalar_advec /= 'bc-scheme'  .AND.  scalar_advec /= 'ups-scheme' )  THEN
       message_string = 'unknown advection scheme: scalar_advec = "' // &
                        TRIM( scalar_advec ) // '"'
     
@@ -563 +645 @@
    ENDIF
 
-    IF ( momentum_advec /= 'pw-scheme' .AND. timestep_scheme(1:5) == 'runge' ) &
-    THEN
+    IF ( ( momentum_advec /= 'pw-scheme' .AND. momentum_advec /= 'ws-scheme' ) &
+         .AND. timestep_scheme(1:5) == 'runge' ) THEN
       message_string = 'momentum advection scheme "' // &
                        TRIM( momentum_advec ) // '" & does not work with ' // &
     
@@ -712 +794 @@
          ug_vertical_gradient_level_ind(1) = nzt+1
          ug(nzt+1) = ug_surface
-          DO  k = nzt, 0, -1
+          DO  k = nzt, nzb, -1
             IF ( i < 11 ) THEN
                IF ( ug_vertical_gradient_level(i) > zu(k)  .AND. &
     
@@ -778 +860 @@
          vg_vertical_gradient_level_ind(1) = nzt+1
          vg(nzt+1) = vg_surface
-          DO  k = nzt, 0, -1
+          DO  k = nzt, nzb, -1
             IF ( i < 11 ) THEN
                IF ( vg_vertical_gradient_level(i) > zu(k)  .AND. &
     
@@ -1020 +1102 @@
 
 
+
 !
 !-- Compute Coriolis parameter
     
@@ -1159 +1242 @@
 !
 !-- Non-cyclic lateral boundaries require the multigrid method and Piascek-
-!-- Willimas advection scheme. Several schemes and tools do not work with
-!-- non-cyclic boundary conditions.
+!-- Willimas or Wicker - Skamarock advection scheme. Several schemes
+!-- and tools do not work with non-cyclic boundary conditions.
    IF ( bc_lr /= 'cyclic'  .OR.  bc_ns /= 'cyclic' )  THEN
       IF ( psolver /= 'multigrid' )  THEN
     
@@ -1167 +1250 @@
          CALL message( 'check_parameters', 'PA0051', 1, 2, 0, 6, 0 )
       ENDIF
-       IF ( momentum_advec /= 'pw-scheme' )  THEN
+       IF ( momentum_advec /= 'pw-scheme' .AND. &
+            momentum_advec /= 'ws-scheme')  THEN
          message_string = 'non-cyclic lateral boundaries do not allow ' // &
                           'momentum_advec = "' // TRIM( momentum_advec ) // '"'
          CALL message( 'check_parameters', 'PA0052', 1, 2, 0, 6, 0 )
       ENDIF
-       IF ( scalar_advec /= 'pw-scheme' )  THEN
+       IF ( scalar_advec /= 'pw-scheme' .AND. &
+            scalar_advec /= 'ws-scheme' )  THEN
          message_string = 'non-cyclic lateral boundaries do not allow ' // &
                           'scalar_advec = "' // TRIM( scalar_advec ) // '"'
          CALL message( 'check_parameters', 'PA0053', 1, 2, 0, 6, 0 )
       ENDIF
+       IF ( ( scalar_advec == 'ws-scheme' .OR. momentum_advec == 'ws-scheme' ) &
+            .AND. loop_optimization == 'vector' ) THEN
+          message_string = 'non-cyclic lateral boundaries do not allow ' // &
+                           'loop_optimization = vector and ' //  &
+                           'scalar_advec = "' // TRIM( scalar_advec ) // '"'
+!         The error message number still needs modification.
+          CALL message( 'check_parameters', 'PA0342', 1, 2, 0, 6, 0 )
+       END IF
       IF ( galilei_transformation )  THEN
          message_string = 'non-cyclic lateral boundaries do not allow ' // &
     
@@ -1407 +1500 @@
                        TRIM( bc_uv_b ) // '"'
       CALL message( 'check_parameters', 'PA0076', 1, 2, 0, 6, 0 )
+    ENDIF
+!
+!-- In case of coupled simulations u and v at the ground in atmosphere will be
+!-- assigned with the u and v values of the ocean surface
+    IF ( coupling_mode == 'atmosphere_to_ocean' )  THEN
+       ibc_uv_b = 2
    ENDIF
 
     
@@ -2109 +2208 @@
             hom(:,2,57,:) = SPREAD( zu, 2, statistic_regions+1 )
 
+
          CASE ( 'u"pt"' )
             dopr_index(i) = 58
     
@@ -2244 +2344 @@
 
       END SELECT
+
 !
 !--    Check to which of the predefined coordinate systems the profile belongs
     
@@ -2584 +2685 @@
 !-- Upper plot limit (grid point value) for 1D profiles
    IF ( z_max_do1d == -1.0 )  THEN
+
       nz_do1d = nzt+1
+
    ELSE
       DO  k = nzb+1, nzt+1
     
@@ -2737 +2840 @@
 
 !
+
 !-- Check netcdf precison
    ldum = .FALSE.
     
@@ -3070 +3174 @@
    IF ( conserve_volume_flow )  THEN
       IF ( TRIM( conserve_volume_flow_mode ) == 'default' )  THEN
-          IF ( bc_lr /= 'cyclic'  .OR.  bc_ns /= 'cyclic' )  THEN
-             conserve_volume_flow_mode = 'inflow_profile'
-          ELSE
-             conserve_volume_flow_mode = 'initial_profiles'
-          ENDIF
+
+          conserve_volume_flow_mode = 'initial_profiles'
+
       ELSEIF ( TRIM( conserve_volume_flow_mode ) /= 'initial_profiles' .AND.  &
            TRIM( conserve_volume_flow_mode ) /= 'inflow_profile' .AND.  &
     
@@ -3082 +3184 @@
          CALL message( 'check_parameters', 'PA0154', 1, 2, 0, 6, 0 )
       ENDIF
-       IF ( ( bc_lr /= 'cyclic'  .OR.  bc_ns /= 'cyclic' ) .AND. &
-            TRIM( conserve_volume_flow_mode ) /= 'inflow_profile' )  THEN
-          WRITE( message_string, * )  'noncyclic boundary conditions ', &
-               'require & conserve_volume_flow_mode = ''inflow_profile'''
+       IF ( ( bc_lr /= 'cyclic'  .OR.  bc_ns /= 'cyclic' ) .AND. &
+            TRIM( conserve_volume_flow_mode ) == 'bulk_velocity' )  THEN
+          WRITE( message_string, * )  'non-cyclic boundary conditions ', &
+               'require  conserve_volume_flow_mode = ''initial_profiles'''
          CALL message( 'check_parameters', 'PA0155', 1, 2, 0, 6, 0 )
       ENDIF
     
@@ -3091 +3193 @@
            TRIM( conserve_volume_flow_mode ) == 'inflow_profile' )  THEN
          WRITE( message_string, * )  'cyclic boundary conditions ', &
-               'require & conserve_volume_flow_mode = ''initial_profiles''', &
+               'require conserve_volume_flow_mode = ''initial_profiles''', &
               ' or ''bulk_velocity'''
          CALL message( 'check_parameters', 'PA0156', 1, 2, 0, 6, 0 )
     
@@ -3100 +3202 @@
         TRIM( conserve_volume_flow_mode ) /= 'bulk_velocity' ) )  THEN
       WRITE( message_string, * )  'nonzero bulk velocity requires ', &
-            'conserve_volume_flow = .T. and & ', &
+            'conserve_volume_flow = .T. and ', &
            'conserve_volume_flow_mode = ''bulk_velocity'''
       CALL message( 'check_parameters', 'PA0157', 1, 2, 0, 6, 0 )
     
@@ -3139 +3241 @@
 
 
+
 END SUBROUTINE check_parameters