source: palm/tags/release-3.4a/SOURCE/exchange_horiz.f90 @ 818

Last change on this file since 818 was 77, checked in by raasch, 17 years ago

New:
---

particle reflection from vertical walls implemented, particle SGS model adjusted to walls

Wall functions for vertical walls now include diabatic conditions. New subroutines wall_fluxes, wall_fluxes_e. New 4D-array rif_wall.

new d3par-parameter netcdf_64bit_3d to switch on 64bit offset only for 3D files

new d3par-parameter dt_max to define the maximum value for the allowed timestep

new inipar-parameter loop_optimization to control the loop optimization method

new inipar-parameter pt_reference. If given, this value is used as the reference temperature in the buoyancy terms (otherwise, the instantaneous horizontally averaged temperature is used).

new user interface user_advec_particles

new initializing action "by_user" calls user_init_3d_model and allows the initial setting of all 3d arrays

topography height information is stored in the arrays zu_s_inner and zw_w_inner and output to the 2d/3d NetCDF files

samples added to the user interface which show how to add user-defined time series quantities.

calculation/output of precipitation amount, precipitation rate and z0 (by setting "pra*", "prr*", "z0*" with data_output). The time interval on which the precipitation amount is defined is set by new d3par-parameter precipitation_amount_interval

unit 9 opened for debug output (file DEBUG_<pe#>)

Makefile, advec_particles, average_3d_data, buoyancy, calc_precipitation, check_open, check_parameters, data_output_2d, diffusion_e, diffusion_u, diffusion_v, diffusion_w, diffusivities, header, impact_of_latent_heat, init_particles, init_3d_model, modules, netcdf, parin, production_e, read_var_list, read_3d_binary, sum_up_3d_data, user_interface, write_var_list, write_3d_binary

New: wall_fluxes

Changed:


General revision of non-cyclic horizontal boundary conditions: radiation boundary conditions are now used instead of Neumann conditions at the outflow (the calculation needs velocity values at t-dt, which are stored in the new arrays u_m_l, u_m_r, etc.). Calculation of the mean outflow is no longer needed. Volume flow control has been added for the outflow boundary (currently only for the north boundary!). The additional gridpoints along x and y (uxrp, vynp) are no longer needed. Routine "boundary_conds" now operates on timelevel t+dt and is no longer split into two parts (main, uvw_outflow). Neumann boundary conditions are applied at the inflow/outflow in case of non-cyclic boundary conditions for all 2d-arrays that are handled by exchange_horiz_2d.
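The radiation outflow update described above can be sketched as follows. This is an illustrative 1-D Orlanski-style example in plain Python, not the PALM implementation; the names `u`, `u_prev`, and `c_max` are hypothetical, with `u_prev` playing the role of the stored t-dt values (u_m_r etc. in PALM).

```python
# Illustrative sketch of a 1-D radiation outflow boundary,
# du/dt + c du/dx = 0, discretized at the right boundary point.
# u_prev holds the field at t-dt (the role of u_m_r in PALM).

def radiation_outflow_right(u, u_prev, dt, dx, c_max):
    """Return the new boundary value u[-1] at t+dt.

    The phase velocity c is estimated Orlanski-style from the previous
    timestep at the gridpoint next to the boundary, then clipped to
    [0, c_max] so that information can only leave the domain.
    """
    denom = u_prev[-2] - u_prev[-3]
    if denom != 0.0:
        c = -(u[-2] - u_prev[-2]) / denom * dx / dt
    else:
        c = c_max
    c = min(max(c, 0.0), c_max)
    # Upwind update of the boundary point using interior information
    return u[-1] - c * dt / dx * (u[-1] - u[-2])

# Example: a uniform field is left unchanged at the boundary
u_prev = [1.0, 1.0, 1.0, 1.0]
u      = [1.0, 1.0, 1.0, 1.0]
print(radiation_outflow_right(u, u_prev, dt=0.1, dx=1.0, c_max=5.0))  # 1.0
```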

The FFT-method for solving the Poisson-equation is now working with Neumann boundary conditions both at the bottom and the top. This requires adjustments of the tridiagonal coefficients and subtracting the horizontally averaged mean from the vertical velocity field.
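The mean-subtraction step mentioned above can be sketched as follows. With Neumann conditions at both bottom and top, the Poisson problem is only solvable if the net vertical mass flux per level vanishes, so the horizontal average is removed from w at every level. A minimal plain-Python illustration (hypothetical list-of-lists layout `w[k][j][i]`, not the PALM arrays):

```python
# Sketch: remove the horizontally averaged mean from the vertical
# velocity field w[k][j][i], level by level, so that each level has
# zero horizontal mean (solvability of the Neumann-Neumann Poisson
# problem).  Illustrative only.

def subtract_horizontal_mean(w):
    """Subtract the horizontal average from each vertical level, in place."""
    for k, level in enumerate(w):
        values = [v for row in level for v in row]
        mean = sum(values) / len(values)
        w[k] = [[v - mean for v in row] for row in level]
    return w

w = [[[1.0, 3.0], [5.0, 7.0]],   # level 0, horizontal mean 4.0
     [[2.0, 2.0], [2.0, 2.0]]]   # level 1, horizontal mean 2.0
subtract_horizontal_mean(w)
print(w[0])  # [[-3.0, -1.0], [1.0, 3.0]]
```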

+age_m in particle_type

Particles-package is now part of the default code ("-p particles" is not needed any more).

Call of user_actions( 'after_integration' ) moved below the increment of times and counters. user_actions is now called for each statistic region and has the number of the respective region (sr) as an argument.

d3par-parameter data_output_ts removed. Timeseries output for "profil" removed. Timeseries are now switched on by dt_dots. Timeseries data is collected in flow_statistics.

Initial velocities at nzb+1 are regarded for volume flow control in case they have been set to zero before (to avoid small timesteps); see the new internal parameters u/v_nzb_p1_for_vfc.

q is not allowed to become negative (prognostic_equations).

poisfft_init is only called if fft-solver is switched on (init_pegrid).

d3par-parameter moisture renamed to humidity.

Subversion global revision number is read from mrun and added to the run description header and to the run control (_rc) file.

vtk directives removed from main program.

The utility routine interpret_config reads PALM environment variables from a NAMELIST instead of using the system call GETENV.

advec_u_pw, advec_u_up, advec_v_pw, advec_v_up, asselin_filter, check_parameters, coriolis, data_output_dvrp, data_output_ptseries, data_output_ts, data_output_2d, data_output_3d, diffusion_u, diffusion_v, exchange_horiz, exchange_horiz_2d, flow_statistics, header, init_grid, init_particles, init_pegrid, init_rankine, init_pt_anomaly, init_1d_model, init_3d_model, modules, palm, package_parin, parin, poisfft, poismg, prandtl_fluxes, pres, production_e, prognostic_equations, read_var_list, read_3d_binary, sor, swap_timelevel, time_integration, write_var_list, write_3d_binary

Errors:


Bugfix: preset of tendencies te_em, te_um, te_vm in init_1d_model

Bugfix in sample for reading user defined data from restart file (user_init)

Bugfix in setting diffusivities for cases with the outflow damping layer extending over more than one subdomain (init_3d_model)

Check for possible negative humidities in the initial humidity profile.

In Makefile, default suffixes removed from the suffix list to avoid calling of m2c in case of .mod files.

Makefile
check_parameters, init_1d_model, init_3d_model, user_interface

  • Property svn:keywords set to Id
File size: 3.9 KB
 SUBROUTINE exchange_horiz( ar )

!------------------------------------------------------------------------------!
! Actual revisions:
! -----------------
!
!
! Former revisions:
! -----------------
! $Id: exchange_horiz.f90 77 2007-03-29 04:26:56Z maronga $
!
! 75 2007-03-22 09:54:05Z raasch
! Special cases for additional gridpoints along x or y in case of non-cyclic
! boundary conditions are not regarded any more
!
! RCS Log replace by Id keyword, revision history cleaned up
!
! Revision 1.16  2006/02/23 12:19:08  raasch
! anz_yz renamed ngp_yz
!
! Revision 1.1  1997/07/24 11:13:29  raasch
! Initial revision
!
!
! Description:
! ------------
! Exchange of lateral boundary values (parallel computers) and cyclic
! lateral boundary conditions, respectively.
!------------------------------------------------------------------------------!

    USE control_parameters
    USE cpulog
    USE indices
    USE interfaces
    USE pegrid

    IMPLICIT NONE

#if defined( __parallel )
    INTEGER, DIMENSION(4)                 ::  req
    INTEGER, DIMENSION(MPI_STATUS_SIZE,4) ::  wait_stat
#endif

    REAL ::  ar(nzb:nzt+1,nys-1:nyn+1,nxl-1:nxr+1)


    CALL cpu_log( log_point_s(2), 'exchange_horiz', 'start' )

#if defined( __parallel )

!
!-- Exchange of lateral boundary values for parallel computers
    IF ( pdims(1) == 1  .OR.  mg_switch_to_pe0 )  THEN
!
!--    One-dimensional decomposition along y, boundary values can be exchanged
!--    within the PE memory
       IF ( bc_lr == 'cyclic' )  THEN
          ar(:,nys:nyn,nxl-1) = ar(:,nys:nyn,nxr)
          ar(:,nys:nyn,nxr+1) = ar(:,nys:nyn,nxl)
       ENDIF

    ELSE

       req = 0
!
!--    Send left boundary, receive right one
       CALL MPI_ISEND(                                                     &
               ar(nzb,nys-1,nxl), ngp_yz(grid_level), MPI_REAL, pleft,  0, &
                          comm2d, req(1), ierr )
       CALL MPI_IRECV(                                                       &
               ar(nzb,nys-1,nxr+1), ngp_yz(grid_level), MPI_REAL, pright, 0, &
                          comm2d, req(2), ierr )
!
!--    Send right boundary, receive left one
       CALL MPI_ISEND(                                                     &
               ar(nzb,nys-1,nxr), ngp_yz(grid_level), MPI_REAL, pright, 1, &
                          comm2d, req(3), ierr )
       CALL MPI_IRECV(                                                       &
               ar(nzb,nys-1,nxl-1), ngp_yz(grid_level), MPI_REAL, pleft,  1, &
                          comm2d, req(4), ierr )
       CALL MPI_WAITALL( 4, req, wait_stat, ierr )

    ENDIF


    IF ( pdims(2) == 1  .OR.  mg_switch_to_pe0 )  THEN
!
!--    One-dimensional decomposition along x, boundary values can be exchanged
!--    within the PE memory
       IF ( bc_ns == 'cyclic' )  THEN
          ar(:,nys-1,:) = ar(:,nyn,:)
          ar(:,nyn+1,:) = ar(:,nys,:)
       ENDIF

    ELSE

       req = 0
!
!--    Send front boundary, receive rear one
       CALL MPI_ISEND( ar(nzb,nys,nxl-1),   1, type_xz(grid_level), psouth, 0, &
                       comm2d, req(1), ierr )
       CALL MPI_IRECV( ar(nzb,nyn+1,nxl-1), 1, type_xz(grid_level), pnorth, 0, &
                       comm2d, req(2), ierr )
!
!--    Send rear boundary, receive front one
       CALL MPI_ISEND( ar(nzb,nyn,nxl-1),   1, type_xz(grid_level), pnorth, 1, &
                       comm2d, req(3), ierr )
       CALL MPI_IRECV( ar(nzb,nys-1,nxl-1), 1, type_xz(grid_level), psouth, 1, &
                       comm2d, req(4), ierr )
       CALL MPI_WAITALL( 4, req, wait_stat, ierr )

    ENDIF


#else

!
!-- Lateral boundary conditions in the non-parallel case
    IF ( bc_lr == 'cyclic' )  THEN
       ar(:,nys:nyn,nxl-1) = ar(:,nys:nyn,nxr)
       ar(:,nys:nyn,nxr+1) = ar(:,nys:nyn,nxl)
    ENDIF

    IF ( bc_ns == 'cyclic' )  THEN
       ar(:,nys-1,:) = ar(:,nyn,:)
       ar(:,nyn+1,:) = ar(:,nys,:)
    ENDIF

#endif

    CALL cpu_log( log_point_s(2), 'exchange_horiz', 'stop' )

 END SUBROUTINE exchange_horiz
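The cyclic assignments in the serial and 1-D decomposition branches above copy the opposite interior edge into the ghost layer. A minimal plain-Python sketch of that idea (hypothetical layout with one ghost cell per side along one axis, not the PALM arrays):

```python
# Sketch of the cyclic lateral boundary update performed by
# exchange_horiz when no MPI transfer is needed: the ghost cells at
# the first and last index receive copies of the opposite interior
# edge.  Illustrative only.

def cyclic_exchange(field):
    """field[j][i] with ghost cells at i = 0 and i = len-1 (in place)."""
    for row in field:
        row[0]  = row[-2]   # left ghost  <- right interior edge
        row[-1] = row[1]    # right ghost <- left interior edge
    return field

f = [[0.0, 1.0, 2.0, 3.0, 0.0]]   # interior values 1..3, ghost cells 0
cyclic_exchange(f)
print(f)  # [[3.0, 1.0, 2.0, 3.0, 1.0]]
```

In the MPI branches the same copies happen between neighbouring subdomains via MPI_ISEND/MPI_IRECV instead of local assignments.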