
Date:  Thu, 22 Apr 1999 21:19:43 +0900
From:  Tetsuya HIYAMA <hiyama@ihas.nagoya-u.ac.jp>
Subject:  [game-jp:0028] GCSS
To:  game-jp@ihas.nagoya-u.ac.jp
Message-Id:  <9904221220.AA02379@paddy.ihas.nagoya-u.ac.jp>
X-Mail-Count: 00028

Dear game-jp members, this is Hiyama (list administrator), Nagoya University.

The mail below was submitted by Prof. Fujiyoshi of Hokkaido University but did
not get distributed properly. I am posting it on his behalf.


**************************************** Below is Prof. Fujiyoshi's mail
   >
   > My apologies to those of you who receive multiple copies of this message.
   >
   > The GEWEX Cloud System Study (GCSS) carries out model
   > intercomparison experiments for a variety of observed
   > cases. Several of the results have already been reported
   > in Mon. Wea. Rev. and Bull. Amer. Meteor. Soc.
   >
   > A proposal for a new intercomparison experiment has now
   > arrived, so I attach it below. If you are interested,
   > please do take part.
   >
   > I am also told that the final version of a new cirrus
   > observation plan (CRYSTAL) is ready, so please have a look.
   >
   > GCSS-WG1:
   > The ARM shallow cumulus over land case is now ready for first
   > release. 
   > To obtain details either
   >                ftp
   >                connect email.meto.gov.uk   (login as "anonymous")
   >                cd pub/apr/arm
   > or URL:
   >                ftp://email.meto.gov.uk/pub/apr/arm
   >
   > The following files will be found
   > ARM_README.TEX         List / description of files in directory
   > CASE_31_MAR_99.PS      Case description
   > QSAT_CODE.FOR          Fortran for saturation mixing ratio calculation
   > BC_CODE.FOR            Fortran for obtaining surface stress
   > RESULTS_31_MAR_99.PS   Preliminary results (UKMO, MPI) for time series
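   >
   > (In case it is useful, the retrieval can be scripted. The short Python
   > sketch below uses the standard ftplib module with the anonymous login,
   > directory and file names given above; adjust the list if the directory
   > contents change.)
   >
   >     from ftplib import FTP
   >
   >     # ARM shallow cumulus case files, as listed above
   >     FILES = ["ARM_README.TEX", "CASE_31_MAR_99.PS", "QSAT_CODE.FOR",
   >              "BC_CODE.FOR", "RESULTS_31_MAR_99.PS"]
   >
   >     with FTP("email.meto.gov.uk") as ftp:
   >         ftp.login()                  # anonymous login
   >         ftp.cwd("pub/apr/arm")
   >         for name in FILES:
   >             with open(name, "wb") as fh:
   >                 ftp.retrbinary("RETR " + name, fh.write)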
   >
   > Please let me know of any problems you encounter, either with getting
   > and reading the files, or with the case itself.
   > Andy
   > ------------------------------------------------------------------
   > Andrew R. Brown
   > Met O(APR)
   > Meteorological Office, Room 258
   > London Road,
   > Bracknell,
   > Berkshire.
   > UK.    RG11 2SZ
   >
   > email: arbrown@meto.gov.uk 
   > fax  : +44 (0)1344 854493
   > phone: +44 (0)1344 856461
   > ------------------------------------------------------------------
   >
   >
   > DRAFT PROPOSAL FOR WG3 FASTEX IOP16 INTERCOMPARISON
   > ===================================================
   >
   > 1. OVERALL AIMS AND METHODOLOGY
   > The overall aim of the intercomparison will be to improve the
   > parameterisation of clouds and cloud processes in extratropical
   > cyclones in GCMs. The approach is to run model simulations
   > of the system at a number of resolutions and to compare the
   > representation of the clouds and cloud-related structures between
   > models, between resolutions, and with observations.
   >
   > In order to elucidate the reasons for differences in the representations
   > it will be important to understand differences between the cloud and
   > microphysics schemes in the various models. A fundamental question in
   > the representation of cloud is the behaviour of the cloud scheme. In the
   > case of diagnostic schemes, this amounts to how much cloud/cloud cover
   > is produced by a given q etc., and the property to be tested and
   > hopefully improved is how well each scheme is able to reproduce the correct
   > average cloud properties at different model resolutions. In the case of
   > prognostic schemes, where various source and sink terms represent the
   > rate of change of some chosen cloud properties, the aim is to
   > investigate both the structure of the formulation and the accuracy of
   > the model forcing terms (e.g. sub-gridscale fluxes). This implies that
   > the study should concentrate as much on the parametrizations of sub-grid
   > fluxes of heat, moisture etc. within low-resolution GCMs as on the
   > representation of cloud per se.
   >
   > This intercomparison should provide data for a direct test of various 
   > schemes at various resolutions. An additional tool in a second 
   > stage of the intercomparison will be one or more sensitivity tests
   > to various model cloud scheme/microphysical parameters, for example the
   > response of the cloud distribution to q etc., ice fall speed, and
   > vertical resolution.
   >
   > 2. INTRODUCTION TO IOP16
   > FASTEX IOP16 was a case of a rapidly moving, rapidly developing
   > frontal wave secondary cyclone. The system developed
   > from a trough into a well developed cyclone in the
   > period 0Z to 12Z on 17th Feb 1997. The aircraft measurements
   > (dropsonde, radar and aircraft level) were made in the period 
   > 6Z-12Z. A feature of this system is the multiple cloud heads which
   > emerge from under the cloud shield starting at around 6Z; a focus
   > of interest may be the source of the instabilities which lead to
   > the circulations that generate these cloud heads.
   > An important feature of this case, from the modelling point of
   > view, is the rapid speed of movement of the system, which is about
   > 30 m/s. This may impose a constraint on how long the system can
   > be modelled for, depending on the domain size. To some extent this
   > is, however, ameliorated by the rapidity of development, which shortens
   > the time span of interest. A second feature is that the flight strategy,
   > which aimed to map out different parts of the system, led,
   > fortuitously, to an approximately constant ground position for the centre
   > of the aircraft tracks, thereby making direct comparison with aircraft
   > data simpler.
   > General information about IOP16 and details of the various observations
   > taken can be found at http://www.met.rdg.ac.uk/FASTEX/i16sum.html 
   >
   > 3. LAM/CRM SPECIFICATIONS
   > The input data will be from runs of the UKMO Unified Model (UM)
   > at 0.105 deg (approx 12km), 45 level resolution. The grid for
   > this model was (positive longitude=east of Greenwich meridian):
   >
   > rotated lat/long grid
   > 394x247 points, 0.105x0.105 degree resolution
   > pole lat/long: 34.5N,150 degrees
   > top left gridpoint lat/long: 10.22,-23.24 on rotated grid
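   >
   > (For anyone who needs to locate the rotated gridpoints in true lat/long,
   > for example when defining an equivalent non-rotated grid as discussed
   > further below, the Python sketch that follows applies a standard
   > rotated-pole transformation. It assumes the usual convention that the
   > pole lat/long quoted above is the true position of the rotated north
   > pole and that the rotated origin lies on the meridian opposite the pole;
   > please check it against the model data posted on the web before relying
   > on it.)
   >
   >     import numpy as np
   >
   >     def rotated_to_true(lat_r, lon_r, pole_lat=34.5, pole_lon=150.0):
   >         """Convert rotated lat/lon (degrees) to true lat/lon (degrees).
   >
   >         Convention assumed: (pole_lat, pole_lon) is the true position of
   >         the rotated north pole, and the rotated (0, 0) origin lies at
   >         (90 - pole_lat) N on the meridian opposite the pole.
   >         """
   >         pl, pn = np.radians(pole_lat), np.radians(pole_lon)
   >         la, lo = np.radians(lat_r), np.radians(lon_r)
   >
   >         # unit vector of the point in the rotated frame
   >         v_rot = np.array([np.cos(la) * np.cos(lo),
   >                           np.cos(la) * np.sin(lo),
   >                           np.sin(la)])
   >
   >         # rows = rotated x, y, z axes expressed in true Cartesian coords
   >         m = np.array([
   >             [-np.sin(pl) * np.cos(pn), -np.sin(pl) * np.sin(pn), np.cos(pl)],
   >             [np.sin(pn),               -np.cos(pn),              0.0],
   >             [np.cos(pl) * np.cos(pn),   np.cos(pl) * np.sin(pn), np.sin(pl)]])
   >
   >         v_true = m.T @ v_rot
   >         lat = np.degrees(np.arcsin(v_true[2]))
   >         lon = np.degrees(np.arctan2(v_true[1], v_true[0]))
   >         return lat, lon
   >
   >     # under this convention the rotated origin maps to 55.5N, 30.0W
   >     print(rotated_to_true(0.0, 0.0))
   >     # top left gridpoint of the 12km domain
   >     print(rotated_to_true(10.22, -23.24))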
   >
   > This domain is the whole area illustrated in the figures, which also
   > show, in blue, the approximate C130 tracks. This model has been run
   > starting at 9Z on the 16th Feb from 50km operational model data. It
   > was run forward from this time using operational 50km analysis
   > boundaries and including four 3-hourly assimilation cycles (12, 15,
   > 18, 21Z on the 16th).
   > Some of the basic fields from this run are posted on the web for
   > reference at http://www.met.rdg.ac.uk/~sws97hwl/GCSS/index.html 
   > Model dumps will be provided at 0Z 17th and onwards at 3hrly intervals 
   > in order to provide starting and boundary data. The data will be 
   > provided on a 0.21 degree (approx 24 km) grid, which may be reduced by 
   > one grid box to allow for grid staggering.
   > The GCSS simulations will include nested simulations at four different
   > (approximate) resolutions: 24km, 12km, 4km and 2km.
   > The 24km simulation should cover the time period 0-12Z.
   > The suggested 24km grid is shown in purple on the figures and is:
   >
   > rotated lat/long grid
   > 155x109 points,  0.21x0.21 degree resolution
   > pole lat/long: 34.5N,150 degrees
   > top left gridpoint lat/long: 8.77,-16.0 on rotated grid.
   >
   > If it is desired to run on a non-rotated or other type of grid, a
   > similar grid may be defined, subject to the system track from
   > 0-12Z being covered, a suitable surrounding area being included (in
   > particular the cloud head at 12Z), and, of course, the
   > whole grid lying inside the area of the supplied data. The model
   > data posted on the web should help in defining such grids.
   >
   > The suggested 12km grid, for those running on a rotated grid, is:
   > rotated lat/long grid
   > 150x150 points,  0.105x0.105 degree resolution
   > pole lat/long: 34.5N,150 degrees
   > top left gridpoint lat/long: 7.0,-2.0 on rotated grid.
   > This is shown in red on the figures. 
   > Once again alternative grids may be used. In this regard it should 
   > be noted that the FASTEX-CSS simulations are already underway, using 
   > ARPEGE re-analysis data at 0.4 degree resolution to drive a simulation 
   > on the following grid:
   > NON-rotated lat/long grid
   > 150x150 points,  0.1 (latitude) x 0.17 (longitude) degree resolution
   > top left gridpoint lat/long: 60.55,-25.835
   >
   > This is shown on the figures as the outermost boundary of the green
   > area. 
   > The simulation data will also be averaged to 1 (latitude) x 1.7
   > (longitude) degree (approx 100 km) resolution. These 1 degree boxes
   > are shown in the figures. Unfortunately it will not be possible to run
   > with this grid, since the current data do not cover it all (as can be
   > seen from the figures), but a similar one could be used.
   >
   > The 4km and 2km CRM runs should be centred on the position of the
   > aircraft tracks, 55N, 16.4W, with the axes aligned to LOCAL E-W, N-S.
   > The time range of these runs should be at least 6-12Z (possibly starting
   > earlier). The 2km run will provide an approx 300x300km domain which 
   > will correspond to roughly the central light blue box in the figures. 
   > It should be noted that this area is very small when compared to the 
   > speed of motion of the system (it takes the system only about 2 hours 
   > to travel from one corner to the opposite one). In the light of
   > this, is it worth carrying out 2km simulations at all unless larger
   > domains can be used?
   >
   >
   > 4. GCM SPECIFICATIONS (needs to be added)
   > It should be noted that there may be problems in obtaining a good
   > representation of the system with global climate resolution models.
   > I attempted to model the system starting from both ECMWF and 
   > UKMO global data and found that it did not develop satisfactorily.
   > What do people think about the need for this?
   >
   > 5. SCM SPECIFICATIONS (needs to be added)
   >
   > Suggest SCM simulations of the innermost, 300 km domain, driven by
   > tendencies from a GCM at the grid centre. We can provide tendencies, or
   > modellers can calculate their own. What do people need?
   >
   > 6. OUTPUT FIELDS
   >
   > (a) Box average intercomparisons.
   > It is proposed that, whatever resolutions are run, the data analysis
   > concentrates on the approx 300x300 km box almost centred on the C-130
   > track: this should (approximately) be the domain of the 2 km CRM 
   > runs, centred at 55N, 16.4W, with the axes aligned to LOCAL E-W, N-S.
   > All models should provide average data for this gridbox (covering 
   > the area 53.5N-56.5N,18.95W-13.85W). The larger scale models should 
   > also provide data for the 8 surrounding boxes (shown light blue), 
   > interpolated (if necessary) to a true 3 (latitude) x 5.1 (longitude)
   > degree grid box. This will enable 'large scale' gradients between boxes
   > to be assessed, along with variability. (For the 24 and 12 km models, 
   > it would obviously be preferable to interpolate first to a fine, 
   > 0.1x0.17 degree non-rotated lat/long grid, then average up.) In
   > averaging from high to low resolution, relative gridbox area should
   > be taken into account by weighting by cos(latitude); a short sketch of
   > this averaging is given after the box definitions below. The exact
   > specifications of the boxes are that they are bounded by the following
   > latitudes and longitudes:
   >
   > 59.5N,56.5N,53.5N,50.5N
   > 24.05W,18.95W,13.85W,8.75W
   >
   > Thus, the top left box, box 1, covers 56.5N-59.5N, 24.05W-18.95W;
   > the bottom right box, box 9, covers 50.5N-53.5N, 13.85W-8.75W.
   >
   > Where any numbering is required, the boxes are numbered:
   > 1 2 3
   > 4 5 6
   > 7 8 9
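   >
   > (The sketch referred to above: a minimal Python illustration of the
   > cos(latitude) area weighting, assuming the field has already been
   > interpolated to the fine 0.1 x 0.17 degree non-rotated grid. The array
   > and variable names are illustrative only.)
   >
   >     import numpy as np
   >
   >     def box_average(field, lats):
   >         """Area-weighted mean of one 3 x 5.1 degree box.
   >
   >         field : 2D array (nlat, nlon) on the fine 0.1 x 0.17 degree grid,
   >                 covering exactly one box
   >         lats  : 1D array (nlat) of gridpoint latitudes in degrees
   >
   >         Cells at higher latitude cover less area, so each row is
   >         weighted by cos(latitude) before averaging.
   >         """
   >         w = np.cos(np.radians(lats))[:, np.newaxis]     # (nlat, 1) weights
   >         return float((field * w).sum() / (w.sum() * field.shape[1]))
   >
   >     # Example: box 5 (53.5N-56.5N, 18.95W-13.85W) holds 30 x 30 fine cells
   >     # with cell-centre latitudes 53.55, 53.65, ..., 56.45
   >     lats_box5 = 53.55 + 0.1 * np.arange(30)
   >     dummy = np.ones((30, 30))                           # placeholder field
   >     print(box_average(dummy, lats_box5))                # -> 1.0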
   >
   > Include basic eddy fluxes of heat, moisture, momentum, Q1, Q2, w/omega,
   > q, t, u, v, pmsl, precipitation rate + microphysical quantities. Larger
   > scale (i.e. non-CRM) models should include parametrized fluxes 
   > (primarily convection and BL schemes) as separate outputs.
   > Since the box averaged data will be relatively small it should be
   > acceptable to exchange it as ASCII. The ordering (boxes, levels,
   > variables, times) must be made clear. The pressure or geopotential 
   > heights of the levels used will need to be provided.
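   >
   > (To show the sort of thing meant by making the ordering clear, here is
   > one possible, purely illustrative, self-describing ASCII layout in
   > Python; the level set, variable list and file name are placeholders,
   > not a required format.)
   >
   >     # order: time, box, level; one line of values per (time, box, level)
   >     boxes     = range(1, 10)                   # boxes 1..9 as numbered above
   >     levels    = [1000.0 - 25.0 * k for k in range(37)]  # example levels (hPa)
   >     times     = range(0, 13, 3)                # 0Z, 3Z, ..., 12Z on the 17th
   >     variables = ["T", "q", "u", "v", "w"]      # illustrative subset
   >
   >     with open("box_averages.txt", "w") as fh:
   >         fh.write("# order: time(Z), box, level(hPa); columns: "
   >                  + " ".join(variables) + "\n")
   >         for t in times:
   >             for b in boxes:
   >                 for p in levels:
   >                     values = [0.0] * len(variables)     # placeholder data
   >                     fh.write(f"{t:2d} {b:d} {p:6.1f} "
   >                              + " ".join(f"{v:12.5e}" for v in values) + "\n")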
   >
   > For definitions see http://www.tor.ec.gc.ca/GEWEX/GCSS/largesc.htm. 
   > Add any parameters required by (or predicted by) cloud schemes.
   >
   > (b) Gridded Data
   > It should be noted that we expect that, at least initially,
   > most of the quantitative work on the data will be carried out on
   > the box averaged data discussed in the last section. The gridded
   > data will be required in order to compare the evolution of fields
   > between different models. In order to facilitate quick comparisons
   > (eg overlaying fields) we would like to specify a 90x90 grid
   > which corresponds to the area of the 9 blue squares in the figures 
   > for participants to interpolate their gridded data onto. The 
   > specifications of this grid are:
   > non-rotated lat/long grid
   > resolution 0.17 (E-W) x 0.1 (N-S) degree
   > Top left point is 59.5-0.1/2=59.45N, 24.05-0.17/2=23.965W
   > Data output from top left (NW) point first, scanning across W-E then
   > N-S.
   > Data output as 32 bit IEEE reals, each 2D field preceded by a 64 word
   > header. Format and code for writing to be supplied by UKMO.
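   >
   > (Purely as an illustration of the intended layout, while the definitive
   > header format and writing code are awaited from UKMO, the Python sketch
   > below constructs the 90x90 grid coordinates and writes one 2D field as
   > 32 bit IEEE reals preceded by a placeholder 64 word header; 32 bit words
   > and big-endian byte order are assumptions here.)
   >
   >     import numpy as np
   >
   >     # 90x90 output grid: cell centres of the 9-box area.
   >     # lats/lons give the true coordinates of each output row/column
   >     # (negative longitude = degrees west).
   >     NLAT, NLON = 90, 90
   >     lats = 59.45 - 0.1 * np.arange(NLAT)     # 59.45N southwards, 0.1 deg steps
   >     lons = -23.965 + 0.17 * np.arange(NLON)  # 23.965W eastwards, 0.17 deg steps
   >
   >     def write_field(fh, field, header=None):
   >         """Append one 2D field (NLAT x NLON) to an open binary file.
   >
   >         Each field is preceded by a 64 word header; 32 bit words and
   >         big-endian IEEE byte order are assumed here. Rows are ordered
   >         N to S and columns W to E, so a C-ordered write scans W-E then
   >         N-S from the NW point, as specified above.
   >         """
   >         if header is None:
   >             header = np.zeros(64, dtype=">i4")   # placeholder header
   >         np.asarray(header, dtype=">i4").tofile(fh)
   >         np.asarray(field, dtype=">f4").tofile(fh)
   >
   >     # usage: write a dummy pmsl field of 101325 Pa everywhere
   >     with open("pmsl_gridded.dat", "wb") as fh:
   >         write_field(fh, np.full((NLAT, NLON), 101325.0))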
   >
   > The details of the variables required are:
   > 3hrly output
   > Single level: pmsl, T*, surface sensible/latent heat fluxes. 
   > Multiple level (25mb resolution): z, T, u, v, q - also microphysical
   > quantities: qcl, qcf, rain (+ snow if separate variable), graupel,
   > condensation/evap of water/ice/rain/snow, autoconversion of
   > qcl into rain, autoconversion of qcf into snow, collection of
   > qcl by rain, collection of qcf, qcl by snow, Bergeron-Findeisen
   > mechanism, melting/freezing of cloud condensate, melting/freezing
   > of precipitating particles.
   >
   > Validating observations will be C-130/P3 aircraft data (including
   > dropsondes, radar, flight level microphysics), possibly upsondes
   > from coastal stations. 
   >
   > In addition to the above gridded data it would be helpful if 
   > participants could provide us with one or two basic fields (pmsl,
   > others??) 3hrly over the full area of their domains in the form of
   > image files (gif etc.) to give an initial approximate idea of the
   > evolution in each model.
   >
   > (c) ISCCP
   > cloud fraction (plus separate ice and water fractions where possible)
   > cloud top pressure 
   > column cloud optical depth
   >
   > 7. MICROPHYSICS TESTS
   > These are to be carried out after the initial runs, AFTER we have
   > ascertained exactly what microphysics is in each model and, with that
   > knowledge, have come to a decision as to what is sensible.
   > Suggestions so far include sensitivity to ice fall speed, lambda
   > and vertical resolution. Obviously the parameter space is very large,
   > so we need a definite strategy for this before embarking.
   >
   >
   > FIGURES
   > The figures show IR images as background over the area of the UKMO
   > 12km simulation at 6Z and 12Z on 17th Feb 1997, with grids marked as
   > described in the text. They are attached as gzipped postscript. (If you
   > have any trouble with the attachments they are also on the web at the
   > URL mentioned in section 3.)
   >
   > -- 
   > Humphrey W Lean,
   > Mesoscale Modelling Group,                         
   > Room 2L62, Joint Centre for Mesoscale Meteorology, 
   > Department of Meteorology, University of Reading,
   > PO Box 243, Reading RG6 6BB, UK.               
   >
   > Tel: 0118 931 6624 (University ext 6624, GTN 1441 6624)
   > Fax: 0118 931 8791 email:  hwlean@meto.gov.uk
   >
   >
   > [Attachments: FASTEX_GCSS06.ps.gz, FASTEX_GCSS12.ps.gz]
   >
   >
   > *********************************
   > The final version of the CRYSTAL Research Plan has been released on the
   > FIRE website (http://asd-www.larc.nasa.gov/fire/index.html).
   >
   > NASA will soon issue an NRA (solicitation for proposals) to form a CRYSTAL
   > Science Team. 
   >
   > =========================================================================
   > = David O'C. Starr                   email: starr@climate.gsfc.nasa.gov =
   > = Code 913                          express delivery: Bldg 33, Rm C-308 =
   > = NASA Goddard Space Flight Center                  voice: 301-614-6191 =
   > = Greenbelt, MD  20771                                fax: 301-614-6307 =
   > =========================================================================
   
   
    ***************************************
    Yasushi Fujiyoshi
    Division of Cold Ocean Sciences
    Institute of Low Temperature Science, Hokkaido University
    Kita-19 Nishi-8, Kita-ku, Sapporo 060-0819, Japan
    Tel. & Fax (1): 011-706-5491
    Fax (2):        011-706-7142
    e-mail: fujiyo@lowtem.hokudai.ac.jp
    http://stellar.lowtem.hokudai.ac.jp/
    ***************************************

************************************************************************
Tetsuya Hiyama    Institute for Hydrospheric-Atmospheric Sciences, Nagoya University
                  Furo-cho, Chikusa-ku, Nagoya 464-8601, Japan
                  TEL: 052-789-3478    FAX: 052-789-3436
                  e-mail: hiyama@ihas.nagoya-u.ac.jp
************************************************************************