
20000927: max grids in a gempak grid file



>From: David Wojtowicz <address@hidden>
>Organization: UCAR/Unidata
>Keywords: 200009272023.e8RKN8b01440

>
>
>Hi,
>
> Is there any way to increase the maximum number of grids allowed
>in a GEMPAK grid file?  There seems to be a limit of 9999 imposed.
>Putting the entire ETA212 run from CONDUIT into a single grid file
>requires >10000.   We also write MM5 model output into GEMPAK grid
>files for visualization with GEMPAK/Garp.   We could split into
>multiple files, but then Garp is not happy.  Any advice would be
>appreciated.
>
> Thanks.
>
>--------------------------------------------------------
> David Wojtowicz, Sr. Research Programmer/Systems Manager
> Department of Atmospheric Sciences Computer Services
> University of Illinois at Urbana-Champaign
> email: address@hidden  phone: (217)333-8390
>--------------------------------------------------------
>
>
>

David,

The maximum number of headers in a GEMPAK file determines how many
grids can be stored in the file. I have set this limit to 30,000 for
the next GEMPAK release.

You can increase this parameter, MMHDRS, in $GEMPAKHOME/include/gemprm.h
and $GEMPAKHOME/include/gemprm.$NA_OS. Whenever you change values in
these header files, you must do a complete rebuild:

cd $GEMLIB
rm *

cd $NAWIPS
make clean
make all
make install
make clean


Removing all files in the library directory is necessary because the
array sizes used by the routines are fixed at compile time from MMHDRS.
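
For reference, the edit ends up looking roughly like the sketch below.
The values are only illustrative (they are the numbers discussed in this
message), and the exact spelling of the definitions differs between
releases and between the C and Fortran includes, so check your own
gemprm.h and gemprm.$NA_OS rather than pasting this in:

/* sketch of the relevant gemprm.h lines -- illustrative, not a drop-in */
#define LLMXTM      300                     /* max number of times       */
#define LLSTFL    29700                     /* max stations in a file    */
#define MMHDRS   ( LLSTFL + LLMXTM )        /* max headers per file      */

The Fortran include gemprm.$NA_OS carries the same numbers as PARAMETER
statements, and the two files must be kept in agreement.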

Another constraint to keep in mind is that LLMXTM + LLSTFL <= MMHDRS.
The defaults for GEMPAK 5.4 were 200 times, 9,800 stations, and 10,000
headers (10,000 ship and grid entries). For the next release, I have set
these to 300, 29,700, and 30,000 respectively. The downside of increasing
these parameters is more memory used by statically allocated Fortran
arrays. When I chose the old values in 1997, most sites were constrained
by 16MB of RAM. Hopefully, nobody will expect to run the next release
with 16MB.
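
If you want to double-check the arithmetic after editing, a throwaway
compile-time check like the one below will refuse to compile whenever
LLMXTM + LLSTFL exceeds MMHDRS. It is not part of GEMPAK; the values
are the illustrative ones from this message, so substitute whatever you
put in your own gemprm.h:

/* check_gemprm.c -- throwaway sanity check, not part of GEMPAK */
#define LLMXTM    300
#define LLSTFL  29700
#define MMHDRS  30000

/* array size goes negative, so compilation fails, if the constraint
   LLMXTM + LLSTFL <= MMHDRS is violated */
static char constraint_ok[(LLMXTM + LLSTFL <= MMHDRS) ? 1 : -1];

int main(void)
{
    (void) constraint_ok;   /* quiet "unused variable" warnings */
    return 0;
}

Compiling it with "cc check_gemprm.c" is enough; if it compiles, the
three values are consistent.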

Steve Chiswell