
Re: Problems creating a large array



----- Original Message -----
From: "Mark A Ohrenschall" <address@hidden>
To: <address@hidden>
Sent: Monday, July 08, 2002 6:51 PM
Subject: Problems creating a large array


> Hello,
>
> I'm trying to load a 21600 by 43200 array into netCDF -- I succeeded
> (barely) for a byte array, but am running out of memory for a short
> array. I'm using the Java API and am using the -Xms and -Xmx parameters
> to give (or try to give) the required memory to the Java VM:
>
> java -cp /home/mao/java:/home/mao/java/netcdf2.jar -Xms2048m -Xmx2048m grid2nc.globe
> Error occurred during initialization of VM
> Could not reserve enough space for object heap
>
> When I try smaller numbers I can start the VM but I then get an out of
> memory exception.
>
> How can I load such a large array into netCDF?
>
> Thanks in advance,
>
> Mark

NetCDF is really an API for out-of-memory storage, i.e., disk files. It lets
you move data efficiently between disk and memory. So instead of moving your
entire array into memory, you want to move just pieces of it. The art of this
kind of programming is to read just enough data to fit comfortably in memory,
operate on that piece as much as possible, and only then fetch the next piece,
along the lines of the sketch below.
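Here is a minimal sketch of writing your 21600 x 43200 short grid one band of
rows at a time, so only a band ever sits in memory. It assumes the
ucar.nc2.NetcdfFileWriteable class from the netCDF Java library (exact method
signatures vary between library versions), and the variable name "elev", the
band size, and the readSourceValue() helper are just illustrative placeholders
for however you get your source values:

import ucar.ma2.ArrayShort;
import ucar.ma2.DataType;
import ucar.ma2.InvalidRangeException;
import ucar.nc2.Dimension;
import ucar.nc2.NetcdfFileWriteable;

import java.io.IOException;

public class Grid2NC {
  public static void main(String[] args) throws IOException, InvalidRangeException {
    final int nlat = 21600, nlon = 43200;
    final int band = 600;   // rows held in memory at once: 600 * 43200 shorts, about 50 MB

    NetcdfFileWriteable ncfile = NetcdfFileWriteable.createNew("globe.nc", false);
    Dimension latDim = ncfile.addDimension("lat", nlat);
    Dimension lonDim = ncfile.addDimension("lon", nlon);
    ncfile.addVariable("elev", DataType.SHORT, new Dimension[] {latDim, lonDim});
    ncfile.create();   // leave define mode; the file is laid out on disk

    for (int row = 0; row < nlat; row += band) {
      int nrows = Math.min(band, nlat - row);
      ArrayShort.D2 data = new ArrayShort.D2(nrows, nlon);
      for (int i = 0; i < nrows; i++)
        for (int j = 0; j < nlon; j++)
          data.set(i, j, readSourceValue(row + i, j));   // hypothetical source reader

      ncfile.write("elev", new int[] {row, 0}, data);    // write this band at offset (row, 0)
    }
    ncfile.close();
  }

  // Placeholder for however the original grid values are obtained.
  private static short readSourceValue(int row, int col) {
    return 0;
  }
}

With this approach the -Xmx setting only needs to cover one band plus the
program's own overhead, not the whole 1.8 GB array.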

No matter how much internal memory you can afford, you will always have
files bigger than that, so you have to think in terms of subsetting the
array.
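Reading works the same way: ask for a subset with an origin and a shape rather
than the whole variable. A minimal sketch, again assuming the ucar.nc2 reading
classes, with "globe.nc" and "elev" matching the writing sketch above:

import ucar.ma2.Array;
import ucar.ma2.InvalidRangeException;
import ucar.nc2.NetcdfFile;
import ucar.nc2.Variable;

import java.io.IOException;

public class ReadBands {
  public static void main(String[] args) throws IOException, InvalidRangeException {
    NetcdfFile ncfile = NetcdfFile.open("globe.nc");
    Variable elev = ncfile.findVariable("elev");
    int nlat = elev.getShape()[0], nlon = elev.getShape()[1];
    int band = 600;

    for (int row = 0; row < nlat; row += band) {
      int nrows = Math.min(band, nlat - row);
      // Read only this band: origin (row, 0), shape (nrows, nlon).
      Array data = elev.read(new int[] {row, 0}, new int[] {nrows, nlon});
      // ... operate on this piece before reading the next one ...
    }
    ncfile.close();
  }
}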

If you absolutely must have it all in memory, then you have to buy more
memory. You can try various clever compression schemes, but these are not part
of NetCDF.