memory usage of eofspatial

Added by Andy Aschwanden over 4 years ago

Hi,

I'm trying to calculate spatial EOFs on a file with the following dimensions:

% ncdump -h v.nc |head
netcdf v {
dimensions:
    x = 1681 ;
    y = 2881 ;

However, even on a large-memory node (1.5 TB RAM), I run into:

 % cdo -P 28 -v eofspatial,5 v.nc eof1.nc eof2.nc 
 OpenMP:  num_procs = 28  max_threads = 28
cdo eofspatial: Using CDO_SVD_MODE 'jacobi' from  default
cdo eofspatial: Using CDO_WEIGHT_MODE 'on' from  default
cdo eofspatial:  94%cdo eofspatial: Total area = 0.118304 steradians
cdo eofspatial: Total area = 0.118304
cdo eofspatial: Calculating 5 eigenvectors and 4842961 eigenvalues in grid_space
cdo eofspatial: Allocated eigenvalue/eigenvector structures with nts=0 gridsize=4842961

Error (EOFs) : Allocation of 24520864440200 bytes failed. [ line 399 file EOFs.c ]
System error message : Cannot allocate memory
HDF5-DIAG: Error detected in HDF5 (1.8.15) thread 0:
  #000: H5T.c line 1723 in H5Tclose(): not a datatype
    major: Invalid arguments to routine
    minor: Inappropriate type

Error (cdf_close) : NetCDF: HDF error

24520864440200 bytes is about 24,520 GB (roughly 24.5 TB). I am assuming it's the SVD that requires that much memory.

Are there any ways to reduce the memory usage? The 1681 x 2881 grid I'm using here is not the largest grid I'd like to use. This particular example used CDO version 1.8.1, but essentially the same error appears with 1.9.8.

Any suggestions?

Thanks.


Replies (1)

RE: memory usage of eofspatial - Added by Uwe Schulzweida about 4 years ago

There is no way to reduce the memory usage: eofspatial allocates a matrix of size gridsize*gridsize.
Use eoftime if your data has more than one timestep. It is much faster and much less memory intensive if the number of timesteps is smaller than the size of the grid.
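
For example, keeping the file names from the original post (a sketch only; the output prefix pc_ is a placeholder, and the exact output conventions of eoftime and eofcoeff are described in the CDO documentation):

 % cdo -P 28 eoftime,5 v.nc eof1.nc eof2.nc   # eof1.nc: eigenvalues, eof2.nc: EOF patterns
 % cdo eofcoeff eof2.nc v.nc pc_              # optional: principal component time series, one output file per EOF

With eoftime the covariance matrix is built in time space (nts x nts), so memory grows with the square of the number of timesteps rather than the square of the grid size.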
