
Why is my CDO so slow when processing netCDF on Cygwin?

Added by Dill Land over 2 years ago

Hello,

I am using CDO to process CMIP5 future climate data. For example, I want to delete the year 1990 from macav2metdata_pr_CanESM2_r1i1p1_historical_1990_1994_CONUS_daily.NC. This single operation took almost an hour. However, any further CDO operation on the derived file (i.e. after 1990 has been removed) is very fast. Could anyone offer some suggestions? I am working with a huge amount of data; if one command costs an hour, I will need several months to finish all the processing.

Thanks,

Dill


Replies (7)

RE: Why is my CDO so slow when processing netCDF on Cygwin? - Added by Dill Land over 2 years ago

I only learned to use CDO recently. One thing worth mentioning: the .nc file after deleting 1990 is much bigger than the original, so I suspect the original is compressed.

Thanks,

Dill

RE: Why is my CDO so slow when processing netCDF on Cygwin? - Added by Uwe Schulzweida over 2 years ago

Hello Dill,

An hour for this command already seems very long to me. It could be due to the chunking used for the compression. CDO reads the data timestep by timestep, and if the data are chunked across timesteps, CDO's performance suffers badly. To confirm this, please post the output of:

ncdump -h -s macav2metdata_pr_CanESM2_r1i1p1_historical_1990_1994_CONUS_daily.NC

With CDO itself this cannot be accelerated any further. The ncks tool from NCO may help: it lets you select all timesteps before and after the year that should be removed:

ncks -d time,min1,max1 -d time,min2,max2 ifile ofile

The numbering of the timesteps starts at 0.
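For this particular file the year to drop is the first one, so a single hyperslab is enough. A minimal sketch of the index arithmetic, assuming daily data on a 365-day (noleap) calendar starting 1990-01-01 — verify the indices against `cdo showdate` or `ncdump -v time` before relying on them; the actual `ncks` call is shown as a comment for that reason:

```shell
# Index arithmetic for dropping the year 1990 (assumed 365-day calendar,
# daily data starting 1990-01-01; ncks numbers timesteps from 0).
days_per_year=365
first_1991=$(( days_per_year ))   # 1990 occupies timesteps 0..364
echo "keep timesteps ${first_1991} to the end"

# Hypothetical invocation -- an empty max in ncks means "through the last
# record", so this keeps everything from 1991 onward:
#   ncks -d time,${first_1991}, infile.nc outfile.nc
```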

Cheers,
Uwe

RE: Why is my CDO so slow when processing netCDF on Cygwin? - Added by Uwe Schulzweida over 2 years ago

Yes, all compressed data is decompressed by CDO. With the -z zip option, the data is compressed again when it is written out:

cdo -z zip delete,year=1990 infile outfile

RE: Why is my CDO so slow when processing netCDF on Cygwin? - Added by Dill Land over 2 years ago

Hi Uwe,

Thanks for your reply. Please see the outputs in the attachment: one is for macav2metdata_pr_CanESM2_r1i1p1_historical_1990_1994_CONUS_daily.NC; the other is for the .nc file after removing 1990.

I will check if ncks works in this case.

Thanks a lot.

Dill

RE: Why is my CDO so slow when processing netCDF on Cygwin? - Added by Dill Land over 2 years ago

Hi Uwe,

The thing is that every CDO operation is quite slow on these .nc files. As I mentioned, further CDO operations based on the already-processed files are very fast.

Do you have any suggestions?

Thanks,

Dill

RE: Why is my CDO so slow when processing netCDF on Cygwin? - Added by Uwe Schulzweida over 2 years ago

Hi Dill,

Yes, it is as I feared. The data are chunked across the timesteps:

     precipitation:_ChunkSizes = 162, 51, 123 ;

That means that to read a single timestep, CDO has to decompress a chunk of 162 timesteps, and this happens for every timestep. This is the worst case for CDO's performance.
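To put a rough number on that — a back-of-the-envelope sketch only; the real cost also depends on the spatial chunk sizes and the HDF5 chunk cache:

```shell
# Read amplification for time-chunked data: each single-timestep read
# forces decompression of the whole chunk it sits in (illustrative
# arithmetic, using the _ChunkSizes value shown above).
chunk_len=162                 # chunk size along the time dimension
timesteps_per_read=1          # CDO reads one timestep at a time
amplification=$(( chunk_len / timesteps_per_read ))
echo "each read decompresses ~${amplification}x the requested data"

# A one-time rechunk with a time chunk of 1 would remove this overhead
# for all later CDO runs. Hypothetical example (lat/lon dimension names
# are assumed -- check them with `ncdump -h` first):
#   nccopy -d 1 -c time/1,lat/51,lon/123 infile.nc rechunked.nc
```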
Try it with ncks and report the result.

Cheers,
Uwe

RE: Why is my CDO so slow when processing netCDF on Cygwin? - Added by Dill Land over 2 years ago

Yeah, NCO seemed much faster and took only a few minutes. I will use it to process the data.

Thanks,

Dill
