Memory fault (coredump)

Added by Alex Farnsworth about 8 years ago

Hi there,

I am trying to use the CDO operators to convert a monthly time series into an annual time series using:

cdo mulc,12 -divdpy -yearavg -muldpm infile outfile
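
(As I understand it, this chain computes a day-weighted annual mean: -muldpm weights each month by its number of days, -yearavg averages the twelve weighted values, -divdpy divides by the days per year, and mulc,12 cancels the 1/12 from the averaging, so the result is roughly

sum(x_i * dpm_i) / dpy

for monthly values x_i.)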

This has worked many times in the past; however, I am now getting this error:

cdo mulc: Started child process "divdpy -yearavg -muldpm tdlua.temp_mm_dpth_5.monthly.nc (pipe1.1)".
cdo(2) divdpy: Started child process "yearavg -muldpm tdlua.temp_mm_dpth_5.monthly.nc (pipe2.1)".
cdo(3) yearavg: Started child process "muldpm tdlua.temp_mm_dpth_5.monthly.nc (pipe3.1)".
Memory fault(coredump)

We are currently using this build:

Climate Data Operators version 1.7.1rc2 (http://mpimet.mpg.de/cdo)
Compiled: by root on eocene.ggy.bris.ac.uk (x86_64-unknown-linux-gnu) Feb 24 2016 11:00:55
Compiler: gcc -std=gnu99 -g -O2 -fopenmp
version: gcc (GCC) 4.4.7 20120313 (Red Hat 4.4.7-16)
Features: DATA PTHREADS OpenMP HDF5 NC4/HDF5 OPeNDAP Z SSE2
Libraries: HDF5/1.8.15
Filetypes: srv ext ieg grb nc nc2 nc4 nc4c
CDI library version : 1.7.1rc1 of Feb 24 2016 11:00:44
CGRIBEX library version : 1.7.4 of Feb 3 2016 13:45:03
NetCDF library version : 4.3.3.1 of Jun 15 2015 16:30:04 $
HDF5 library version : 1.8.15
SERVICE library version : 1.4.0 of Feb 24 2016 11:00:42
EXTRA library version : 1.4.0 of Feb 24 2016 11:00:42
IEG library version : 1.4.0 of Feb 24 2016 11:00:42
FILE library version : 1.8.2 of Feb 24 2016 11:00:42

Any help would be greatly appreciated :)

Best regards,
Alex Farnsworth


Replies (6)

RE: Memory fault (coredump) - Added by Ralf Mueller about 8 years ago

Could you upload the input file?

RE: Memory fault (coredump) - Added by Alex Farnsworth about 8 years ago

Hi Ralf,

Thanks for the reply; please find the file attached. I suspect it is the file format, as this is a netCDF-4 (HDF5) file, whereas a classic netCDF file works fine.
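
(If it helps, the file kind can be checked with ncdump from the netCDF utilities, e.g.

ncdump -k tdlua.temp_mm_dpth_5.monthly.nc

which should print something like "netCDF-4" for an HDF5-based file and "classic" or "64-bit offset" otherwise.)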

Cheers,
Alex

RE: Memory fault (coredump) - Added by Ralf Mueller about 8 years ago

I can reproduce the segfault. The CDO call chain runs through if you add '-f nc2' to the command-line options.
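
For reference, applied to your original chain that would be something like:

cdo -f nc2 mulc,12 -divdpy -yearavg -muldpm infile outfile

('-f nc2' selects the netCDF version 2 / 64-bit-offset output format instead of netCDF-4/HDF5.)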

In some tests I got

Error (cdf_get_vara_float) : NetCDF: HDF error
HDF5: infinite loop closing library
      R,D,G,A,S,T,D,G,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S,S

Error (cdf_close) : NetCDF: HDF error

so there could be a problem within the library, but I have not tested that further yet.

The error seems to be related to writing the output file, because it does not occur if I use sinfov instead of writing a file.
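
(i.e. a read-only check along the lines of

cdo sinfov -mulc,12 -divdpy -yearavg -muldpm infile

completes without the segfault.)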

Anyhow, thanks for the report!

RE: Memory fault (coredump) - Added by Ralf Mueller about 8 years ago

gdb points to the HDF5 library

Program received signal SIGSEGV, Segmentation fault.
0x00007ffff3852d34 in H5SL_search () from /usr/lib/libhdf5.so.10

RE: Memory fault (coredump) - Added by Alex Farnsworth about 8 years ago

Hi Ralf and Uwe,

Thank you very much for the replies. Adding the '-L' flag works perfectly!
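
(For reference, '-L' is CDO's lock-IO option, which serialises the netCDF/HDF5 I/O; with the original chain the call looks something like

cdo -L mulc,12 -divdpy -yearavg -muldpm infile outfile

)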

We (Greg Tourte) are also going to try recompiling the HDF5 library with thread safety enabled, to work around this issue in the long run.

Thanks again!

Alex
