How to merge 2 netcdf4 files into one netcdf4 file using CDO?

Added by Behzad Navidi over 4 years ago

Hey all,
I want to merge two NetCDF4 files using CDO. I tried the command below:
cdo merge precip.1982.nc precip.1983.nc precip_all.nc
But unfortunately it gives me the following error:
open failed on >precip.1982.nc
unsupported file type (library support not compiled in)
CDO was built with a NetCDF version which doesn't support netcdf4 data!

Please help me with this issue :( How can I merge these two files?

Best Regards
Behzad Navidi
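
A side note on the operator choice, once CDO can read the files: cdo merge combines datasets that hold different variables, whereas two yearly files of the same variable are usually concatenated along the time axis with cdo mergetime. A minimal sketch, assuming that is the intended result:

# concatenate the 1982 and 1983 daily fields along the time axis
cdo mergetime precip.1982.nc precip.1983.nc precip_all.nc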


Replies (10)

RE: How to merge 2 netcdf4 files into one netcdf4 file using CDO? - Added by Karin Meier-Fleischer over 4 years ago

Hi Behzad,

Without the files it's not possible to help. Can you upload the files or give us the output of 'ncdump -h' for both files?

-Karin

RE: How to merge 2 netcdf4 files into one netcdf4 file using CDO? - Added by Behzad Navidi over 4 years ago

Dear Karin
Here is the result of ncdump -h for the first file:

behzad@behzad-x44h:~/Desktop$ ncdump -h precip.1982.nc
netcdf precip.1982 {
dimensions:
    lat = 360 ;
    lon = 720 ;
    time = UNLIMITED ; // (365 currently)
variables:
    float lat(lat) ;
        lat:actual_range = 89.75f, -89.75f ;
        lat:long_name = "Latitude" ;
        lat:units = "degrees_north" ;
        lat:axis = "Y" ;
        lat:standard_name = "latitude" ;
        lat:coordinate_defines = "center" ;
    float lon(lon) ;
        lon:long_name = "Longitude" ;
        lon:units = "degrees_east" ;
        lon:axis = "X" ;
        lon:standard_name = "longitude" ;
        lon:actual_range = 0.25f, 359.75f ;
        lon:coordinate_defines = "center" ;
    double time(time) ;
        time:long_name = "Time" ;
        time:axis = "T" ;
        time:standard_name = "time" ;
        time:coordinate_defines = "start" ;
        time:actual_range = 718800., 727536. ;
        time:delta_t = "0000-00-01 00:00:00" ;
        time:avg_period = "0000-00-01 00:00:00" ;
        time:units = "hours since 1900-01-01 00:00:00" ;
    float precip(time, lat, lon) ;
        precip:missing_value = -9.96921e+36f ;
        precip:var_desc = "Precipitation" ;
        precip:level_desc = "Surface" ;
        precip:statistic = "Total" ;
        precip:parent_stat = "Other" ;
        precip:long_name = "Daily total of precipitation" ;
        precip:cell_methods = "time: sum" ;
        precip:avg_period = "0000-00-01 00:00:00" ;
        precip:actual_range = 0.f, 816.6611f ;
        precip:units = "mm" ;
        precip:valid_range = 0.f, 1000.f ;
        precip:dataset = "CPC Global Precipitation" ;

// global attributes:
        :Conventions = "CF-1.0" ;
        :version = "V1.0" ;
        :history = "created 9/2016 by CAS NOAA/ESRL PSD" ;
        :title = "CPC GLOBAL PRCP V1.0" ;
        :References = "https://www.esrl.noaa.gov/psd/data/gridded/data.cpc.globalprecip.html" ;
        :dataset_title = "CPC GLOBAL PRCP V1.0" ;
        :Source = "ftp://ftp.cpc.ncep.noaa.gov/precip/CPC_UNI_PRCP/" ;
}

And here is the result for the second one:

behzad@behzad-x44h:~/Desktop$ ncdump -h precip.1983.nc
netcdf precip.1983 {
dimensions:
    lat = 360 ;
    lon = 720 ;
    time = UNLIMITED ; // (365 currently)
variables:
    float lat(lat) ;
        lat:actual_range = 89.75f, -89.75f ;
        lat:long_name = "Latitude" ;
        lat:units = "degrees_north" ;
        lat:axis = "Y" ;
        lat:standard_name = "latitude" ;
        lat:coordinate_defines = "center" ;
    float lon(lon) ;
        lon:long_name = "Longitude" ;
        lon:units = "degrees_east" ;
        lon:axis = "X" ;
        lon:standard_name = "longitude" ;
        lon:actual_range = 0.25f, 359.75f ;
        lon:coordinate_defines = "center" ;
    double time(time) ;
        time:long_name = "Time" ;
        time:axis = "T" ;
        time:standard_name = "time" ;
        time:coordinate_defines = "start" ;
        time:actual_range = 727560., 736296. ;
        time:delta_t = "0000-00-01 00:00:00" ;
        time:avg_period = "0000-00-01 00:00:00" ;
        time:units = "hours since 1900-01-01 00:00:00" ;
    float precip(time, lat, lon) ;
        precip:missing_value = -9.96921e+36f ;
        precip:var_desc = "Precipitation" ;
        precip:level_desc = "Surface" ;
        precip:statistic = "Total" ;
        precip:parent_stat = "Other" ;
        precip:long_name = "Daily total of precipitation" ;
        precip:cell_methods = "time: sum" ;
        precip:avg_period = "0000-00-01 00:00:00" ;
        precip:actual_range = 0.f, 1302.464f ;
        precip:units = "mm" ;
        precip:valid_range = 0.f, 1000.f ;
        precip:dataset = "CPC Global Precipitation" ;

// global attributes:
        :Conventions = "CF-1.0" ;
        :version = "V1.0" ;
        :history = "created 9/2016 by CAS NOAA/ESRL PSD" ;
        :title = "CPC GLOBAL PRCP V1.0" ;
        :References = "https://www.esrl.noaa.gov/psd/data/gridded/data.cpc.globalprecip.html" ;
        :dataset_title = "CPC GLOBAL PRCP V1.0" ;
        :Source = "ftp://ftp.cpc.ncep.noaa.gov/precip/CPC_UNI_PRCP/" ;
}

Thank You
Behzad

RE: How to merge 2 netcdf4 files into one netcdf4 file using CDO? - Added by Karin Meier-Fleischer over 4 years ago

Can you send the output of 'cdo sinfo' for your files?

The lat units should be degrees_south.

RE: How to merge 2 netcdf4 files into one netcdf4 file using CDO? - Added by Behzad Navidi over 4 years ago

For both files, I got this result from cdo sinfo:

behzad@behzad-x44h:~/Desktop$ cdo sinfo precip.1982.nc
cdo    sinfo: Open failed on >precip.1982.nc<
              Unsupported file type (library support not compiled in)
To create a CDO application with NetCDF4 support use: ./configure --with-netcdf=<NetCDF4 root directory> ...
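
As an aside, one way to check which file types a given CDO binary supports is cdo -V, which prints the linked library versions and a list of supported file types; a build with NetCDF4 support should list nc4 there. A minimal check, assuming cdo is on the PATH:

# print CDO version, linked libraries and supported file types
cdo -V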

:(
Do you think I can solve my problem using the recommendation by Jim Maas in this thread:
https://code.mpimet.mpg.de/boards/1/topics/3826
He recommends uninstalling all versions of CDO, NetCDF and HDF5,

then installing the NetCDF and HDF5 libraries from the Ubuntu repositories:
sudo apt-get install libnetcdf-dev libhdf5-dev

then downloading cdo-1.7.0 and configuring it with:
./configure --enable-netcdf4 --enable-zlib --with-netcdf=/usr/ --with-hdf5=/usr/
make
sudo make install

Do you think my problem can be solved if I follow this?
Thank you

RE: How to merge 2 netcdf4 files into one netcdf4 file using CDO? - Added by Behzad Navidi over 4 years ago

I uninstalled CDO and reinstalled it with NetCDF4 support. Then I tried to merge my two files again, but it gives me the following error:

behzad@behzad-x44h:~/Desktop$ cdo merge precip.1982.nc precip.1983.nc precip_all.nc
Warning! ***HDF5 library version mismatched error***
The HDF5 header files used to compile this application do not match
the version used by the HDF5 library to which this application is linked.
Data corruption or segmentation faults may occur if the application continues.
This can happen when an application was compiled by one version of HDF5 but
linked with a different version of static or shared HDF5 library.
You should recompile the application or check your shared library related
settings such as 'LD_LIBRARY_PATH'.
You can, at your own risk, disable this warning by setting the environment
variable 'HDF5_DISABLE_VERSION_CHECK' to a value of '1'.
Setting it to 2 or higher will suppress the warning messages totally.
Headers are 1.10.0, library is 1.10.5
        SUMMARY OF THE HDF5 CONFIGURATION
        =================================

General Information:
-------------------
                   HDF5 Version: 1.10.5
                  Configured on: Thu Oct 24 21:08:24 +0330 2019
                  Configured by: behzad@behzad-x44h
                    Host system: x86_64-unknown-linux-gnu
              Uname information: Linux behzad-x44h 5.0.0-32-generic #34~18.04.2-Ubuntu SMP Thu Oct 10 10:36:02 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux
                       Byte sex: little-endian
             Installation point: /usr/local

Compiling Options:
------------------
                     Build Mode: production
              Debugging Symbols: no
                        Asserts: no
                      Profiling: no
             Optimization Level: high

Linking Options:
----------------
                      Libraries: static, shared
  Statically Linked Executables: 
                        LDFLAGS: 
                     H5_LDFLAGS: 
                     AM_LDFLAGS: 
                Extra libraries: -lz -ldl -lm 
                       Archiver: ar
                       AR_FLAGS: cr
                         Ranlib: ranlib

Languages:
----------
                              C: yes
                     C Compiler: /usr/bin/gcc
                       CPPFLAGS: 
                    H5_CPPFLAGS: -D_GNU_SOURCE -D_POSIX_C_SOURCE=200809L   -DNDEBUG -UH5_DEBUG_API
                    AM_CPPFLAGS: 
                        C Flags: 
                     H5 C Flags:  -std=c99  -pedantic -Wall -Wextra -Wbad-function-cast -Wc++-compat -Wcast-align -Wcast-qual -Wconversion -Wdeclaration-after-statement -Wdisabled-optimization -Wfloat-equal -Wformat=2 -Winit-self -Winvalid-pch -Wmissing-declarations -Wmissing-include-dirs -Wmissing-prototypes -Wnested-externs -Wold-style-definition -Wpacked -Wpointer-arith -Wredundant-decls -Wshadow -Wstrict-prototypes -Wswitch-default -Wswitch-enum -Wundef -Wunused-macros -Wunsafe-loop-optimizations -Wwrite-strings -finline-functions -s -Wno-inline -Wno-aggregate-return -Wno-missing-format-attribute -Wno-missing-noreturn -O
                     AM C Flags: 
               Shared C Library: yes
               Static C Library: yes

                        Fortran: no

                            C++: no

                           Java: no

Features:
---------
                   Parallel HDF5: no
Parallel Filtered Dataset Writes: no
              Large Parallel I/O: no
              High-level library: yes
                    Threadsafety: no
             Default API mapping: v110
  With deprecated public symbols: yes
          I/O filters (external): deflate(zlib)
                             MPE: no
                      Direct VFD: no
                         dmalloc: no
  Packages w/ extra debug output: none
                     API tracing: no
            Using memory checker: no
 Memory allocation sanity checks: no
          Function stack tracing: no
       Strict file format checks: no
    Optimization instrumentation: no
Bye...
Aborted (core dumped)

Do you have any suggestions?
Thank you
Behzad Navidi

RE: How to merge 2 netcdf4 files into one netcdf4 file using CDO? - Added by Ralf Mueller over 4 years ago

in your shell, run

export HDF5_DISABLE_VERSION_CHECK=1
and then rerun your CDO call
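
In the same shell session that would look like the following, reusing the merge command from earlier in the thread; the variable can also be set for a single invocation only:

# suppress the HDF5 version check for this shell session and rerun the merge
export HDF5_DISABLE_VERSION_CHECK=1
cdo merge precip.1982.nc precip.1983.nc precip_all.nc

# or set it for a single call only
HDF5_DISABLE_VERSION_CHECK=1 cdo merge precip.1982.nc precip.1983.nc precip_all.nc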

RE: How to merge 2 netcdf4 files into one netcdf4 file using CDO? - Added by Noam Chomsky over 4 years ago

I have the same warning about the HDF5 library mismatch. export HDF5_DISABLE_VERSION_CHECK=1 is just a workaround, isn't it? What could be the cause of this mismatch?

RE: How to merge 2 netcdf4 files into one netcdf4 file using CDO? - Added by Ralf Mueller over 4 years ago

The problem is the following:

During the build of CDO a certain HDF5 library was available (in the default search path or explicitly given by the person who did the build) and its header was included via

#include <hdf5.h>

When the CDO binary (built with dynamic linking) is executed, it searches for an HDF5 installation to execute the HDF5 calls. This can be a completely different HDF5 installation from the one used to build the CDO binary, hence a good chance for a version mismatch!
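
A quick way to see which HDF5 shared library the dynamic linker actually resolves for an installed CDO binary (on Linux, assuming cdo is on the PATH):

# list the shared libraries the cdo binary links against and filter for HDF5
ldd $(which cdo) | grep -i hdf5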

As long as you only have a single installation of the software you work with, this cannot happen. But on servers there are often multiple versions available for different purposes (e.g. user requirements).

The best strategy IMO is: clean up your environment

  1. no usage of LD_LIBRARY_PATH
  2. use compilers that support rpath and make use of this feature (see the sketch below)
  3. no exported variables that influence the build process
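
As a sketch of point 2, assuming a self-built CDO against NetCDF4 and HDF5 installed under the hypothetical prefixes /opt/netcdf and /opt/hdf5, the run-time library path can be baked into the binary at link time so that LD_LIBRARY_PATH is not needed:

./configure --with-netcdf=/opt/netcdf --with-hdf5=/opt/hdf5 \
    LDFLAGS="-Wl,-rpath,/opt/netcdf/lib -Wl,-rpath,/opt/hdf5/lib"
make
sudo make install
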
Another valid strategy is: don't build stuff on your own if others can do that for you:
  1. use a package manager like Anaconda (installs pre-compiled binaries), as sketched below, or
  2. use the cdo package from your linux/unix distro (debian, ubuntu, fedora, opensuse, archlinux, freebsd) even if this is not the most recent version. In most cases you won't need the latest CDO release.
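
For the package-manager route (point 1 above), a typical install of the pre-compiled CDO package from the conda-forge channel, assuming a working conda installation, is:

conda install -c conda-forge cdo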

If you must compile yourself, avoid having to install all the dependencies by hand. You can use Spack for managing software packages; it's somewhat similar to conda, but it compiles the software on your machine and offers all kinds of variants for each package. All dependencies are automatically built for you.
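
A minimal Spack-based sketch, assuming Spack itself is already installed and its shell integration sourced:

# build CDO and all of its dependencies from source via Spack
spack install cdo
# make the freshly built cdo available in the current shell
spack load cdo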

hth
ralf
