Caching Input File
Added by Matthias K over 5 years ago
Hello everyone,
is there a way to cache a large input file for subsequent use in a loop?
Here is what I want to do:
for i in {1..2232}
do
  cdo -O --reduce fldmean -selgridcell,"mask_$i.nc" input.nc "output_$i.nc"
done
This works fine, but takes 60 s per iteration, which is most likely largely due to reading input.nc (20 GB). As I have to repeat the 2232 iterations for 10 input files, I was wondering whether I could improve the loop.
Thanks in advance!
Replies (1)
RE: Caching Input File - Added by Ralf Mueller over 5 years ago
hi!
AFAIK there is no such thing on the command line. Usually disks have some space for caching, but not 20 GB I guess. On unix-like systems you can move the input file to /tmp or /dev/shm; on a big server with a lot of RAM these directories should be big enough. This should limit your IO time.
hth
ralf