[R] Memory problems, HDF5 library and R-1.2.2 garbage collection

Marcus G. Daniels mgd at swarm.org
Tue Apr 10 05:37:27 CEST 2001


>>>>> "NEN" == Norberto Eiji Nawa <eiji at isd.atr.co.jp> writes:

NEN> The problem I am facing with R-1.2.2 is that when I try to load
NEN> 50 of the 1.5MB HDF5 files (using the hdf5 library) in a loop, my
NEN> Linux box gets close to its memory limit around file #15
NEN> (256MB RAM and 256MB swap). This happens even if I load file ->
NEN> erase all the objects -> load file -> erase all the objects...

I haven't yet had time to go through your test cases, but I did just
find and fix a garbage-collection bug in the HDF5 module that could
explain the problem.  The new version is:

  ftp://ftp.swarm.org/pub/swarm/src/testing/hdf5_1.2.tar.gz

Btw, I don't see a big explosion in memory when loading my own datasets,
but these aren't `compound' HDF5 types that map onto data frames...
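
For reference, the load/erase loop Norberto describes might look like the
sketch below.  This is only an illustration: the file names are
placeholders, and it assumes the hdf5 package's hdf5load(), which loads
the file's datasets as objects into the workspace.

```r
## Hypothetical sketch of the loop described above (file names are
## placeholders, not Norberto's actual data).
library(hdf5)

before <- ls()                          # objects present before loading
for (i in 1:50) {
  fname <- sprintf("data%02d.hdf", i)   # placeholder file name
  hdf5load(fname)                       # load datasets into the workspace
  ## ... work with the loaded objects ...
  rm(list = setdiff(ls(), c(before, "before", "i", "fname")))  # erase them
  gc()                                  # force a garbage collection pass
}
```

Even with rm() and gc() in the loop, memory will still grow if the C side
of the module leaks or mishandles R's GC, which is what the fix above
addresses.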
-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._


