[R] error loading huge .RData

Mark White mjw at celos.net
Wed Apr 24 18:29:15 CEST 2002


Liaw, Andy writes:
> > Hmm. You could be running into some sort of situation where data
> > temporarily take up more space in memory than they need to. It does
> > sound like a bit of a bug if R can write images that are bigger than
> > it can read. Not sure how to proceed though.

I regularly carry around image-processing workspaces of 600+ MB
and haven't had any problems so far...

> R-1.4.1/Mandrake Linux 7.1 (kernel 2.4.3)
> Dual P3-866 Xeon with 2GB RAM and 2GB swap.

...and I'm using fairly similar hardware, too, running NetBSD.

> Prof. Tierney has been trying to help me off-list.  I monitored the R.bin
> process through ktop as Prof. Tierney suggested.  The strange thing is
> that the memory usage for the R.bin process would reach nearly 1000MB
> and then R just quits with the vector heap exhausted error.
> [...] I check ulimit and it says "unlimited".

That's consistent with a per-process datasize limit of 1024 MB,
with otherwise typical limit values.  What does 'ulimit -a' say?
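For example, in a Bourne-style shell such as bash or ksh (a minimal
sketch; flag spellings and output details vary by shell and OS):

    $ ulimit -a        # list all current (soft) limits
    $ ulimit -S -d     # soft limit on the data segment, in kB
    $ ulimit -H -d     # hard limit on the data segment, in kB

A data segment value around 1048576 kB rather than "unlimited" would
match a process dying at roughly 1 GB, as described above.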

You can explicitly raise the various limits (see the manpage for
your favourite shell under 'limit' and/or 'ulimit').  Note that
there are 'soft' and 'hard' limits: any user can raise the soft
limit up to the hard limit, but only the superuser can raise the
hard limit itself.  A sketch follows below.
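Assuming bash/ksh syntax (where 'ulimit -d' takes a value in kB) and
csh/tcsh syntax for the second form:

    $ ulimit -S -d unlimited      # sh/bash/ksh: raise the soft datasize limit
                                  # (only up to the hard limit unless you are root)
    % limit datasize unlimited    # csh/tcsh equivalent

Start R from that same shell afterwards so the raised limit is
inherited by the R process.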

Mark <><
-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._


