[R] error loading huge .RData

Liaw, Andy andy_liaw at merck.com
Tue Apr 23 17:36:48 CEST 2002


Dear R-help,

I've run into a problem loading .RData:  I was running a large computation,
which supposedly produced a large R object.  At the end of the session, I did
a save.image() and then quit.  The resulting .RData file is 613,249,399
bytes.  Now I can't get R to load it.  Whenever I try, I get "Error:
vector memory exhausted (limit reached)".  I tried adding
"--min-vsize=1000M" to the command line, but that didn't help.  I also
tried starting R --vanilla and then attach(".RData"), with the same error.
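For reference, roughly what I did (a sketch; the computation itself is
omitted):

    ## Original session that produced the file
    $ R
    > ## ... large computation producing a big object ...
    > save.image()        # wrote the ~613 MB .RData
    > q()

    ## Attempt 1: raise the minimum vector heap at startup
    $ R --min-vsize=1000M
    Error: vector memory exhausted (limit reached)

    ## Attempt 2: start without loading the workspace, attach it instead
    $ R --vanilla
    > attach(".RData")
    Error: vector memory exhausted (limit reached)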

From what I can see, the file is not corrupted.  How can I get R to load it?

System info:
R-1.4.1 on Mandrake Linux 7.1 (kernel 2.4.3)
Dual P3-866 Xeon with 2GB RAM.

Regards,
Andy



