[R] can't restore .RData

Gregory R. Warnes warnes at biostat.washington.edu
Wed Jan 27 22:12:23 CET 1999


It might be the number of cons cells (the --nsize limit), rather than the
vector heap size (--vsize), that is too small.  Right now I'm patching my
copy of R to allow up to 16000000 cons cells.  Here's the patch (watch out
for the long lines):

---start-patch---

--- R-0.63.2.orig/src/unix/system.c     Thu Nov 12 08:06:02 1998
+++ R-0.63.2/src/unix/system.c  Wed Jan 27 12:53:59 1999
@@ -502,7 +502,7 @@
                else p = &(*av)[2];
                value = strtol(p, &p, 10);
                if(*p) goto badargs;
-               if(value < R_NSize || value > 1000000)
+               if(value < R_NSize || value > 16000000)
                    REprintf("WARNING: invalid language heap size ignored\n");
                else
                    R_NSize = value;
@@ -511,7 +511,7 @@
                ac--; av++; p = *av;
                value = strtol(p, &p, 10);
                if(*p) goto badargs;
-               if(value < R_NSize || value > 1000000)
+               if(value < R_NSize || value > 16000000)
                    REprintf("WARNING: invalid language heap size '%d' ignored, using default = %d\n", value, R_NSize);
                else
                    R_NSize = value;


---end-patch---
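
For anyone who finds the trimmed diff hard to read, here is a standalone
sketch of the check the patch relaxes -- this is not the actual R source;
the default of 250000 cons cells and the names below are just illustrative.
The --nsize argument is parsed with strtol and silently ignored if it falls
outside the [default, cap] range, so raising the cap is what lets larger
values through.  (The cap is only a sanity check; the cons-cell heap is
still allocated up front, so very large values cost real memory.)

/*
 * Standalone sketch (not the actual R sources) of the bounds check the
 * patch relaxes.  The default and cap values here are illustrative.
 */
#include <stdio.h>
#include <stdlib.h>

#define NSIZE_DEFAULT 250000L     /* assumed default cons-cell count */
#define NSIZE_CAP     16000000L   /* the raised limit from the patch */

static long parse_nsize(const char *arg)
{
    char *end;
    long value = strtol(arg, &end, 10);

    /* reject non-numeric input or values outside [default, cap] */
    if (*end != '\0' || value < NSIZE_DEFAULT || value > NSIZE_CAP) {
        fprintf(stderr,
                "WARNING: invalid language heap size '%ld' ignored, "
                "using default = %ld\n", value, NSIZE_DEFAULT);
        return NSIZE_DEFAULT;
    }
    return value;
}

int main(int argc, char **argv)
{
    long nsize = parse_nsize(argc > 1 ? argv[1] : "1000000");
    printf("using %ld cons cells\n", nsize);
    return 0;
}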

This seems to work for me.  I've now been able to load the large "dput"
file that didn't work before.
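
With the cap raised, --nsize values above the old 1000000 limit are honoured
instead of being dropped with the "invalid language heap size" warning, so an
invocation along these lines (the numbers are only an example) should work:

R --vsize 100 --nsize 4000000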

-Greg

On Wed, 27 Jan 1999 royle at penguin.irm.r9.fws.gov wrote:

> Hi Folks,
> 
>  I loaded a couple of quite large data sets into an R session and then
> quit (after saving the image).  Now I get:
> 
> Error: a read error occured
> Fatal error: unable to restore saved data
>  (remove .RData or increase memory)
> 
> after trying to start my R session using something like:
> 
> R --vsize XXX --nsize 1000000
> 
> for any value of XXX (I went up to 300 or 400, which is as high as I could
> go).  This seems odd to me because the data sets were not that large.
> 
> In fact, the size of .RData is:
> 
> -rw-rw-r--   1 royle    royle    14848000 Jan 27 14:19 .RData
> 
> I have had much larger data sets in R before and never had this problem
> (at least not one that couldn't be fixed by increasing the memory).
> 
> So, it appears that my only option at this point is to delete .RData,
> which would be unfortunate since the functions contained therein are
> the result of several days of intense hacking...
> 
> Does anyone have any ideas?
> 
> thanks in advance,
> 
> andy
> 


-------------------------------------------------------------------------------
    Gregory R. Warnes          | It is high time that the ideal of success
warnes at biostat.washington.edu  |  be replaced by the ideal of service.
                               |                       Albert Einstein
-------------------------------------------------------------------------------


-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._


