[R] Error: vector memory exhausted (limit reached?)

Uwe Ligges ligges at amadeus.statistik.uni-dortmund.de
Tue Aug 28 19:04:59 CEST 2001


On Tue, 28 Aug 2001, Agustin Lobo wrote:

> 
> On Tue, 28 Aug 2001, Prof Brian Ripley wrote:
> 
> > Delete some large objects.  
> Thanks, but I could not remove anything once the memory became
> exhausted:
> 
> > ls()
> Error: vector memory exhausted (limit reached?)
> > rm(lissNPC)
> Error: vector memory exhausted (limit reached?)
>  
>  
> >And read the help page mentioned so you don't
> > get this problem again.
>  
> 
> I had tried before asking the list, but help(memory.size) produced the
> same error
> and a search of memory.size under
> rw1030/doc/html/search/SearchEngine.html
> 
> yields "No matches for "memory.size" have been found!"

Works for me...


> So, I had to reset the machine.

*Reset* the machine? Didn't killing the RGui process work? Anyway, that
part is OS dependent.


> Once the machine was up again, I launched R in the same directory and
> read help(memory.size). Then I made the following test:
> 
> 1. Just with a moderate .RData file loaded:
> > memory.size()
> [1] 12909648
> 
> 
> 2. Loading a relatively large file:
> > load("lissN.R")
> > memory.size()
> [1] 19963560
> 
> 
> 3. And then removing the loaded (from "lissN.R") objects:
> > rm(lissNPC,lissNPC.sta,lissNPC1.ady,lissNPC1.ref)
> > memory.size()
> [1] 19964496
> 
> 
> The memory use was not reduced by removing these
> large objects, so it seems that "deleting some large objects"
> would not have been a solution. Is this behaviour as expected
> or am I doing something (else) wrong?

Yes and no. Since no other objects needed the memory, no garbage
collection was triggered, so memory.size() still reports the old figure.
The memory will be reclaimed at the next garbage collection, so the
behaviour is expected. Have a look at ?Memory and ?gc.
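A minimal sketch of what happens (the object name and size here are
illustrative, not the original lissN.R data):

```r
# Allocate a large object, then remove its binding.
x <- numeric(5e6)   # about 40 Mb of doubles
rm(x)               # binding gone, but the memory may not be released yet

# An explicit garbage collection returns the space to the heap and
# prints a summary of used/free memory.
gc()
```

After gc(), memory.size() (on Windows) should drop back towards the value
seen before the load().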

 
> Finally, the doc says that 
> "`memory.limit' reports the limit in force on the total allocation."
> 
> and
> 
> "Command-line flag `--max-mem-size' sets the maximum value of obtainable
> memory"
> 
> How is the current memory.limit (memory.limit()/1024^2 = 47.51563
> on my laptop) set by default? Is it advisable to increase it
> beyond the RAM (and use virtual memory)? Would there be any
> significant difference between Win9* and linux systems?

Yes, there is a significant difference: this option only exists on
Windows. Have a look at ?Memory. It says:
"The default is the smaller of the amount of physical RAM in the machine
and 256Mb" ==> In your case the default is 48Mb.
You can increase it with --max-mem-size, but if you go beyond physical
RAM your system will start swapping heavily as it falls back on virtual
memory.
This question has been answered many times in the mailing list archives,
and it is also covered in the FAQs and other documentation.
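To see how the quoted figure above relates to the limit in bytes (the
value below is reconstructed from the 47.51563 Mb reported; memory.limit()
itself exists only in R for Windows):

```r
# memory.limit() reports the limit in bytes; divide by 1024^2 for Mb.
limit_bytes <- 49823744        # illustrative value for the ~48 Mb default
limit_bytes / 1024^2           # about 47.5 Mb

# To raise the limit, restart R with e.g.:
#   Rgui.exe --max-mem-size=128M
```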

Uwe Ligges

-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._