[R] Memory failure!!!!

Uwe Ligges ligges at statistik.uni-dortmund.de
Mon Aug 9 13:13:35 CEST 2004

Monica Palaseanu-Lovejoy wrote:

> Hi,
> I am trying to increase the memory R can use. I am running R 
> under Windows on a machine with 2 GB of physical RAM and 4GB 
> of paged memory.
> I added --sdi --max-mem-size=4094M in the R shortcut's Properties 
> window, but when R is doing Bayesian modelling (geoR) it 
> stops at 1,096K and I get memory errors because it cannot 
> allocate a new segment of about 500K of memory.

In that case either
a) your memory is too fragmented and you should start a new R session, or
b) you need much more than 500K, and you are just seeing the error 
from the one allocation (out of many) that fails.
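To tell the two cases apart, it can help to watch R's memory book-keeping around the failing step. A minimal sketch (gc() is cross-platform; the matrix size here is just an illustrative stand-in for the model's allocations):

```r
## Watch R's memory accounting around a large allocation.
## gc() returns a matrix with rows "Ncells"/"Vcells" and a "used" column.
x <- matrix(0, 2000, 2000)        # ~32 MB of doubles
gc()                              # "used" columns show live memory
rm(x)                             # drop the object...
stats <- gc()                     # ...and collect; "used" should fall
used_mb <- stats["Vcells", "used"] * 8 / 2^20   # Vcells are 8 bytes each
cat("vector heap in use:", round(used_mb), "MB\n")
```

If "used" stays far below the limit while the allocation still fails, fragmentation (case a) is the likely culprit and a fresh session is the simplest cure.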

The next question is which version of Windows this is. Some versions do 
not support that much memory, and others have a hole of roughly 512M 
(the PCI address space lies within the 4GB address space, AFAIR)...

> I don't have Visual Basic so I cannot use the other commands 
> suggested in Help.
> Also, if I am using memory.size(max=TRUE) I get a value 
> corresponding to about 1024K, and if I am using 
> memory.limit(size=NA) I get a value of about 4000K.

I hope you mean about 4000*M*...
Or more precisely, what you get is:
 > 4094*1024*1024
[1] 4292870144
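The conversion is just Mb-to-bytes arithmetic; as a sanity check (memory.limit() on Windows reports megabytes, and the figure above is the same number in bytes):

```r
## 4094 Mb expressed in bytes, and converted back again.
limit_mb    <- 4094
limit_bytes <- limit_mb * 1024^2
limit_bytes                  # 4292870144, the value printed above
limit_bytes / 2^20           # back to 4094 Mb
```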

Uwe Ligges

> How can i force R to use more memory?
> Thank you for any suggestions,
> Monica
> Monica Palaseanu-Lovejoy
> University of Manchester
> School of Geography
> Mansfield Cooper Bld. 3.21
> Oxford Road
> Manchester M13 9PL
> England, UK
> Tel: +44 (0) 275 8689
> Email: monica.palaseanu-lovejoy at stud.man.ac.uk
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://www.stat.math.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
