[R] Testing memory limits in R??

Duncan Murdoch murdoch at stats.uwo.ca
Tue Jul 7 03:39:16 CEST 2009


On 06/07/2009 4:16 PM, Peter Dalgaard wrote:
> Scott Zentz wrote:
>> Hello Everyone,
>>
>> We have recently purchased a server with 64 GB of memory running a 
>> 64-bit OS, and I have compiled R from source with the following 
>> configuration:
>>
>> ./configure --prefix=/usr/local/R-2.9.1 --enable-R-shlib 
>> --enable-BLAS-shlib --enable-shared --with-readline --with-iconv 
>> --with-x --with-tcltk --with-aqua --with-libpng --with-jpeglib
>>
>> and I would like to verify that I can use 55-60 GB of the 64 GB of 
>> memory within R. Does anyone know how to verify this? Will R be able 
>> to access that amount of memory from a single process? I am not an R 
>> user myself, but I just wanted to test this before I turned the server 
>> over to the researchers.
> 
> Hmm, it's slightly tricky because R often duplicates objects, so you may 
> hit the limit only transiently. Also, R has an internal 2GB limit on 
> single vectors. But something like this

Is it a 2 GB limit on the size in bytes, or on the number of elements?  I'm 
still spending almost all my time in 32-bit land, so it's hard to check.
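
Something along the lines of the sketch below, run on a 64-bit build, ought 
to settle it (untested, and the sizes are only illustrative). The first 
vector is about 4 GB in bytes but well under 2^31 elements; the second 
exceeds 2^31 - 1 elements.

x <- numeric(2^29)   # 2^29 doubles, about 4 GB in bytes; should already
                     # fail here if the limit is really 2 GB in bytes
object.size(x)       # reports the size in bytes
y <- numeric(2^31)   # more than 2^31 - 1 elements; expected to fail
                     # if the limit is on the element count instead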

Duncan Murdoch

> 
> Y <- replicate(30, rnorm(2^28-1), simplify = FALSE)
> 
> should create a list of 30 vectors of about 2 GB each, roughly 60 GB in 
> total. Then lapply(Y, mean) should generate 30 very good and very 
> expensive approximations to 0.
> 
> (For obvious reasons, I haven't tested this on a 1 GB ThinkPad X40...)
> 
>
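
For the original question of confirming that a single R process can really 
use 55-60 GB, one possibility is to build the list above and then look at 
R's own memory accounting. The following is an untested sketch along those 
lines, to be run on the 64-bit server; the gc() figures are approximate.

Y <- replicate(30, rnorm(2^28 - 1), simplify = FALSE)  # about 30 x 2 GB of doubles
gc()                                  # the Vcells row should report tens of
                                      # thousands of Mb in use
sum(sapply(Y, object.size)) / 2^30    # total size of the list, in GiB
unlist(lapply(Y, mean))               # 30 expensive approximations to 0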



