[R] Testing memory limits in R??

David Winsemius dwinsemius at comcast.net
Mon Jul 6 23:17:28 CEST 2009


On Jul 6, 2009, at 5:01 PM, David Winsemius wrote:

>
> On Jul 6, 2009, at 4:42 PM, Jonathan Greenberg wrote:
>
>> You could probably just make a big array and watch "top" usage -- a  
>> 5 GB array would do the trick -- if you can break 4 GB you are golden.
>> big_vector = c(1:1000000) and keep adding zeroes...
>>
>
> Except that the maximum length of a vector (and, I would guess, of an  
> array as well) is 2^31 - 1, i.e. roughly 2 billion elements.
>
> ?"Memory-limits"
>
> On a 10 GB-equipped machine (Mac OS X with 64-bit R 2.9.1) I get this:
> > big_vector=c(1:2500000000)
> Error in 1:2.5e+09 : result would be too long a vector
>
> And that is not because of lack of machine resources. You would need  
> to create a sizeable number (say 25) of these 2-billion-element  
> "big_vectors" to carry out this suggestion.

Maybe not as many as I thought. I just remembered that 1:n gives an  
integer vector (4 bytes per element), so a near-maximal-length vector  
takes up roughly 8 GB of RAM, and about 7 of them might do it.
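
(Quick arithmetic to check that, assuming 4 bytes per integer and 8 per  
double; a sanity check, not a measurement:)

 > n <- .Machine$integer.max    # maximal vector length, 2^31 - 1
 > n * 4 / 2^30                 # GB for an integer vector such as 1:n
[1] 8
 > n * 8 / 2^30                 # GB if the same length were doubles
[1] 16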
 > ccc <- character(1000000)
 > object.size(ccc)
8000088 bytes
 > ccc <- character(4000000)
 > object.size(ccc)
32000088 bytes
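
For the original question, here is a minimal sketch of one way to push  
a single R process toward the 55-60 GB mark; the list name and the  
count of 7 are only illustrative, and each allocation will take a while:

n   <- .Machine$integer.max       # 2^31 - 1, the per-vector length cap
big <- vector("list", 7)          # keep the vectors alive in a list
for (i in seq_along(big)) {
    big[[i]] <- 1:n               # about 8 GB apiece as an integer vector
    print(gc())                   # watch the "used" column climb
}
rm(big); gc()                     # hand the memory back when done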

It takes several minutes just to create one 1.25-billion-element vector  
(roughly 5 GB as integers) on my machine:
 > M <- matrix(1:(5000000000/4), ncol=1)
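
(If anyone wants to see where the time goes, a small sketch using the  
same dimensions; timings will obviously vary by machine:)

 > system.time(M <- matrix(1:(5000000000/4), ncol = 1))   # time the ~5 GB allocation
 > object.size(M)                                          # confirm the footprint
 > rm(M)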

Not sure that the results of gc() done immediately after that  
assignment can be taken at face value. The "max used" column in  
particular appears to bear little relation to my local reality.

 > gc()
             used   (Mb) gc trigger    (Mb)   max used    (Mb)
Ncells   1164830   62.3    1835812    98.1    1835812    98.1
Vcells 625862120 4775.0 1969900386 15029.2 1875862134 14311.7
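
One way to make that column more trustworthy, I believe, is to reset  
the counters just before the allocation; gc() takes a reset argument  
for this (assuming your R version has it):

 > gc(reset = TRUE)    # reset the "max used" statistics to the current values
 > M <- matrix(1:(5000000000/4), ncol = 1)
 > gc()                # "max used" should now reflect just this allocation's peak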

-- 
DW
>
> -- 
> DW
>
> --j
>>
>>
>> Scott Zentz wrote:
>>> Hello Everyone,
>>>
>>>  We have recently purchased a server which has 64 GB of memory,  
>>> running a 64-bit OS, and I have compiled R from source with the  
>>> following config:
>>>
>>> ./configure --prefix=/usr/local/R-2.9.1 --enable-R-shlib \
>>>     --enable-BLAS-shlib --enable-shared --with-readline --with-iconv \
>>>     --with-x --with-tcltk --with-aqua --with-libpng --with-jpeglib
>>>
>>> I would like to verify that I can use 55-60 GB of the 64 GB of  
>>> memory within R. Does anyone know whether this is possible? Will R  
>>> be able to access that amount of memory from a single process? I am  
>>> not an R user myself, but I just wanted to test this before I  
>>> turned the server over to the researchers.
>>>
>>> Thanks!
>>> -scz
>>
>> Jonathan A. Greenberg, PhD
>> Postdoctoral Scholar
>> Center for Spatial Technologies and Remote Sensing (CSTARS)
>> University of California, Davis
>> One Shields Avenue
>> The Barn, Room 250N
>> Davis, CA 95616
>> Cell: 415-794-5043
>> AIM: jgrn307, MSN: jgrn307 at hotmail.com, Gchat: jgrn307
>>
>
> David Winsemius, MD
> Heritage Laboratories
> West Hartford, CT
>

David Winsemius, MD
Heritage Laboratories
West Hartford, CT



