[R] Testing memory limits in R??

Jonathan Greenberg greenberg at ucdavis.edu
Mon Jul 6 22:42:37 CEST 2009


You could probably just make a big vector and watch memory usage in "top" -- a 
5 GB vector would do the trick -- if you can break the 4 GB barrier (the 32-bit 
address limit) you are golden. 

big_vector <- numeric(1000000) and keep adding zeroes to the length...
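A minimal sketch of that test (assumes 8-byte doubles and roughly 6 GB of free 
RAM; shrink n first if you only want a dry run):

```r
# Each double takes 8 bytes, so ~5 GB worth of doubles is:
n <- 5 * 1024^3 / 8                           # 671,088,640 elements
big_vector <- numeric(n)                      # allocation errors out if it fails
print(object.size(big_vector), units = "Gb")  # confirm the object really is ~5 Gb
rm(big_vector); gc()                          # hand the memory back when done
```

If the allocation succeeds and top shows the R process holding more than 4 GB, 
the build is genuinely 64-bit; numeric(), object.size() and gc() are all base R, 
nothing extra is needed.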

--j

Scott Zentz wrote:
> Hello Everyone,
>
>    We have recently purchased a server which has 64GB of memory 
> running a 64bit OS and I have compiled R from source with the 
> following config
>
> ./configure --prefix=/usr/local/R-2.9.1 --enable-R-shlib 
> --enable-BLAS-shlib --enable-shared --with-readline --with-iconv 
> --with-x --with-tcltk --with-aqua --with-libpng --with-jpeglib
>
> and I would like to verify that I can use 55-60 GB of the 64 GB of 
> memory within R. Does anyone know whether this is possible? Will R be 
> able to access that amount of memory from a single process? I am not an 
> R user myself, but I just wanted to test this before I turned the 
> server over to the researchers.
>
> Thanks!
> -scz
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide 
> http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.

-- 

Jonathan A. Greenberg, PhD
Postdoctoral Scholar
Center for Spatial Technologies and Remote Sensing (CSTARS)
University of California, Davis
One Shields Avenue
The Barn, Room 250N
Davis, CA 95616
Cell: 415-794-5043
AIM: jgrn307, MSN: jgrn307 at hotmail.com, Gchat: jgrn307

