[Rd] cannot allocate vector of size 71773 Kb (PR#915)

Paul Gilbert pgilbert@bank-banque-canada.ca
Tue, 17 Apr 2001 11:31:00 -0400

> I get the following error when working with large data sets:
>
>     > source("/usr/local/genex/rcluster/lib/rcluster/r/hcluster.r");
>     > breadth.program("uploaded_data.txt", "average", 10)
>     Read 2 items
>     Read 8574 items
>     Error: cannot allocate vector of size 71773 Kb
>     Execution halted
>
> Is there a way to fix this? This is running as part of the GeneX
> installation so I would have to dig through to figure out how to give R
> more memory.

In R 1.2.2 this message means the operating system is not letting R have the memory.
You do not need memory parameters on the R command line (and it seems better not to
have them), but you do need to do some things in the operating system before you start
R. On Unix/Linux, first check that your datasize and stacksize limits are not set, as
they usually are; use limit or unlimit (or ulimit), depending on your shell or OS. You
then need adequate swap space. Physical memory will make things faster, but is not
strictly necessary. You may need a very large swap space if you are going to do much
with an 80M vector, though perhaps it gets broken into smaller pieces once loaded.
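For instance, in a Bourne-family shell (sh/bash/ksh) the checking and clearing is done
with the ulimit builtin rather than limit/unlimit; a rough sketch (exact flags and
whether "unlimited" is permitted vary by OS and by the hard limits set for your account):

```shell
# Check current per-process limits (values are in kilobytes, or "unlimited")
ulimit -d    # datasize limit
ulimit -s    # stacksize limit

# Raise them for this shell session, then start R from the same session
ulimit -d unlimited
ulimit -s unlimited
R
```

In csh/tcsh the equivalents are `limit datasize` / `unlimit datasize` and so on. The
change only affects processes started from that shell, so start R afterwards from the
same session.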

Given current memory prices you should probably consider more physical memory, but I
expect you will need several gigabytes of virtual memory to do much work with an 80M
vector. Also beware: as I recall, mkswap on Linux defaults to an older on-disk swap
format, and the newer format is faster.
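To see how much swap you currently have, and to add more, something along these lines
should work on Linux (the swap-file commands must be run as root; `/swapfile` and the
2 GB size are example values only):

```shell
# Inspect current swap
free -m       # the "Swap:" row shows total/used/free in megabytes
swapon -s     # per-device / per-file swap summary

# Sketch: create and enable a 2 GB swap file
dd if=/dev/zero of=/swapfile bs=1M count=2048
chmod 600 /swapfile
mkswap /swapfile     # check your mkswap man page for which swap format it writes
swapon /swapfile
```

A dedicated swap partition behaves the same way, with the partition device in place of
`/swapfile`.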

Paul Gilbert

r-devel mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-devel-request@stat.math.ethz.ch