[R] memory problems in unix version

Tim Hoar thoar at cgd.ucar.edu
Thu Jun 21 00:36:40 CEST 2001

I have read the manual, but am unsure how to interpret the following
one-line error message:

Error: cannot allocate vector of size 38912 Kb

I monitor the process with "top" and the machine has plenty of
memory available when this message is generated.

I understand that the Unix variant of R has some command-line options for memory:

  --min-vsize=N         Set vector heap min to N bytes; `4M' = 4 MegaB
  --max-vsize=N         Set vector heap max to N bytes;
  --min-nsize=N         Set min number of cons cells to N
  --max-nsize=N         Set max number of cons cells to N
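Based on that help text, an invocation raising both heaps explicitly would presumably look something like the following (the values here are illustrative guesses, not from my actual session):

```shell
# Hypothetical startup with explicit heap limits;
# N accepts a 'M'/'k' suffix per the help text above.
R --min-vsize=10M --max-vsize=512M --min-nsize=500k --max-nsize=5M
```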

Invoked with no command-line options under Solaris 2.6,
 R (Version 1.2.3 (2001-04-26)) reports

> mem.limits()
nsize vsize
   NA    NA

I understand this to mean that I can use all the memory on the machine.

The painful part is that I used to be able to do this same-sized
problem in S-PLUS (slowly, mind you). The error message is generated as the
job exits a loop.
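In case it helps, the kind of cleanup I could imagine adding inside the loop is sketched below; the object and function names here are made up, not from my actual code:

```r
# Hypothetical loop body: keep only small summaries, drop large
# intermediates promptly, and trigger garbage collection.
for (i in 1:n.iterations) {
  big.tmp <- compute.step(i)         # large intermediate object (made-up name)
  result[[i]] <- summarize(big.tmp)  # retain only the small summary
  rm(big.tmp)                        # remove the large object from the workspace
  gc()                               # reclaim the freed vector heap
}
```

Would explicit rm()/gc() calls like these make any difference, or does the allocator already reclaim that space?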

Any hints?


## Tim Hoar, Associate Scientist              email: thoar at ucar.edu     ##
## Geophysical Statistics Project             phone: 303-497-1708       ##
## National Center for Atmospheric Research   FAX  : 303-497-1333       ##
## Boulder, CO  80307                    http://www.cgd.ucar.edu/~thoar ##

r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
