[R] Help with large datasets

Paul Gilbert pgilbert at bank-banque-canada.ca
Tue Mar 20 21:57:25 CET 2001


>Unfortunately, although my Alpha has
>1.5 gig of ram, R as it is configured seems to be set for a maximum
>of about 100Mb of workspace (as best I can tell).

With R 1.2.2 I think this is supposed to expand automatically until you reach limits
imposed by the operating system. On Unix/Linux you may need to use unlimit (csh) or
ulimit (sh/bash) to adjust the default limits. You can still eventually get

Error: cannot allocate vector of size xxxx
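For what it's worth, here is a quick way to inspect and raise the shell limits I
mean (a sketch; the exact syntax below is sh/bash, and csh spells it differently):

```shell
# Show all current soft limits for this shell (bash/sh syntax;
# csh users would run "limit" instead).
ulimit -a

# The virtual memory (address space) limit is often the one behind
# "cannot allocate vector" errors; print it on its own:
ulimit -v        # a number in KB, or "unlimited"

# To lift the soft limit for this shell and any R started from it
# (csh users would run "unlimit" instead):
# ulimit -v unlimited
# R
```

Note that a non-root user can only raise the soft limit up to the hard limit set
by the administrator.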

In Unix I think this is a limit imposed by swap space (plus the amount of other
activity on the machine) and is not restricted by physical memory. If you just added
a lot of RAM to your machine, it is possible your swap space is now smaller than your
physical memory (which doesn't make much sense, and you should probably increase swap).
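To check whether that is the case, something like the following should show how
much swap is configured relative to RAM (a sketch; these commands are
Linux-specific, and other Unixes use e.g. "swap -l" on Solaris instead):

```shell
# Physical memory and swap totals, in megabytes (Linux, procps "free"):
free -m

# Per-device swap areas and their sizes (Linux):
cat /proc/swaps
```

If the Swap total reported there is smaller than the Mem total, adding swap is
probably the first thing to try.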

Some of this is just speculation on my part and since I'm having a similar problem I
hope someone more authoritative will comment.

Paul Gilbert

-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._
