[R] R vs S-PLUS with regard to memory usage

Peter Dalgaard BSA p.dalgaard at biostat.ku.dk
Mon Oct 2 22:20:38 CEST 2000


"Anantha Prasad/NE/USDAFS" <aprasad at fs.fed.us> writes:

> I am trying to translate code from S-PLUS to R and R really struggles!
> After starting R with the following:
> R --vsize 50M --nsize 6M --no-restore
> on a 400 MHz Pentium with 192 MB of memory running Linux (RH 6.2),
> I run a function that essentially reads an external dataset with 2121
> rows and 30 columns, builds an lm() object, and then runs step() ... the
> step() takes forever to run (it takes very little time in S-PLUS).
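
For context, the workflow described above amounts to something like the
following sketch (the file name and response variable are made up, since
the original code is not shown):

    dat <- read.table("mydata.txt", header = TRUE)  # 2121 rows, 30 columns
    fit <- lm(y ~ ., data = dat)                    # full linear model
    sel <- step(fit)                                # stepwise selection: the slow part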

Notice that --nsize takes the number of *nodes* as its value. Each node
is 20 bytes, so --nsize 6M reserves about 120MB for nodes, and together
with the 50MB vsize that comes to roughly a 170MB chunk. With various
other memory eaters active, that could easily push a 192MB machine
into thrashing.
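
To make the arithmetic concrete, a rough sketch in R (the 20-byte-per-node
figure is the one quoted above; the totals are approximate):

    nodes_mb  <- 6e6 * 20 / 1e6   # --nsize 6M: six million 20-byte nodes, ~120MB
    vector_mb <- 50               # --vsize 50M: ~50MB of vector heap
    nodes_mb + vector_mb          # ~170MB reserved before any work is done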

The upcoming 1.2 version will be much better at handling memory, but
for now maybe reduce the nsize a bit? The vsize looks a bit hefty as
well, given that the data should only take up on the order of half a
MB (2121 x 30 doubles is roughly 0.5MB).
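
For example, something on this order would still be generous for a
half-MB data set (the numbers are only illustrative, not a specific
recommendation):

    R --vsize 10M --nsize 1M --no-restore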

-- 
   O__  ---- Peter Dalgaard             Blegdamsvej 3  
  c/ /'_ --- Dept. of Biostatistics     2200 Cph. N   
 (*) \(*) -- University of Copenhagen   Denmark      Ph: (+45) 35327918
~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk)             FAX: (+45) 35327907