[R] Help with large datasets

Prof Brian D Ripley ripley at stats.ox.ac.uk
Tue Mar 20 21:03:03 CET 2001


On Tue, 20 Mar 2001, Steven Boker wrote:

>
> I am a new user of R (1.2.2 on Alpha), but have been using S and then
> Splus heavily for about 10 years now.  My problem is this.  The data
> I analyze comprise large sets.  Typically I am analyzing 5000 observations
> on 90 variables over several hundred subjects.  Sometimes 500,000
> subjects with 200 variables.  Unfortunately, although my Alpha has
> 1.5 gig of ram, R as it is configured seems to be set for a maximum
> of about 100Mb of workspace (as best I can tell).  The published
> command-line switches seem to be able to restrict memory parameters, but
> not to enlarge the main workspace.
>
> What I'd like to do is to make the workspace much larger (10x).  Either
> on the fly, if possible, or by changing the appropriate #defines
> and recompiling so as to be able to analyze my admittedly excessive data.
>
> Is there a short happy answer to my plea?

It should happen automatically.  What behaviour makes you think there is a
maximum of 100Mb?  (We happily use 500Mb or so on a 1Gb RAM i686: more if
there are no other users.)
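[For readers hitting a similar apparent ceiling: one way to check from within R is the sketch below. It assumes R 1.2.x or later, where the heap grows on demand; gc() and mem.limits() are standard, but the exact numbers printed vary by platform and session.]

```r
# Report current memory use.  Under R's generational collector the
# "Ncells" (cons cells) and "Vcells" (vector heap) figures grow as
# objects are created -- there is no fixed 100Mb workspace.
gc()

# Inspect any upper limits set at startup via --max-nsize / --max-vsize.
# NA means no limit was imposed, i.e. the heap can keep growing.
mem.limits()

# A data set of the size described: 5000 observations on 90 variables.
x <- matrix(rnorm(5000 * 90), nrow = 5000)
object.size(x)   # about 5000 * 90 * 8 bytes for the doubles alone
```

[If mem.limits() reports finite values, restarting without the corresponding --max-* flags, or with larger ones, should let the workspace grow to whatever the machine's RAM allows.]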

-- 
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272860 (secr)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595

-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._