[R] memory

Prof Brian D Ripley ripley at stats.ox.ac.uk
Thu May 17 21:54:38 CEST 2001


On Thu, 17 May 2001, M. Edward (Ed) Borasky wrote:

> On Thu, 17 May 2001, Thomas Lumley wrote:
>
> > Well, there's clearly a bug involved. You could make a good case for the
> > bug being in the operating system, though -- you should certainly be able
> > to kill the R process without restarting the computer.
> >
> > I think that Windows is prepared to give away more memory than is good for
> > it, leaving the operating system with no room to work.  The --max-mem-size
> > option is designed to stop this: it specifies the maximum amount of memory
> > R can ask for.  It sounds like your file is too big for read.csv in the
> > memory you have.  I believe R 1.3.0 has an improved read.table, which
> > might help.
>
> Windows 95, 98 and ME have *very* limited memory management capabilities. IIRC
> the R default is to limit memory to the amount of physical memory installed.
> My recommendation for Win 95, 98 and ME is to limit memory to *half* of
> installed physical memory. On Windows NT and 2000, you can probably get away
> with 75 percent of physical memory. You're wasting your time trying to go larger
> than these values; at best, Windows will thrash and at worst it will crash.
>
> I have not yet pushed these limits with Linux, either 2.2 or 2.4 kernel, but
> I'd be willing to bet that the 75% of physical memory is close to what you can
> get away with -- maybe 80% with X windows turned off and running in batch mode.
> Generally on *NIX boxen you want 1/4 of real memory or thereabouts for buffer
> cache, leaving 3/4 for everything else.
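
[Regarding the quoted points about --max-mem-size and read.csv: a minimal
sketch, not from the thread itself, of how one might cap R's memory on
Windows and read a large file more economically.  The 512M figure, the file
name and the column layout are made-up illustrations, and memory.limit() is
assumed to be available in the Windows build in use.]

    ## Command line, not R code: start R for Windows with a hard cap on
    ## the amount of memory it may request from the operating system
    ##   Rgui.exe --max-mem-size=512M

    ## From within R (Windows builds), query the current cap
    memory.limit()

    ## read.csv holds the whole file plus working copies in RAM; scan()
    ## with an explicit 'what' list is leaner for large, regular files
    dat <- scan("big.csv", sep = ",", skip = 1,
                what = list(x = 0, y = 0, label = ""))
    df  <- as.data.frame(dat)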

Pretty sweeping generalizations ... here is some actual experience with R.

On NT-based Windows and Unix-alikes, one can often use much more VM than
physical RAM: it all depends on the problem structure (and disc speed).

Windows 95/98/ME are worse, not least because of their disk subsystems, but
the empirical performance is nowhere near as poor as you claim.  I
developed a lot of the current R for Windows on a 133MHz 32Mb Win95 machine,
and if your claims were true, it just would not exist ....  Using a
workspace much over 30Mb was slow, but 30Mb was fine.  That was before the
days of the non-moving garbage collector and replacement malloc, which I
think have helped quite a bit.
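
[For context on the workspace sizes mentioned above, a short hedged sketch
of watching R's memory use from the prompt; the matrix is just a
hypothetical example object.]

    ## report the memory R is currently using and force a collection
    gc()

    ## size in bytes of a single object (here roughly an 8Mb matrix)
    x <- matrix(rnorm(1e6), ncol = 100)
    object.size(x)

    ## on Windows builds, the total memory R has obtained from the OS
    memory.size()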

People do happily run R on 8Mb Windows 95 machines ....

-- 
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272860 (secr)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
