[R] calculating memory usage

Prof Brian Ripley ripley at stats.ox.ac.uk
Tue Sep 14 16:06:27 CEST 2004


On Tue, 14 Sep 2004, Adaikalavan Ramasamy wrote:

> Many thanks to Prof. Ripley. The problem is that memory.profile does not
> exist in a *nix environment, and there is probably a very good reason why.

memory.size?

> 
> I was reading help(Memory) and in the Details section :
>      You can find out the current memory consumption (the heap and cons
>      cells used as numbers and megabytes) by typing 'gc()' at the R
>      prompt.
> 
> AFAICS, Ncells is the fixed memory used by R itself, and Vcells is the
> variable part that depends on the calculations.
> 
> Would I be able to say that generating 10 million random numbers
> requires approximately 73.4 Mb (= 26.3 + 80.5 - 26.3 - 7.1) of memory?
> I double-checked this against memory.size() in Windows and they seem to
> agree. Thank you.

No, only that storing 10 million numbers requires 77.3 - 1.0 = 76.3 Mb, and

> object.size(x)/1024^2
[1] 76.29397
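
That figure is essentially just the storage for the doubles themselves; a
quick arithmetic check (assuming 8 bytes per double on this platform):

   10000000 * 8 / 1024^2     # 76.29395, essentially object.size(x)/1024^2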


> > gc()
>          used (Mb) gc trigger (Mb)
> Ncells 456262 12.2     984024 26.3
> Vcells 122697  1.0     929195  7.1
> > 
> > x <- rnorm(10000000)
> > gc()
>            used (Mb) gc trigger (Mb)
> Ncells   456274 12.2     984024 26.3
> Vcells 10123014 77.3   10538396 80.5
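
(As an aside, if you want to automate that gc() bookkeeping over many runs,
something along these lines should work; mem.delta is a hypothetical helper
of my own, and the hard-coded column 2 assumes gc()'s usual matrix layout,
where the second column is the used memory in Mb.  Note it still only
reports what is in use after the call, not the peak during it.)

   mem.delta <- function(expr) {
       before <- gc()["Vcells", 2]    # Vcells in use, Mb, before evaluation
       force(expr)                    # evaluate the supplied expression
       after  <- gc()["Vcells", 2]    # Vcells in use, Mb, afterwards
       after - before
   }
   mem.delta(x <- rnorm(10000000))    # roughly 76 Mb on a 64-bit double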
> 
> 
> 
> 
> On Mon, 2004-09-13 at 18:47, Prof Brian Ripley wrote:
> > On Mon, 13 Sep 2004, Adaikalavan Ramasamy wrote:
> > 
> > > I am comparing two different algorithms in terms of speed and memory
> > > usage. I can calculate the processing time with proc.time() as follows
> > > but am not sure how to calculate the memory usage.
> > > 
> > >    ptm <- proc.time()
> > >    x <- rnorm(1000000)
> > >    proc.time() - ptm
> > 
> > Hmm ... see ?system.time!
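
For example, the proc.time() bookkeeping above collapses to a single call
(a minimal sketch; system.time() reports the user, system and elapsed times):

   system.time(x <- rnorm(1000000))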
> > 
> > > I would like to do this within R itself since I will test the algorithm
> > > several hundred times and in batch mode, so manually looking up 'top'
> > > may not be feasible. help.search("memory") suggests memory.profile and gc
> > > but I am not sure how to use these.
> > 
> > I don't think you can.  You can find out how much memory R is using NOW, 
> > but not the peak memory usage during a calculation.  Nor is that 
> > particularly relevant, as it depends on what has gone on before, the word 
> > length of the platform and the garbage collection settings.
> > 
> > On Windows, starting in a clean session, calling gc() and memory.size(), 
> > then calling your code and memory.size(max=TRUE) will give you a fair 
> > idea, but `top' indicates some Unix-alike.
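
A rough sketch of that Windows-only recipe (memory.size() is not available
on Unix-alikes, which is why it will not help under 'top'):

   gc()                        # start from a collected state
   memory.size()               # Mb currently in use
   x <- rnorm(10000000)        # the code being measured
   memory.size(max = TRUE)     # maximum Mb obtained from Windows so far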
> 
> 

-- 
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595



