[R] Memory problem in R-1.4.1

ripley at stats.ox.ac.uk
Tue Apr 30 10:28:08 CEST 2002


On Tue, 30 Apr 2002, vito muggeo wrote:

> Hi all,
> In a simulation context, I'm applying a function of mine, "myfun" say, to a
> list of glm obj, "list.glm":
> >length(list.glm) #number of samples simulated
> [1] 1000
> >class(list.glm[[324]]) #any component of the list
> [1] "glm" "lm"
> >length(list.glm[[290]]$y) #sample size
> [1] 1000
>
> Because length(list.glm) and the sample size are rather large, I've split
> the list into 10 sub-lists, say: list.glm1, list.glm2, ....
> Now, of course, I'm using:
> out1<-lapply(list.glm1, myfun)
> out2<-lapply(list.glm2, myfun)
> ....
> However, only the first one works; for the second I get:
>
> Error: cannot allocate vector of size 3 Kb
> In addition: Warning message:
> Reached total allocation of 255Mb: see help(memory.size)
>
> So I increase the memory limit:
> > memory.limit(size=300)
> NULL
> > out2<-lapply(list.glm2, myfun) #works
> > out3<-lapply(list.glm3, myfun) #does not work
> Error: cannot allocate vector of size 31 Kb
> In addition: Warning message:
> Reached total allocation of 300Mb: see help(memory.size)
>
> Again I increase the memory limit:
> > memory.limit(size=320)
> NULL
> > out3<-lapply(list.glm3, myfun) #works!
> > out4<-lapply(list.glm4, myfun) #does not work!
> .....
> So it seems I have to increase the memory limit each time before applying my
> function. This is surprising because I know that, on returning to the prompt,
> the memory is fully available again. Since the lists are similar, why is the
> same memory size not sufficient for every list?

Because the returned objects are still in memory.  My guess is that
out1 etc. are large objects: try object.size() to see.  I do wonder
whether a simple for() loop would not work better.
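
For instance, something along these lines (a minimal sketch using the
names from your code; it assumes myfun's return value is much smaller
than the fitted glm object it comes from):

    object.size(out1)    # how much memory does one of the out* lists occupy?

    ## one for() loop over the whole list, keeping only the results
    out <- vector("list", length(list.glm))
    for (i in seq(along = list.glm)) {
        out[[i]] <- myfun(list.glm[[i]])
    }

    rm(list.glm)         # once done, drop the large fits ...
    gc()                 # ... and hand the freed memory back

That way nothing but list.glm and the (small) results needs to stay in
the workspace at any one time.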

> Is there any way to solve this problem, or do I have to increase
> memory.limit() after every call?

You could start R with --max-mem-size (and that's better than increasing
memory.limit), but swapping on WinME is likely to be painfully slow.
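
For example, on Windows (a sketch only; the 512M figure is illustrative,
so pick a value to suit your machine):

    Rgui.exe --max-mem-size=512M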

> And if it is so, is there a limit?

Somewhere around 1.5Gb, if you have enough swap space.

> Moreover, the problem does not depend on the number of simulated samples
> (i.e. length(list.glm)), because I'm applying the function to sub-lists
> having just 100 components.
>
>
> I'm running R 1.4.1 on WinMe, Pentium III 750, with 256Mb RAM.
>
> By the way, would there be the same problem on Linux?

Very likely, but Linux's swap space works much better than WinME's, at
least up to 1Gb.

-- 
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272860 (secr)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595
