[R] Memory filling up while looping

jim holtman jholtman at gmail.com
Fri Dec 21 13:37:20 CET 2012


Have you tried putting a call to 'gc()' at the top of the outer loop to
make sure memory is reclaimed? You can print the result of 'gc()' to see
how fast memory use is growing.
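Something like this, as a sketch based on your loop ('gc()' both triggers a garbage collection and returns a small table of current memory use, so printing it each cycle shows whether usage keeps climbing):

```r
chunk <- list(1:10, 11:20, 21:30)

for (k in seq_along(chunk)) {
        # force a collection and show memory in use at the start of each cycle
        print(gc())

        DummyCatcher <- NULL
        for (i in chunk[[k]]) {
                dummy <- 1
                dummy <- dummy + 1
                DummyCatcher <- rbind(DummyCatcher, dummy)
        }
}
```

If the "used" columns printed by gc() keep increasing across cycles even after collection, something is genuinely being retained between chunks rather than merely not yet collected.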

On Thu, Dec 20, 2012 at 6:26 PM, Peter Meissner
<peter.meissner at uni-konstanz.de> wrote:
> Hey,
>
> I have a double loop like this:
>
>
> chunk <- list(1:10, 11:20, 21:30)
> for(k in seq_along(chunk)){
>         print(chunk[[k]])
>         DummyCatcher <- NULL
>         for(i in chunk[[k]]){
>                 print("i load something")
>                 dummy <- 1
>                 print("i do something")
>                 dummy <- dummy + 1
>                 print("i put it together")
>                 DummyCatcher <- rbind(DummyCatcher, dummy)
>         }
>         print("i save a chunk and restart with another chunk of data")
> }
>
> The problem is that with each 'chunk'-cycle the memory used by R grows
> and grows until it exceeds my RAM, even though any single chunk-cycle on
> its own needs only about a fifth of the RAM I have overall.
>
> Does somebody have an idea why this behaviour might occur? Note that all the
> objects (like 'DummyCatcher') are reused every cycle, so I would assume
> that the RAM used should stay about the same after the first 'chunk' cycle.
>
>
> Best, Peter
>
>
> SystemInfo:
>
> R version 2.15.2 (2012-10-26)
> Platform: x86_64-w64-mingw32/x64 (64-bit)
> Win7 Enterprise, 8 GB RAM
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.



-- 
Jim Holtman
Data Munger Guru

What is the problem that you are trying to solve?
Tell me what you want to do, not how you want to do it.



