[R] memory error / iterative procedure

Farouk Nathoo nathoo at cs.sfu.ca
Tue Oct 21 03:12:59 CEST 2003


Dear R experts,


I have been trying to run an iterative procedure in R and am running into
some sort of memory build-up problem. I am using R 1.8.0 on Windows XP.
A single iteration of my procedure is coded as a function. This
function creates an extremely large matrix of simulated values (it actually
calls WinBUGS, which returns the simulations), does some calculations with
it, and returns a single number as a result. After this one step I no longer
need the large matrix, but it seems to stay in memory anyhow. The
code is something like:


        parameter <- 0  # initial value
        for (i in 1:1000) {
            parameter <- one.step(parameter, data)
            mem <- memory.size()
            cat(parameter, " ", mem, "\n")
        }


I print memory.size() at each iteration, and it grows and grows
until I run out of memory and get an allocation error. When this happens,
I record the last parameter value, quit R, restart R, and rerun the
procedure starting from that most recent value. I'd rather not do it this
way! I have increased the memory limit with the memory.limit() function,
and this helps a bit.

My Questions:

1. Is there any way to free the memory after each iteration, since
I really don't need anything other than the most recent parameter value?

2. If I run this code on the same machine but under Linux, will I
have the same problem?

3. Would I be able to avoid this problem if I ran the loop in some other
language, like Perl or C, and called the R function to do one iteration at
a time?

I have noticed several postings about this sort of thing in the archives
but I'm still a bit unclear. Any help is greatly appreciated.

Thanks,
Farouk
