[R] memory error / iterative procedure

Peter Dalgaard p.dalgaard at biostat.ku.dk
Tue Oct 21 08:58:02 CEST 2003


Farouk Nathoo <nathoo at cs.sfu.ca> writes:

> parameter <- 0  # initial value
> for (i in 1:1000) {
>     parameter <- one.step(parameter, data)
>     mem <- memory.size()
>     cat(parameter, " ", mem, "\n")
> }
> 
> 
> I print memory.size() at each iteration, and it grows and grows
> until I run out of memory and get an allocation error. When this happens,
> I record the last parameter value, quit R, restart R, and rerun the
> procedure starting from that most recent value. I'd rather not do it this
> way! I have increased the memory limit with memory.limit(), which
> helps a bit.
> 
> My Questions:
> 
> 1. Is there any way to free the memory after each iteration, since
> I really don't need anything other than the most recent parameter value?

That generally shouldn't be necessary, unless you're doing something
in one.step() that causes R objects to hang around after each
iteration. A typical mistake is to have an attach() inside the loop
and end up with 1000 copies of the entire data set on the search
path... 
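
Here is a minimal sketch of that failure mode and one way around it.
The function bodies, the column name x, and the update rule are
invented for illustration; only the attach() pattern is the point:

    ## Leaks: every call attaches another copy of `data' to the
    ## search path and never detaches it, so memory grows with
    ## each iteration.
    one.step.bad <- function(parameter, data) {
        attach(data)
        parameter + 0.001 * mean(x)   # x found via the search path
    }

    ## No leak: index the data frame directly (or use with()),
    ## leaving nothing behind on the search path.
    one.step.ok <- function(parameter, data) {
        parameter + 0.001 * mean(data$x)
    }

You can watch the copies pile up by calling search() between
iterations; the fixed version keeps the search path, and memory use,
flat.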
 
> 2. If I run this code on the same machine but under Linux, will I
> have the same problem?

Most likely, yes.
 
> 3. Would I be able to avoid this problem if I ran the loop in some other
> language, like Perl or C, and called the R function to do one iteration at
> a time?

Probably not.

-- 
   O__  ---- Peter Dalgaard             Blegdamsvej 3  
  c/ /'_ --- Dept. of Biostatistics     2200 Cph. N   
 (*) \(*) -- University of Copenhagen   Denmark      Ph: (+45) 35327918
~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk)             FAX: (+45) 35327907



