[Rd] exceeding memory causes crash on Linux

Peter Dalgaard BSA p.dalgaard at biostat.ku.dk
Thu Oct 9 17:49:12 MEST 2003


Paul Gilbert <pgilbert at bank-banque-canada.ca> writes:

> Paul Gilbert wrote:
> 
> >> I am having an unusual difficulty with R 1.8.0 on Mandrake 9.1
> >> running a problem that takes a large amount of memory. With R 1.7.1
> >> this ran on the machine I am using (barely), but now takes more
> >> memory than is available.  The first two times I tried with R
> >> 1.8.0, R exited after the program had run for some time, and gave
> >> no indication of anything, just returned to the shell command
> >> prompt. I ran under gdb to see if I could get a better indication
> >> of the problem, and this crashed Linux completely, or at least X,
> >> but I couldn't get another console either. (I haven't had anything
> >> crash Linux in a long time.) To confirm this I ran R under gdb
> >> again, and ran top to verify I was hitting memory constraints
> >> (which I was), but this time R did give a message "Error: cannot
> >> allocate a vector of size ..."
> >
> >
> > P.S. But there does not seem to be proper garbage collection after
> > this. Top showed the memory still in use, and subsequent attempts to
> > run the program failed immediately when trying to allocate a much
> > smaller vector. When I did gc() explicitly, it did clean up and I
> > could start the function again. The second time, R exited back to
> > the gdb prompt with the message "Program terminated with signal
> > SIGKILL, Killed. The program no longer exists."
> 
> P.P.S. I can reproduce this (at least the SIGKILL part) on a machine
> with 500MB memory and swap turned off simply with
>     z <- rnorm(50000000)
> (Turning off swap is simply to make the failure happen quickly rather
> than slowly on a bigger problem.)
> 
> >
> >
> >> I'm not worried about running the problem, but I would like a more
> >> graceful exit. Might this be related to the change in error
> >> handling?
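
The gc() workaround described in the P.S. above can be scripted. Below
is a minimal sketch (not from the original report; it reuses the 5e7
size from the reproduction and the tryCatch() introduced with R 1.8.0's
new condition system): the large allocation is attempted inside
tryCatch(), an explicit gc() in the error handler reclaims the heap,
and a smaller allocation is then retried. This only helps when R itself
raises the allocation error; if the kernel kills the process, nothing
inside R can catch it.

    res <- tryCatch(
        rnorm(50000000),           # deliberately larger than available RAM
        error = function(e) {
            cat("allocation failed:", conditionMessage(e), "\n")
            gc()                   # explicit collection, as in the P.S.
            NULL
        })
    if (is.null(res))
        z <- rnorm(5000000)        # a much smaller vector should now fit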

I believe this is in the operating system, not in R. When all system
memory has been used up, the kernel's out-of-memory killer goes looking
for a process to kill, and that can be R, the X server, or whatever it
feels like. It is difficult for R to do anything about it, because
Linux overcommits memory: the allocation itself can succeed, and the
out-of-memory condition only arises later, when the already-allocated
pages are actually touched.
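
One way to turn the SIGKILL into the more graceful "Error: cannot
allocate vector of size ..." is to cap R's memory below the point at
which the kernel starts overcommitting, so that the failure surfaces in
R's own allocator instead. An illustrative sketch (the numbers are
arbitrary and would need adapting to the machine), run from the shell
before starting R:

    ulimit -v 400000        # cap the address space at roughly 400 MB (kB units)
    R --max-vsize=300M      # and/or cap R's vector heap at startup

With such a cap in place, z <- rnorm(50000000) fails with a catchable R
error rather than triggering the out-of-memory killer.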

-- 
   O__  ---- Peter Dalgaard             Blegdamsvej 3  
  c/ /'_ --- Dept. of Biostatistics     2200 Cph. N   
 (*) \(*) -- University of Copenhagen   Denmark      Ph: (+45) 35327918
~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk)             FAX: (+45) 35327907


