[R] R crashes with memory errors on a 256GB machine (and system shows only 60GB usage)

Milan Bouchet-Valat nalimilan at club.fr
Thu Jan 2 23:16:51 CET 2014


On Thursday, 02 January 2014 at 09:07 +0200, Xebar Saram wrote:
> Hi All,
> 
> I have a terrible issue I can't seem to debug, and it is halting my work
> completely. I have R 3.0.2 installed on a Linux machine (Arch Linux, latest)
> which I built specifically for running high-memory models. The system
> is a 16-core, 256 GB RAM machine. It worked well at the start, but in
> recent days I keep getting errors and crashes related to memory use, such as
> "cannot create vector size of XXX, not enough memory" etc.
> 
> When looking at top (the Linux system monitor) I see I barely scrape 60 GB
> of RAM (out of 256 GB).
> 
> I really don't know how to debug this, and my whole work is halted because
> of it, so any help would be greatly appreciated.
One important thing to note is that even when reported memory use is low,
if the memory is fragmented, R may not be able to allocate a *contiguous*
memory area for a big vector (you didn't tell us how big it was). In that
case, AFAIK the only solution is to restart R (after saving the session or
the objects you want to keep).
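
In case it helps, here is a minimal sketch of how to check R's own view of
memory use and how to save your work before restarting. The object name
'fit' below is just a placeholder for whatever you want to keep:

    gc()                    # report current allocations and run a garbage collection

    ## A numeric (double) vector needs 8 bytes per element, allocated in one
    ## contiguous block, so you can estimate the size R is asking for:
    n <- 2e9
    n * 8 / 1024^3          # size in GiB of a length-n double vector

    ## Save before restarting R:
    saveRDS(fit, file = "fit.rds")        # a single object ('fit' is hypothetical)
    save.image(file = "session.RData")    # or the whole workspace

After restarting, readRDS("fit.rds") or load("session.RData") restores what
you saved, and the new session starts with an unfragmented heap.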


Regards



