[R] cannot allocate a vector of size error, what explodes the memory consumption?

Martin Ivanov tramni at abv.bg
Sat Aug 11 23:05:46 CEST 2012


 Dear R users,

I keep getting the message "Error: cannot allocate a vector of size 543.2 Mb" on my 64-bit Linux machine with 4 Gb of memory. I studied the problem, and it occurs at the last lines of my code:

..........
a <- laply(.........)
save(a, file = "file.RData")
rm(a)

"a" is a 4-d array of size about 600 Mb. 3 minutes before R exits and displays the error message, the 
memory consumption of R (diagnosed with top) is 1.8 Gb. 2 minutes before the error, the memory consumption of R suddenly expands to 2.8 Gb and one minute before the error it is 3 Gb. This is why I suspect the problem 
is with my code. I have checked the same code with smaller data sets, it works perfectly. But with larger data sets, that is when the size of "a" gets bigger than, say, about 500 Mb, I get the error. 
Actually 99% of the time the memory consumption is up to 1.8 Gb. This is while things are going within laply.
And at the end, at these 3 last lines something causes more than doubling of the R memory consumption.
Have you got any idea what the culprit is?
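
In case it helps, here is a minimal sketch of how those last lines could be instrumented to see at which step the memory jumps. The small dummy array is only a stand-in for my real laply() result, and the object.size()/gc() calls are purely illustrative diagnostics:

## dummy stand-in for the real laply() result, just to make the sketch runnable
a <- array(rnorm(1e6), dim = c(10, 10, 100, 100))

print(object.size(a), units = "Mb")   ## size of "a" as R reports it
gc()                                  ## memory in use before saving

save(a, file = "file.RData")          ## serialisation may need a temporary copy of "a"
gc()                                  ## memory in use right after saving

rm(a)
gc()                                  ## memory after "a" has been removed

Comparing the gc() output before and after the save() call should show whether the extra memory appears at the save() step or already inside laply().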

Any suggestions will be appreciated.

Best regards,
Martin
