[R] Error: cannot allocate vector of size... but with a twist

James Muller james.muller at internode.on.net
Sun Jan 30 07:01:01 CET 2005


Apparently not. I had to re-run things (after some modifications) to 
double-check the 32-bit theory (specifically, whether I'd used 3.5GB or 
3GB of swap at crash time).

R crashed with the same error (Error: cannot allocate vector of size 145 
Kb), and here is the memory usage of my whole system (not just R) at 
crash time (note I have a >4GB swap partition and 0.5GB RAM):
  RAM: 459MB
  Swap: 2.8GB
  Total: 3.2GB
So even against a 4GB ceiling there should be about 0.8GB left to eat 
before any failure...??
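
For what it's worth, here is a rough way to check how much of that total 
R itself is holding (just standard gc() and object.size() over whatever 
is in the workspace; nothing specific to these objects is assumed):

  gc()                               # cons cells / heap currently used by R
  # total MB held by workspace objects
  sum(sapply(ls(), function(x) object.size(get(x)))) / 1024^2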

I had garbage collection reporting as it went; here is the output 
(truncated in the middle).

[R is initialized with nothing in memory here]
 > load("/mnt/projects/cdata/data/tmpb0.RData") # ~200MB uncompressed data
 > load("/mnt/projects/cdata/data/tmpb1.RData") # ~400MB uncompressed data
 > cdata01.data <- cbind(c.b.0,c.b.1) # bind all the loaded data together
Garbage collection 1531 = 40+13+1478 (level 2) ...
361503 cons cells free (44%)
2.3 Mbytes of heap free (0%)
Garbage collection 1532 = 40+13+1479 (level 2) ...
361488 cons cells free (44%)
2.3 Mbytes of heap free (0%)
[...]
Garbage collection 2281 = 41+13+2227 (level 2) ...
350235 cons cells free (42%)
2.3 Mbytes of heap free (0%)
Garbage collection 2282 = 41+13+2228 (level 2) ...
350220 cons cells free (42%)
2.3 Mbytes of heap free (0%)
Error: cannot allocate vector of size 145 Kb
 >
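
For reference, here is the same load/cbind step written so the originals 
are dropped as soon as the copy exists (object names and paths are the 
ones from the transcript above; the GC lines come from gcinfo(TRUE), and 
note the peak during cbind() itself still holds both inputs plus the 
result):

  gcinfo(TRUE)                                  # print a line per garbage collection, as above
  load("/mnt/projects/cdata/data/tmpb0.RData")  # creates c.b.0 (~200MB)
  load("/mnt/projects/cdata/data/tmpb1.RData")  # creates c.b.1 (~400MB)
  cdata01.data <- cbind(c.b.0, c.b.1)           # the result is a full copy of both inputs
  rm(c.b.0, c.b.1)                              # drop the originals once the copy exists
  gc()                                          # hand their memory back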

This is really frustrating. This little bit took a good 90 mins to crash.

Once again: 32-bit Linux OS (RH9), 4GB swap, 0.5GB RAM, R 2.0.

Any theories?

James


Prof Brian Ripley wrote:

> On Fri, 28 Jan 2005, James Muller wrote:
>
>> Hi,
>>
>> I have a memory problem, one which I've seen pop up in the list a few 
>> times, but which seems to be a little different. It is the Error: 
>> cannot allocate vector of size x problem. I'm running R2.0 on RH9.
>>
>> My R program is joining big datasets together, so there are lots of 
>> duplicate cases of data in memory. This (and other tasks) prompted me 
>> to... expand... my swap partition to 16Gb. I have 0.5Gb of regular, 
>> fast DDR. The OS seems to be fine accepting the large amount of 
>> memory, and I'm not restricting memory use or vector size in any way.
>>
>> R chews up memory up until the 3.5Gb area, then halts. Here's the 
>> last bit of output:
>
>
> You have, presumably, a 32-bit computer with a 4GB-per-process memory 
> limit.  You have hit it (you get less than 4Gb as the OS services need 
> some and there is some fragmentation).  The last failed allocation may 
> be small, as you see, if you are allocating lots of smallish pieces.
>
> The only way to overcome that is to use a 64-bit OS and version of R.
>
> What was the `twist' mentioned in the title?  You will find a similar 
> overall limit mentioned about weekly on this list if you look in the 
> archives.
>
>>
>>> # join the data together
>>> cdata01.data <- 
>>
>> cbind(c.1,c.2,c.3,c.4,c.5,c.6,c.7,c.8,c.9,c.10,c.11,c.12,c.13,c.14,c.15,c.16,c.17,c.18,c.19,c.20,c.21,c.22,c.23,c.24,c.25,c.26,c.27,c.28,c.29,c.30,c.31,c.32,c.33) 
>>
>> Error: cannot allocate vector of size 145 Kb
>> Execution halted
>>
>> 145--Kb---?? This has me rather lost. Maybe an overflow of some 
>> sort?? Maybe an OS problem of some sort? I'm scratching my head here.
>>
>> Before you question it, there is a legitimate reason for sticking all 
>> these components in the one data.frame.
>>
>> One of the problems here is that tinkering is not really feasible. 
>> This cbind took 1.5 hrs to finally halt.
>>
>> Any help greatly appreciated,
>>
>> James
>>
>
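
Regarding the 32-bit point above, a quick way to confirm which kind of R 
build is running (standard base-R queries, no packages assumed):

  .Machine$sizeof.pointer   # 4 on a 32-bit build, 8 on a 64-bit build
  R.version$arch            # e.g. "i686" (32-bit) vs "x86_64" (64-bit)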
