[R] Memory Size & Allocation in R

Elizabeth Purdom epurdom at stat.berkeley.edu
Tue Jan 13 20:24:53 CET 2009


Hi Brigid,

You will probably get some more informed answers in a bit, but to give 
you some quick things to try...

There's no fixed size limit on an object of the kind you're describing. When 
you start having problems with small objects or simple operations, it 
sounds like you've used up the memory available to your R session. Try 
running gc() to get memory usage information -- what you're looking at may 
be static (I forget), giving the upper limits rather than what you are 
actually using.
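
For example (the exact numbers will depend on your session, and 
memory.size()/memory.limit() are Windows-only):

gc()                     # cells and Mb currently used, plus the thresholds that trigger collection
memory.size()            # Mb currently in use by this R session
memory.size(max = TRUE)  # maximum Mb obtained from Windows so far
memory.limit()           # the current upper limit in Mb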

You may have a single object that is too large, or the accumulation of 
objects may be too large. You can call object.size() on a suspect object to 
check it, or look at all of the objects in your environment, sorted by size:

sort(sapply(ls(), function(x) object.size(get(x))))

If you find and remove a large object (using rm()), call gc() again to 
release the memory (and to see the improvement).
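
As a rough sketch, using your 'Results' data frame as the example object:

object.size(Results)                        # size of a single object, in bytes
as.numeric(object.size(Results)) / 1024^2   # the same, converted to Mb
rm(Results)                                 # remove it (once you've saved anything you still need)
gc()                                        # release the memory and see the updated usage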

On the other hand, manipulating *very* large objects requires a lot of 
memory, as do some expensive operations, of course. So if you are running 
into problems only when you are handling that object or running a certain 
function, it may simply be that you don't have enough memory on your 
computer for that task.
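
If your machine does have physical memory to spare, you can also ask for a 
higher limit on Windows -- just a sketch, and on 32-bit Windows the address 
space still caps what you can actually get, so it may not help:

memory.limit()              # current limit in Mb
memory.limit(size = 2048)   # request a 2 Gb limit for this session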

I also find that there are memory leaks when I use R in Windows for a 
long time. If I shut down R and start up again, especially after very 
memory-intensive tasks, I often have a good bit more memory when I start 
back up.
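
Before restarting, you could keep the objects you still need by saving them 
with save() rather than write.csv() -- the file name here is just an 
example -- and then load() them in the new session:

save(Results, file = "Results_011309.RData")   # serialize the object to disk
# ...restart R, then...
load("Results_011309.RData")                   # restores 'Results' into the fresh workspace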

Best,
Elizabeth

Brigid Mooney wrote:
> My apologies if this is a bit of a 'newbie' question.
> 
> I am using R v 2.8.0 in Windows and am a bit confused about the memory
> size/allocation.
> 
> A script I wrote faulted out with the error: "Error: cannot allocate vector
> of size 5.6 Mb"
> 
> After this error, I still have:
> 
>> memory.size()
> [1] 669.3517
>> memory.limit()
> [1] 1535.875
> 
> Since memory.size() is well more than 5.6 Mb below memory.limit(), I
> assume there is some limit on object size within R.  Is this correct?  If
> so, is there a way to determine which one of my objects is too large, so
> that I can remove it or update my script?
> 
> Also, is there a way to temporarily increase the limit on vector memory
> allocation above 5.6 Mb?  I'm hoping to still retrieve the data that was
> calculated in the run prior to the error.
> 
> To get at this data, I tried write.csv and got a similar error:
>> write.csv(Results, "TempOutput_011309.csv", row.names=FALSE,
> col.names=TRUE)
> Error: cannot allocate vector of size 5.5 Mb
> 
> For reference, 'Results' is a data frame with about 800K rows and 18 columns
> (1 col contains character strings of class factor, 1 col contains Date/Time
> stamps of class factor, the remaining cols are all numeric).
> 
> Any help here is greatly appreciated - Thanks!
> 



