[R] Memory limit in Aggregate()

Guillaume guillaume_bs at hotmail.com
Tue Aug 2 11:45:58 CEST 2011


Dear all,
I am trying to aggregate a table (split into two lists here, listX and listBy),
but I get a memory error.
Here is the code I am running:

sessionInfo()

print(paste("memory.limit() ", memory.limit()))
print(paste("memory.size() ", memory.size()))
print(paste("memory.size(TRUE) ", memory.size(TRUE)))

print(paste("size listX ", object.size(listX)))
print(paste("size listBy ", object.size(listBy)))
print(paste("length ", length(listX[[1]])))   # number of rows in the original table

tableAgg <- aggregate(x = listX, by = listBy, FUN = "max")
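
To make the question easier to follow, here is a small self-contained
version of the same call with made-up data (the real listX and listBy
have the same shape, but the names and values below are invented):

## toy stand-ins for my real table -- only the shape matters
set.seed(1)
n <- 9083
listX  <- list(value1 = rnorm(n), value2 = rnorm(n))           # columns to aggregate
listBy <- list(keyA = sample(letters, n, replace = TRUE),      # grouping columns
               keyB = sample(1:500, n, replace = TRUE))
## same call as above; with this toy data it runs without problems
tableAgg <- aggregate(x = listX, by = listBy, FUN = "max")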


With my real data, it returns:

R version 2.9.0 Patched (2009-05-09 r48513) 
i386-pc-mingw32 
locale:
LC_COLLATE=French_France.1252;LC_CTYPE=French_France.1252;LC_MONETARY=French_France.1252;LC_NUMERIC=C;LC_TIME=French_France.1252
attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base
other attached packages:
[1] RODBC_1.3-2    HarpTools_1.4  HarpReport_1.9
loaded via a namespace (and not attached):
[1] tools_2.9.0
[1] "memory.limit()  4095"
[1] "memory.size()  31.92"
[1] "memory.size(TRUE)  166.94"
[1] "size listX  218312"
[1] "size listBy  408552"
[1] "length  9083"
Error in vector("list", prod(extent)) : 
  cannot allocate vector of length 1224643220

(the last line is translated from the French error message "impossible
d'allouer un vecteur de longueur 1224643220")
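
My guess (I may be wrong) is that the 1224643220 in prod(extent) is the
product of the numbers of distinct values of my grouping variables in
listBy, i.e. the full cross-product of all level combinations. If that
guess is right, this should reproduce the figure:

## should equal 1224643220 if the guess about prod(extent) is correct
prod(sapply(listBy, function(g) length(unique(g))))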

Why would R create such a long vector (my original lists only contain
9083 rows), and is there a way to avoid this error?
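
The only workaround I can think of (I have not checked that it gives
exactly the same result) is to paste the grouping columns into a single
key before calling aggregate(), so that only the combinations that
actually occur in the data are used:

## untested idea: collapse the by-variables into one key; the separator
## must be a character that never occurs in the key values
key <- do.call(paste, c(listBy, sep = "|"))
tableAgg <- aggregate(x = listX, by = list(key = key), FUN = "max")

The obvious drawback is that the key columns end up pasted together in
the result and would have to be split again afterwards. Is there a
cleaner way?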

Thank you for your help,

Guillaume



