[R] out of memory?

Alberto Murta amurta at ipimar.pt
Fri Feb 2 15:35:54 CET 2001


Dear all,

I have a data.frame with 915 rows and 7 columns, and I want to aggregate
one of the variables using 'aggregate.data.frame':

my.new.data <- aggregate.data.frame(my.data$VAR1,
    by = list(my.data$VAR2, ..., my.data$VAR7), FUN = max)

However, I get the message: "Error: vector memory exhausted (limit
reached?)".
Since I didn't set a limit, I assume that's the machine's memory (516 MB
+ 130 MB of swap). Is this normal for a data.frame of those dimensions?
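One possible workaround to try (only a sketch; the column names VAR2
through VAR7 are assumed from the call above, including the ones elided
by "...") is to collapse the grouping columns into a single factor that
keeps only the combinations actually present in the data, so the
aggregation does not have to build the full cross product of all factor
levels:

```r
## Sketch only: column names VAR2..VAR7 are assumed, as in the call above.
## interaction(..., drop = TRUE) keeps only combinations that actually
## occur in the data, rather than the full grid of all level combinations.
grp <- interaction(my.data[c("VAR2", "VAR3", "VAR4",
                             "VAR5", "VAR6", "VAR7")], drop = TRUE)
my.new.data <- aggregate(my.data$VAR1, by = list(grp), FUN = max)
```

This trades the separate grouping columns in the result for a single
combined key, which can be split back apart afterwards if needed.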

platform i686-pc-linux-gnu
arch     i686
os       linux-gnu
system   i686, linux-gnu
status
major    1
minor    2.0
year     2000
month    12
day      15
language R


 
                        Alberto G. Murta                      
        IPIMAR - Institute of Fisheries and Sea Research
          Avenida de Brasilia, 1449-006 Lisboa, Portugal       
Tel: +351 213027062; Fax: +351 213015849; http://www.ipimar.pt
-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._