[R] clara - memory limit

Nestor Fernandez nestor.fernandez at ufz.de
Wed Aug 3 18:44:38 CEST 2005


Dear all,

I'm trying to estimate clusters from a very large dataset using clara, but the
program stops with a memory error. The (very simple) code and the resulting error:

library(foreign)  # provides read.dbf()
library(cluster)  # provides clara()
mydata <- read.dbf(file = "fnorsel_4px.dbf")
my.clara.7k <- clara(mydata, k = 7)

Error: cannot allocate vector of size 465108 Kb

The dataset contains >3,000,000 rows and 15 columns. I'm using a Windows
computer with 1.5 GB of RAM; I also tried raising the memory limit to the
maximum possible (4000 MB).
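(For scale, a single working copy of the data as doubles already approaches the
failed ~454 MB allocation; and for completeness, the limit was raised from
within R along these lines:

3e6 * 15 * 8 / 2^20        # ~343 MB: one copy of a 3,000,000 x 15 double matrix
memory.limit(size = 4000)  # Windows-only; size in MB

)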
Is there a way to calculate clara clusters from such large datasets?
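For instance, would a subsample-then-assign workaround along these lines be
defensible? (A rough, untested sketch; the subsample size of 100,000 and the
samples/sampsize values are arbitrary placeholders, not recommendations, and
it assumes all 15 columns are numeric.)

library(cluster)

## cluster a manageable random subsample
sub <- mydata[sample(nrow(mydata), 100000), ]
cl  <- clara(sub, k = 7, samples = 10, sampsize = 1000)

## assign every original row to its nearest medoid
## (squared Euclidean distance); slow but memory-friendly
m <- as.matrix(cl$medoids)
nearest <- function(x) which.min(colSums((t(m) - x)^2))
labels <- apply(as.matrix(mydata), 1, nearest)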

Thanks a lot.

Nestor.-



