[R] How to solve the error "cannot allocate vector of size 1.1 Gb"

Kum-Hoe Hwang phdhwang at gmail.com
Thu Jan 15 09:11:36 CET 2009


Hi, Gurus

Thanks to your help, I have managed to start using the text mining
package "tm" in R under Windows XP.

However, while running the tm package, I ran into another memory problem.

What is the best way to solve this memory problem: increasing the
physical RAM, or some other workaround?
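
So far I have only been poking at the Windows memory settings. This is
roughly what I have in mind (an untested sketch; I am assuming 32-bit R
on XP, where the usable limit stays well below 4 GB, and the size value
below is just a guess):

memory.limit()                # current limit in MB
memory.limit(size = 3000)     # ask for a larger limit (bounded by the 32-bit address space)
memory.size(max = TRUE)       # peak memory used by this session so far
gc()                          # trigger garbage collection and report memory in use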

###############################
###### my R Script's Outputs ######
###############################

> memory.limit(size = 2000)
NULL
> corpus.ko <- Corpus(DirSource("test_konews/"),
+  readerControl = list(reader = readPlain,
+  language = "UTF-8", load = FALSE))
> corpus.ko.nowhite <- tmMap(corpus.ko, stripWhitespace)
> corpus <- tmMap(corpus.ko.nowhite, tmTolower)
> tdm <- TermDocMatrix(corpus)
>  findAssocs(tdm, "city", 0.97)
Error: cannot allocate vector of size 1.1 Gb
-------------------------------------------------------------
>
################################
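
One idea I had, but have not yet been able to test, is to shrink the
term-document matrix before calling findAssocs(), since the correlation
step seems to be where the big allocation happens. Something like the
following, assuming removeSparseTerms() accepts the TermDocMatrix object
in this version of tm (the sparse = 0.99 cutoff is only a placeholder):

tdm.small <- removeSparseTerms(tdm, sparse = 0.99)  # drop terms absent from more than 99% of documents
dim(tdm.small)                                      # check how much smaller the matrix is now
findAssocs(tdm.small, "city", 0.97)                 # retry the association search on the reduced matrix

Would that be a reasonable direction, or is more RAM the only real fix?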
Thanks for your precious time,

--
Kum-Hoe Hwang, Ph.D.

Phone : 82-31-250-3516
Email : phdhwang at gmail.com



