[R] Regarding the memory allocation problem

Milan Bouchet-Valat nalimilan at club.fr
Mon Oct 29 09:43:07 CET 2012


Le lundi 29 octobre 2012 à 12:01 +0530, Purna chander a écrit :
> Dear Milan,
> 
> Thank you for telling about gc().
> 
> I'm using R 2.15.1. It's session info is displayed below:
> 
> R version 2.15.1 (2012-06-22) -- "Roasted Marshmallows"
> Copyright (C) 2012 The R Foundation for Statistical Computing
> ISBN 3-900051-07-0
> Platform: i686-redhat-linux-gnu (32-bit)
> 
> R is free software and comes with ABSOLUTELY NO WARRANTY.
> You are welcome to redistribute it under certain conditions.
> Type 'license()' or 'licence()' for distribution details.
> 
>   Natural language support but running in an English locale
> 
> R is a collaborative project with many contributors.
> Type 'contributors()' for more information and
> 'citation()' on how to cite R or R packages in publications.
> 
> Type 'demo()' for some demos, 'help()' for on-line help, or
> 'help.start()' for an HTML browser interface to help.
> Type 'q()' to quit R.
> 
> 
> ** I'm working with 3GB of RAM. How can I best use the available
> memory to achieve my goal?
Well, I gave you one idea, but for us to tell you more, you need to show
us the code you are using. A better solution would be to spend some
money on more RAM, given that it's quite cheap these days, and to
install 64-bit versions of Linux and R, if your computer supports them.
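The chunk-and-gc() approach discussed in this thread can be sketched as
follows. This is a minimal, illustrative example: a tiny temporary file
stands in for "seq_vec" (the real one has 100000 rows and 256 columns),
a small random matrix stands in for the 900 reference vectors, and the
vectorised distance formula is just one way to compute all pairwise
Euclidean distances without building one large combined matrix for dist().

```r
# Self-contained sketch: generate a small stand-in file first so the
# example runs anywhere (the real "seq_vec" is 100000 x 256).
set.seed(1)
ncols      <- 4    # 256 in the real data
nrows      <- 10   # 100000 in the real data
chunk_size <- 3    # 20000 in the real data
tmp <- tempfile()
write(rnorm(nrows * ncols), file = tmp, ncolumns = ncols)

# Stand-in for the set of 900 reference vectors
ref <- matrix(rnorm(5 * ncols), ncol = ncols)

con <- file(tmp, open = "r")
repeat {
  # Read at most one chunk's worth of values; the connection keeps
  # its position between calls, so each iteration gets the next chunk.
  vals <- scan(con, what = numeric(), nmax = chunk_size * ncols,
               quiet = TRUE)
  if (length(vals) == 0) break
  m <- matrix(vals, ncol = ncols, byrow = TRUE)

  # Euclidean distances from each chunk row to each reference vector,
  # via ||x - y||^2 = ||x||^2 + ||y||^2 - 2 * x.y
  d2 <- outer(rowSums(m^2), rowSums(ref^2), "+") - 2 * tcrossprod(m, ref)
  d  <- sqrt(pmax(d2, 0))  # pmax guards against tiny negative rounding

  # ... write out or summarise 'd' here, then drop the large objects ...
  rm(vals, m, d2, d)
  gc()  # hint to the garbage collector before reading the next chunk
}
close(con)
unlink(tmp)
```

Calling rm() on the large intermediates before gc() matters: gc() can
only release memory that is no longer reachable, so merely reassigning
'm' on the next iteration keeps both copies alive for a moment, which is
exactly the fragmentation problem described above.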


Regards


> On 10/26/12, Milan Bouchet-Valat <nalimilan at club.fr> wrote:
> > Le jeudi 25 octobre 2012 à 15:02 +0530, Purna chander a écrit :
> >> Dear All,
> >>
> >>
> >> My main objective was to compute the distances between 100000
> >> vectors and a set of 900 other vectors. I have a file named
> >> "seq_vec" containing 100000 records and 256 columns.
> >> While computing, memory was insufficient, resulting in the error
> >> "cannot allocate vector of size 152.1Mb".
> >>
> >> So I approached the problem as follows: rather than reading the
> >> data all at once, I read it in chunks of 20000 records using the
> >> scan() function. After reading each chunk, I computed the distance
> >> of each of these vectors to the set of other vectors.
> >>
> >> Even though I successfully computed the distances for the first 3
> >> chunks, I got a similar error (cannot allocate vector of size
> >> 102.3Mb).
> >>
> >> Q) What I cannot understand here is: how does memory become
> >> insufficient only when dealing with the 4th chunk?
> >> Q) Suppose I compute a matrix 'm' during the calculation for
> >> chunk 1; is this matrix not replaced when I compute 'm' again for
> >> chunk 2?
> > R's memory management is relatively complex: objects are not always
> > immediately removed from memory, they are only garbage collected
> > from time to time. You may try calling gc() after each chunk to
> > limit memory fragmentation, which can help reduce allocation
> > problems a little.
> >
> > But please tell us how much RAM you have on the machine you're
> > using, and post the output of sessionInfo().
> >
> >
> > Regards
> >



