[R] How to clear R memory in a for loop

Dimitri Liakhovitski dimitri.liakhovitski at gmail.com
Tue Oct 21 16:47:35 CEST 2014


I will try with .wav files and report back.
So far, I am not sure I understood what (if anything) could be done to fix it...
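
For reference, here is roughly what each iteration does (a minimal
sketch; process() is a hypothetical stand-in for my actual processing,
and I am assuming tuneR::readMP3 for the reading step, readWave for
the .wav attempt):

    library(tuneR)  # assuming the tuneR package for reading audio

    files <- list.files("sounds", pattern = "\\.mp3$", full.names = TRUE)

    for (f in files) {
        w   <- readMP3(f)   # read one file into a Wave object
        out <- process(w)   # hypothetical stand-in for the processing
        rm(w, out)          # drop the large objects explicitly
        gc()                # then collect
    }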

On Tue, Oct 21, 2014 at 2:26 AM, Prof Brian Ripley
<ripley at stats.ox.ac.uk> wrote:
> On 20/10/2014 17:53, John McKown wrote:
>>
>> On Mon, Oct 20, 2014 at 10:30 AM, Dimitri Liakhovitski <
>> dimitri.liakhovitski at gmail.com> wrote:
>>
>>> Dear Rers,
>>>
>>> I am trying to run a for-loop in R.
>>> During each iteration I read in an mp3 file and do some basic processing.
>>> If I do what I need to do for each file one by one - it works fine.
>>> But once I start running the loop, it soon runs out of memory with the
>>> error: cannot allocate vector of size...
>>> In each iteration of my loop I always overwrite the previously created
>>> object and do gc().
>>>
>>> Any hints on how to fight this?
>>>
>>> Thanks a lot!
>>>
>>>
>>>
>> Please don't use HTML for messages.
>>
>> What occurs to me, from reading the other replies, is that perhaps within
>> the loop you are causing other objects to be allocated. That can happen
>> with a simple assignment, so it may not be obvious. What this can do is
>> cause what we called a "sand bar" in the old days. That's where you
>> allocate a big chunk of memory for an object. Say this takes up 1/2 of
>> your available space. You now create a small object. This object is
>> _probably_ right next to the large object. You now release the large
>> object. Your apparent free space is now almost what it was at the
>> beginning. But when you try to allocate another large object which is,
>> say, 2/3 of the maximum space, you can't, because that small object is
>> sitting right in the middle of your memory space. So you _can_ allocate
>> two large objects which are each 1/3 of your free space, but not one
>> object which is 2/3 of it. This can lead to your type of situation.
>>
>> This is just a SWAG based on some experience with other systems. Most
>> garbage collectors do _not_ do memory consolidation (compaction). I don't
>> know about R.
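>>
>> A toy illustration of the idea (purely schematic; whether the last
>> allocation actually fails depends on the allocator and the OS):
>>
>>     big   <- numeric(5e7)   # large block, say half the free space
>>     small <- numeric(10)    # small object, possibly placed next to it
>>     rm(big); gc()           # free the big block; 'small' stays put
>>     big2  <- numeric(8e7)   # a larger request than 'big' may fail if
>>                             # the free space is split around 'small'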
>>
>>
> That is true of R (except in the early days, when R did have a moving
> garbage collector).
>
> However 'your available space' is not the amount of RAM you have but the
> process address space.  The latter is enormous on any 64-bit OS, so 'memory
> fragmentation' (as this is termed) is a thing of the past except for those
> limited to many-years-old OSes.
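>
> A quick check that you are on a 64-bit build:
>
>     .Machine$sizeof.pointer  # 8 on a 64-bit build of R, 4 on 32-bit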
>
>
> --
> Brian D. Ripley,                  ripley at stats.ox.ac.uk
> Emeritus Professor of Applied Statistics, University of Oxford
> 1 South Parks Road, Oxford OX1 3TG, UK
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.



-- 
Dimitri Liakhovitski


