[R] Bug with memory allocation when loading Rdata files iteratively?

Uwe Ligges ligges at statistik.tu-dortmund.de
Sat Feb 11 19:27:55 CET 2012



On 10.02.2012 01:56, Janko Thyson wrote:
> Dear list,
>
> when iterating over a set of Rdata files that are loaded, analyzed, and
> then removed from memory again, I see a *significant* increase in the
> R process's memory consumption (which eventually kills the process).
>
> It seems that removing the object via rm() and calling gc() have no
> effect: the memory consumed by each loaded R object accumulates until
> there is no memory left :-/
>
> This may also be related to functionality from the XML package (mainly
> htmlTreeParse and getNodeSet), but I also see the described behavior
> when simply loading and removing Rdata files in a loop.
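
(As a sketch, the loop being described might look like the following. The
file and object names are hypothetical, and the XML calls assume each
Rdata file holds raw HTML text:)

    library(XML)

    for (f in rdata.files) {      # rdata.files: placeholder vector of file paths
        load(f)                   # assume this creates a character object 'html'
        doc   <- htmlTreeParse(html, asText = TRUE, useInternalNodes = TRUE)
        links <- getNodeSet(doc, "//a")
        ## ... analyze 'links' here ...
        free(doc)                 # explicitly release the C-level document
        rm(html, doc, links)      # drop all R-level references
        gc()                      # request a garbage collection
    }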


Please provide a reproducible example. If you can only reproduce it with 
XML involved, report it to that package's maintainer. If you can 
reproduce it without XML, report it to R-devel.
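
For instance, a minimal XML-free test of the load/rm()/gc() cycle might 
look like this (a sketch; file names and object sizes are illustrative):

    ## Create a few throwaway Rdata files, then load and discard them
    ## in a loop while watching the memory statistics from gc().
    files <- file.path(tempdir(), sprintf("chunk_%02d.Rdata", 1:10))
    for (f in files) {
        x <- matrix(rnorm(1e6), ncol = 100)   # roughly 8 MB of doubles
        save(x, file = f)
        rm(x)
    }

    for (f in files) {
        load(f)        # recreates 'x' in the global environment
        rm(x)          # drop the reference again
        print(gc())    # "used" figures should return to baseline each pass
    }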

But please first try with current versions of both XML and R (you do not 
state either version in your message).

Uwe Ligges


> I've put together a little example that illustrates the memory
> ballooning described above; you can find it here:
> http://stackoverflow.com/questions/9220849/significant-memory-issue-in-r-when-iteratively-loading-rdata-files-killing-the
>
> Is this a bug? Any chance of working around this?
>
> Thanks a lot and best regards,
> Janko


