[R] Memory getting eaten up with XML

Andrew Gormley GormleyA at landcareresearch.co.nz
Wed Dec 7 09:08:18 CET 2011

Today I tried the code on a MacBook and experienced the same problem, which
makes me think there is something wrong with the way I am trying to free
the memory...?
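One suggestion that comes up for this pattern: the XML package keeps the parsed document in C-level memory managed by libxml2, which `rm()` and `gc()` on their own cannot reclaim; the package exports `free()` to release that memory explicitly. A sketch of how the example might look with it, hedged since behaviour can depend on the XML package version (the file path is just the one from the original example):

```r
library(XML)

file.name <- "C:\\MyData.xml.gz"
TEMPP <- xmlParse(file.name)
xx <- xmlRoot(TEMPP)

## ... copy the relevant data out of xx here ...

rm(xx)       # drop R-level references to internal nodes first
free(TEMPP)  # release the C-level (libxml2) document memory
rm(TEMPP)
gc()
```

Note that `free()` should only be called once all references to internal nodes of the document (such as `xx` above) have been removed, since those nodes become invalid once the document is freed.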

Andrew Gormley wrote
> Hi all. I have an issue that I cannot resolve. I am trying to read in lots
> of data stored in XML files, but after I read them in, copy the relevant
> data, and remove the document, the memory is not freed. When I monitor it
> in the Windows Task Manager, the memory usage just climbs with each
> iteration until R crashes. I can replicate the problem with this small
> example:
>         file.name <- "C:\\MyData.xml.gz"
>         TEMPP <- xmlParse(file.name)
>         xx <- xmlRoot(TEMPP)
>         rm(xx)
>         rm(TEMPP)
>         gc()
> Even though I remove the root node xx and the document TEMPP, the memory
> usage remains the same as it was when I first read the file in. Any
> ideas/solutions?
> I am using a 32-bit version of R 2.14.0 on Windows XP, and the latest
> version of the XML package (3.6.1).
> Many thanks
> Andrew (apologies for the large footer my work appends to all my
> emails...)
> ____________________________________________
> R-help@ mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
> http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.

