[R] Memory issue

Alex van der Spek doorz at xs4all.nl
Wed May 5 11:47:32 CEST 2010


I am reading a 138 Mbyte flat text file into R with a combination of 
scan() (to get the header) and read.table(). After converting the text 
time stamps to POSIXct and the integer codes to factors, I combine 
everything into one data frame and release the old structures 
containing the data with rm(), roughly as sketched below.
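
In outline the code looks like this (the file name, column names and 
time stamp format here are placeholders, not the real ones):

    hdr <- scan("logger.txt", what = character(), nlines = 1, quiet = TRUE)
    raw <- read.table("logger.txt", skip = 1, col.names = hdr,
                      stringsAsFactors = FALSE)

    ## convert text time stamps and integer codes, keep one data frame
    dat <- data.frame(time  = as.POSIXct(raw$stamp, format = "%Y-%m-%d %H:%M:%S"),
                      code  = factor(raw$code),
                      value = raw$value)

    rm(raw, hdr)   # release the intermediate structures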

Strangely, rm() does not appear to reduce the memory in use; I checked 
using memory.size(). Worse still, the amount of memory required keeps 
growing. When I save an image, the .RData file is only 23 Mbyte, yet at 
some point in the program, after having done nothing particularly 
difficult (two- and three-way frequency tables and some lattice 
graphs), the amount of memory in use is over 1 Gbyte.
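
For what it is worth, this is how I look at the memory use 
(memory.size() is Windows-only; the gc() call is only there to force 
an explicit collection):

    memory.size()            # Mbytes currently in use by R
    memory.size(max = TRUE)  # maximum obtained from the OS this session
    gc()                     # explicit garbage collection, prints cell usage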

This is not yet a problem, but it will become one. This is R 2.10.0 
on Windows Vista.

Does anybody know how to release memory, as rm(dat) does not appear to 
do this properly?
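
What I am after is something along these lines (I am not sure an 
explicit gc() actually hands the memory back to Windows):

    rm(dat)
    gc()   # collect; ideally this would return the memory to the OS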

Regards,
Alex van der Spek


