[R] Memory issues on a 64-bit Debian system (quantreg)

Dirk Eddelbuettel edd at debian.org
Wed Jun 24 23:43:37 CEST 2009


On 24 June 2009 at 14:07, Jonathan Greenberg wrote:
|     I installed R 2.9.0 from the Debian package manager on our amd64 
| system that currently has 6GB of RAM -- my first question is whether 
| this installation is a true 64-bit installation (should R have access to 
|  > 4GB of RAM?)  I suspect so, because I was running an rqss() (package 
| quantreg, installed via install.packages() -- I noticed it required a 
| compilation of the source) and watched the memory usage spike to 4.9GB 
| (my input data contains > 500,000 samples).

As you suspect, that's proof enough :) 

With a 32-bit OS, even when the system has that much RAM, you would never get
to allocate that much to a single process. I can't recall the hard limit, but
the effective limit I have seen with R on 32-bit systems with 8 GB was around
3 GB.  So you are on 64-bit, and you are squeezing the existing hardware as
well as you can.
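For what it's worth, you can also confirm the 64-bit build from within R
itself, using nothing beyond base R:

  .Machine$sizeof.pointer   # 8 on a 64-bit build, 4 on a 32-bit one
  R.version$arch            # e.g. "x86_64" on an amd64 system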

|     With this said, after 30 mins or so of processing, I got the 
| following error:
| 
| tahoe_rq <- 
| rqss(ltbmu_4_stemsha_30m_exp.img~qss(ltbmu_eto_annual_mm.img),tau=.99,data=boundary_data)
| Error: cannot allocate vector of size 1.5 Gb

R needs an additional 1.5 GiB here, and it generally needs it as one
contiguous block of memory.
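If you want to see where the memory is going before the call fails, something
like the following can help (a rough sketch, reusing the object name from
your post):

  print(object.size(boundary_data), units = "Gb")  # in-memory size of the input data
  gc()                                             # run the garbage collector, report current usage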

|     The dataset is a bit big (300mb or so), so I'm not providing it 
| unless necessary to solve this memory problem.
| 
|     Thoughts?  Do I need to compile either the main R "by hand" or the 
| quantreg package?

No, you are already as far as you can get for free. Rebuilding R or quantreg
by hand would not change anything.

Now you either need to buy more RAM (the system is likely capable of 16 GB
if not more) or parcel your data into smaller chunks, i.e. re-work your
analysis.
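As a rough sketch of the "smaller chunks" idea, you could start by fitting on
a random subset of the rows (formula and object names are taken from your
post; the subset size is just an illustration):

  library(quantreg)
  ## arbitrary, illustrative subset of the rows
  idx <- sample(nrow(boundary_data), 50000)
  tahoe_rq_sub <- rqss(ltbmu_4_stemsha_30m_exp.img ~ qss(ltbmu_eto_annual_mm.img),
                       tau = 0.99, data = boundary_data[idx, ])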

Dirk

-- 
Three out of two people have difficulties with fractions.
