[R] Problems with R memory usage on Linux

B. Bogart ben at ekran.org
Wed Oct 15 19:48:24 CEST 2008


Hello all,

I'm working with a large data set, and upgraded my RAM to 4GB to help
with the memory use.

I've got a 32-bit kernel with 64GB memory support (PAE) compiled in.

gnome-system-monitor and free both show the full 4GB as being available.

In R I was doing some processing and got the following message when
combining 100 data frames, each 307200 rows by 8 columns, into a single
data frame for plotting:

Error: cannot allocate vector of size 2.3 Mb
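
Roughly, the operation looks like this (object names are made up, and
the sketch itself builds close to 2GB of data, so it is memory-hungry
by construction):

dfs <- lapply(1:100, function(i)
    as.data.frame(matrix(rnorm(307200 * 8), nrow = 307200)))
big <- do.call(rbind, dfs)   # peak usage roughly doubles here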

So I checked the R memory usage:

$ ps -C R -o size
   SZ
3102548
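
For completeness, the usage can also be inspected from inside R itself;
gc() reports the totals, and object.size() shows which objects dominate:

gc()   # the "used"/"Mb" columns give current allocations
sort(sapply(ls(), function(x) object.size(get(x))), decreasing = TRUE)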

I tried removing some objects and running gc(); R then shows much less
memory being used:

$ ps -C R -o size
   SZ
2732124

That should give me an extra ~360MB of headroom in R (the SZ figures
are in KB).
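
What I ran was essentially the following (the object name is made up):

rm(dfs)   # drop the intermediate list of data frames
gc()      # run the garbage collector so the space is actually reclaimed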

I still get the same error about R being unable to allocate another 2.3MB.

I deleted well over 2.3MB of objects...
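
For what it's worth, 2.3Mb is exactly one contiguous vector of 307200
doubles, i.e. a single column of the data frames above:

307200 * 8 / 2^20      # = 2.34, the "2.3 Mb" in the error message
x <- numeric(307200)   # the failing allocation, in isolation

so perhaps the problem is finding a contiguous free block rather than
the total amount of free memory.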

Any suggestions as to how to get around this?

Is running a 64-bit kernel the only way to use all 4GB in R?

Thanks all,
B. Bogart


