[R] What is the largest in memory data object you've worked with in R?

Joris Meys jorismeys at gmail.com
Sat Jun 5 23:04:46 CEST 2010


You have to take some things into account:
- the maximum memory set for R might not be the maximum memory available
- R needs memory for more than just the dataset: matrix manipulations
frequently require double the amount of memory taken by the dataset
- memory allocation is important when dealing with large datasets;
there is plenty of information about that (see the sketch below)
- R has some packages to get around memory problems with big datasets
(a small example follows the links below)
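
To illustrate the first three points, here is a minimal sketch (the
object names are just for illustration) that checks how much memory an
object takes and pre-allocates a result vector instead of growing it:

  ## inspect the memory footprint of an object and of the R session
  x <- matrix(rnorm(1e6), ncol = 100)    # ~8 MB of doubles
  print(object.size(x), units = "Mb")    # size of this one object
  gc()                                   # memory currently used by R

  ## growing an object in a loop forces repeated copies;
  ## allocating the full length up front avoids that overhead
  n <- 1e5
  res <- numeric(n)
  for (i in seq_len(n)) res[i] <- i^2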

Read this discussion, for example:
http://tolstoy.newcastle.edu.au/R/help/05/05/4507.html

and this page by Matthew Keller is a good summary too:
http://www.matthewckeller.com/html/memory.html
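
For the last point in the list, a small example of what working beyond
RAM can look like. This is just a sketch assuming the bigmemory package
(ff is another option); the file names are purely illustrative:

  library(bigmemory)

  ## a file-backed matrix lives on disk, so it is not limited by RAM
  bm <- filebacked.big.matrix(nrow = 1e6, ncol = 100, type = "double",
                              backingfile = "big.bin",
                              descriptorfile = "big.desc")
  bm[1, 1] <- 3.14           # elements are read/written on demand
  colmean <- mean(bm[, 1])   # only column 1 is pulled into memory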

Cheers
Joris

On Sat, Jun 5, 2010 at 12:32 AM, Nathan Stephens <nwstephens at gmail.com> wrote:
> For me, I've found that I can easily work with 1 GB datasets.  This includes
> linear models and aggregations.  Working with 5 GB becomes cumbersome.
> Anything over that, and R croaks.  I'm using a dual quad core Dell with 48
> GB of RAM.
>
> I'm wondering if there is anyone out there running jobs in the 100 GB
> range.  If so, what does your hardware look like?
>
> --Nathan
>
>



-- 
Ghent University
Faculty of Bioscience Engineering
Department of Applied mathematics, biometrics and process control

tel : +32 9 264 59 87
Joris.Meys at Ugent.be
-------------------------------
Disclaimer : http://helpdesk.ugent.be/e-maildisclaimer.php


