[R] Problems working with large data

jim holtman jholtman at gmail.com
Thu Nov 15 22:41:53 CET 2007


A little more information might be useful.  If your matrix is numeric,
then a single copy will require about 250MB of memory.  What type of
system are you on and how much memory do you have?  When you say you
are having problems, what are they?  Is it a problem reading the data
in?  Are you getting allocation errors?  Is your system paging?  If
you have 2GB of memory, you should be fine, depending on how many
copies of the data R needs in memory at once.
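
As a rough sanity check (a sketch only; the arithmetic just follows from
the dimensions in your message), you can estimate the size of one copy
directly in R:

    ## one numeric copy: rows * cols * 8 bytes per double, in MB
    581012 * 55 * 8 / 1024^2    # roughly 244 MB

Functions such as lda and princomp typically make several working copies,
so peak memory use can easily be two to three times that.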

On Nov 15, 2007 10:53 AM,  <pedrosmarques at portugalmail.pt> wrote:
>
> Hi,
>
> I'm working with a numeric matrix with 55 columns and 581012 rows, and I'm having problems with memory allocation when using some of the functions in R: for example lda, rda (library MASS), princomp (package mva) and mvnorm.etest (energy package). I've read the tips on using less memory in help(read.table) and managed to use some of these functions, but I haven't been able to work with mvnorm.etest.
>
> I would like to know the best way to solve this problem, as well as how to do it faster.
>
> Best regards,
>
> Pedro Marques
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>
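
For reference, the memory-saving read.table tips mentioned above usually
come down to telling read.table the column types and row count up front,
so it does not have to guess types and re-grow its buffers. A sketch,
where the file name and the all-numeric column layout are only
placeholders:

    ## 'mydata.csv' and the 55 all-numeric columns are assumptions for illustration
    dat <- read.table("mydata.csv", header = TRUE, sep = ",",
                      colClasses = rep("numeric", 55),
                      nrows = 581012, comment.char = "")

colClasses, nrows and comment.char are the arguments the read.table help
page itself points at for reducing memory use and read time.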



-- 
Jim Holtman
Cincinnati, OH
+1 513 646 9390

What is the problem you are trying to solve?


