[R] How to more efficiently read in a big matrix

jim holtman jholtman at gmail.com
Sat Nov 10 05:46:00 CET 2007


If the values are all numeric, you can use 'scan' to read them in; a
rough sketch is below.  With that amount of data, you will need almost
1GB just to hold the single object (487 x 238,305 doubles at 8 bytes
each).  If you want to do any processing, you will probably need a
machine with at least 3-4GB of physical memory, preferably running a
64-bit version of R.  What type of computer are you using?  Do you
really need all the data in memory at once, or can you process it in
smaller batches (e.g., 20,000 rows at a time; see the second sketch
below)?  A little more detail on what you actually want to do with the
data would be useful, since it does create a very large object.  BTW,
how large is the file you are reading, and what is its format?  Have
you considered a database for this amount of data?
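
Here is a rough sketch of the 'scan' approach.  The file name
("big.txt") and layout (whitespace-delimited, one header line with the
487 column names, and the row name as the first field on each data
line) are guesses about your file, so adjust to match:

  ## read the header line of column names
  cn <- scan("big.txt", what = "", nlines = 1)
  ## read each record as a character row name followed by 487 numbers
  rec <- scan("big.txt", skip = 1,
              what = c(list(""), rep(list(numeric(0)), 487)))
  x <- do.call(cbind, rec[-1])        # 238,305 x 487 numeric matrix
  dimnames(x) <- list(rec[[1]], cn)

And if you can process the data in batches instead of all at once,
something along these lines (same assumptions about the layout;
specifying colClasses and nrows is also what speeds up read.table):

  con <- file("big.txt", open = "r")
  cn  <- scan(con, what = "", nlines = 1)     # consume the header line
  rows.left <- 238305
  while (rows.left > 0) {
      n <- min(20000, rows.left)
      chunk <- read.table(con, nrows = n, row.names = 1,
                          colClasses = c("character", rep("numeric", 487)))
      ## ... process 'chunk' here ...
      rows.left <- rows.left - n
  }
  close(con)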

On Nov 9, 2007 11:39 PM, affy snp <affysnp at gmail.com> wrote:
> Dear list,
>
> I need to read in a big table with 487 columns and 238,305 rows (row names
> and column names are supplied). Is there a way to read the table in quickly?
> I tried read.table(), but it seems to take forever :(
>
> Thanks a lot!
>
> Best,
>    Allen
>



-- 
Jim Holtman
Cincinnati, OH
+1 513 646 9390

What is the problem you are trying to solve?


