[R] How to more efficiently read in a big matrix

Prof Brian Ripley ripley at stats.ox.ac.uk
Sat Nov 10 08:42:35 CET 2007


Did you read the Note on the help page for read.table, or the 'R Data 
Import/Export Manual'?  There are several hints there, some of which will 
be crucial to doing this reasonably fast.
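The key hints boil down to telling read.table things it would otherwise have to guess. A minimal sketch (the tiny demo file here stands in for the real 238,305 x 487 table; the file name and column types are assumptions, since the poster did not say what the values are):

```r
## Write a small demo matrix to a temporary file; col.names = NA gives
## the usual "blank corner" header layout for row-named tables.
tf <- tempfile()
m <- matrix(rnorm(12), nrow = 3,
            dimnames = list(paste0("r", 1:3), paste0("c", 1:4)))
write.table(m, tf, col.names = NA)

## Declaring column classes, the row count, and comment.char = ""
## lets read.table skip type-guessing, comment scanning, and repeated
## re-allocation of storage as it reads:
x <- read.table(tf, header = TRUE, row.names = 1,
                colClasses = c("character", rep("numeric", 4)),
                nrows = 3, comment.char = "")
stopifnot(identical(dim(x), c(3L, 4L)))
```

For a purely numeric matrix with no mixed types, scan() (followed by matrix()) avoids data-frame overhead entirely and is usually faster still.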

How big is your computer?  That is 116 million items (you haven't told us 
what type they are), so you will need GBs of RAM, and preferably a 64-bit 
OS.  Otherwise you would be better off using a DBMS to store the data (see 
the Manual mentioned in my first para).
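The arithmetic behind that estimate (assuming double-precision values, which the poster did not specify):

```r
## 238,305 rows x 487 columns of doubles at 8 bytes each.  This is the
## final object alone; read.table makes transient copies while parsing,
## so the peak memory use is several times higher.
cells <- 238305 * 487          # 116,054,535 items
gb <- cells * 8 / 2^30
round(gb, 2)                   # about 0.86 GB for the stored matrix
```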

On Fri, 9 Nov 2007, affy snp wrote:

> Dear list,
>
> I need to read in a big table with 487 columns and 238,305 rows (row names
> and column names are supplied). Is there a way to read in the table
> quickly? I tried read.table() but it seems to take forever :(
>
> Thanks a lot!
>
> Best,
>    Allen

-- 
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595


