R-beta: read.table and large datasets

Rick White rick at stat.ubc.ca
Mon Mar 9 19:50:14 CET 1998


I find that read.table cannot handle large datasets. Suppose "data" is a
40000 x 6 dataset and R is started with

R -v 100

x <- read.table("data")   gives
Error: memory exhausted
but
x <- as.data.frame(matrix(scan("data"), byrow = TRUE, ncol = 6))
works fine.

read.table is less typing, lets me include the variable names in the first
line, and executes faster in S-PLUS. Is there a fix for read.table on the
way?
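
For comparison, here is a minimal sketch of a workaround that keeps the
header convenience while still using scan(); it assumes the file "data"
begins with a single line holding the six variable names, separated by
whitespace, and it uses the what=, skip= and nlines= arguments of scan():

nms <- scan("data", what = "", nlines = 1)   # read the header line as strings
x <- as.data.frame(matrix(scan("data", skip = 1),
                          byrow = TRUE, ncol = 6))
names(x) <- nms                              # attach the variable names

Because scan() reads the numbers directly into one numeric vector, this
avoids the intermediate character columns that read.table builds up before
converting types, which seems to be where the memory goes.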

