[R] Row limit for read.table

Frank McCown fmccown at cs.odu.edu
Wed Jan 17 16:39:56 CET 2007

I have been trying to read in a large data set using read.table, but 
I've only been able to grab the first 50,871 rows of the total 122,269 rows.

 > f <- read.table(..., header=TRUE, nrows=123000, comment.char="", sep="\t")
 > length(f$change_rate)
[1] 50871

From searching the email archives, I believe this is due to size limits 
on a data frame.  So...
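
[Editor's note: a silently truncated read is often caused not by a size limit but by a stray quote or comment character in the data, which makes read.table swallow many physical lines into one field. Disabling quote processing is worth trying. A minimal sketch, assuming a tab-separated file at a hypothetical path:]

```r
# Sketch: "data.tsv" is a hypothetical path.  quote = "" disables quote
# processing, so an unmatched ' or " inside a text field cannot absorb
# the rest of the file into a single record.
f <- read.table("data.tsv", header = TRUE, sep = "\t",
                quote = "", comment.char = "")
nrow(f)  # check whether the full row count is now reported
```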

1) Why doesn't read.table give a proper warning when it doesn't place 
every item it reads into the data frame?

2) Why isn't there a parameter to read.table that allows the user to 
specify which columns s/he is interested in?  This would allow 
extraneous columns to be ignored, which would improve memory usage.
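
[Editor's note: read.table does in fact support this through its colClasses argument, where a "NULL" entry skips a column entirely so it is never allocated. A sketch, assuming a hypothetical tab-separated file with four columns of which only the second and fourth are wanted:]

```r
# colClasses = "NULL" drops a column at read time; here only the 2nd and
# 4th columns are kept.  File name and column types are hypothetical.
f <- read.table("data.tsv", header = TRUE, sep = "\t",
                colClasses = c("NULL", "numeric", "NULL", "numeric"))
```

Skipping columns this way also speeds up parsing, since the declared classes spare read.table its type-guessing pass.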

I've already worked around this by loading the table into MySQL and 
doing a select on the two columns I need.  I just wonder why the above 
two points aren't implemented.  Maybe they are and I'm totally missing it.
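
[Editor's note: for completeness, the database workaround described above might look like the following with the DBI and RMySQL packages. The database, table, and column names here are hypothetical, apart from change_rate, which appears in the transcript above.]

```r
library(DBI)
library(RMySQL)  # any DBI backend works similarly

# Connection details are hypothetical placeholders.
con <- dbConnect(MySQL(), dbname = "mydb")

# Select only the two columns of interest; the other columns never
# enter R's memory.
f <- dbGetQuery(con, "SELECT url, change_rate FROM crawl_data")

dbDisconnect(con)
```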


Frank McCown
Old Dominion University
