[R] Reading in 9.6GB .DAT File - OK with 64-bit R?
Jan van der Laan
rhelp at eoos.dds.nl
Fri Mar 9 15:28:08 CET 2012
You could also have a look at the LaF package, which is written to
handle large text files. Among its vignettes you'll find a manual.
Note: LaF will not help you fit 9 GB of data into 4 GB of memory, but
it can help you read your file block by block and filter it as you go.
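A minimal, untested sketch of the block-wise approach with LaF. The file
name, separator, column types/names, and the `wanted_ids` vector are all
assumptions -- adjust them to match your .DAT file and your [560,1]
selection variable:

```r
library(LaF)  # install.packages("LaF") if needed

# Replace with your 560 selection values:
wanted_ids <- c(101, 205, 307)

# Open the file without loading it into memory. Column types and names
# here are placeholders for whatever your .DAT file actually contains.
laf <- laf_open_csv("bigfile.dat",
                    column_types = c("integer", "string", "double"),
                    column_names = c("id", "group", "value"),
                    sep = "\t")

chunks <- list()
repeat {
  block <- next_block(laf, nrows = 100000)  # read 100,000 rows at a time
  if (nrow(block) == 0) break               # end of file reached
  # keep only the rows whose id appears in your selection vector
  chunks[[length(chunks) + 1]] <- block[block$id %in% wanted_ids, ]
}
selected <- do.call(rbind, chunks)
```

Only the filtered rows are ever held in memory, so the result should fit
comfortably even when the source file does not.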
RHelpPlease <rrumple at trghcsolutions.com> schreef:
> Hi Barry,
> "You could do a similar thing in R by opening a text connection to
> your file and reading one line at a time, writing the modified or
> selected lines to a new file."
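Barry's suggestion above can be sketched in base R roughly like this
(untested; the file names, tab separator, id column position, and the
`wanted_ids` vector are assumptions you'd need to adapt):

```r
# Replace with your 560 selection values:
wanted_ids <- c(101, 205, 307)

infile  <- file("bigfile.dat", open = "r")
outfile <- file("subset.dat",  open = "w")

# Read a chunk of lines at a time (much faster than one line at a time),
# keep only the lines whose first field matches, and write them out.
while (length(lines <- readLines(infile, n = 10000)) > 0) {
  ids <- sapply(strsplit(lines, "\t"), `[`, 1)
  writeLines(lines[ids %in% wanted_ids], outfile)
}

close(infile)
close(outfile)
```

The resulting subset.dat can then be read normally with read.table()
and merged as you did with the smaller data sets.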
> Great! I'm aware that this is possible, but I don't know the commands for R.
> I have a variable [560,1] to use to pare down the incoming large data set
> (certainly millions of rows). Other data sets have been small enough
> that I could use the merge function after the data had been read in.
> Obviously I'm having trouble reading in this large data set in the first
> place.
> Any additional help would be great!
> R-help at r-project.org mailing list
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.