[R] Reading in 9.6GB .DAT File - OK with 64-bit R?

Jeff Newmiller jdnewmil at dcn.davis.ca.us
Fri Mar 9 00:06:45 CET 2012

My opinion is that you should be spending your effort on setting up a SQL engine and importing it there. If you have 32GB of RAM your current direction might work, but working with sampled data rather than population data seems pretty typical for statistical analysis.
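A minimal sketch of that approach, assuming the DBI and RSQLite packages are installed (the field widths, column names, and the demo file below are placeholders standing in for the real 9.6GB .DAT file):

```r
library(DBI)
library(RSQLite)

## Tiny demo fixed-width file standing in for the real 9.6GB .DAT file.
dat <- tempfile(fileext = ".dat")
writeLines(c("0001  1.50",
             "0002  2.75"), dat)

widths <- c(4, 6)               # hypothetical field widths
cols   <- c("id", "value")      # hypothetical column names

db  <- tempfile(fileext = ".sqlite")
con <- dbConnect(SQLite(), db)

chunk_rows <- 1                 # use something like 100000 for real data
skip <- 0
repeat {
  chunk <- tryCatch(
    read.fwf(dat, widths = widths, col.names = cols,
             n = chunk_rows, skip = skip),
    error = function(e) NULL)   # read.fwf errors once past end of file
  if (is.null(chunk) || nrow(chunk) == 0) break
  dbWriteTable(con, "big", chunk, append = TRUE)
  skip <- skip + nrow(chunk)
}
## Row deletion/filtering can then happen in SQL, e.g.:
## dbGetQuery(con, "SELECT * FROM big WHERE value < 50")
dbDisconnect(con)
```

Because each chunk is written to the database and discarded, peak memory use is bounded by the chunk size rather than the file size.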
Jeff Newmiller                        The     .....       .....  Go Live...
DCN:<jdnewmil at dcn.davis.ca.us>        Basics: ##.#.       ##.#.  Live Go...
                                      Live:   OO#.. Dead: OO#..  Playing
Research Engineer (Solar/Batteries            O.O#.       #.O#.  with
/Software/Embedded Controllers)               .OO#.       .OO#.  rocks...1k
Sent from my phone. Please excuse my brevity.

RHelpPlease <rrumple at trghcsolutions.com> wrote:

>Hi there,
>I wish to read a 9.6GB .DAT file into R (64-bit R on 64-bit Windows), then
>delete a substantial number of rows, and then convert the result to a .csv.
>Upon the first attempt the computer crashed (at some point last night).
>I'm rerunning this now and am closely monitoring processor/CPU/memory
>usage.  Setting aside the possibility that the crash was purely a computer
>issue, is R able to handle this much data?  I read on the FAQ page that
>64-bit R can handle larger data sets than 32-bit R.
>I'm using the read.fwf function to read in the data.  I don't have access
>to a database program (SQL, for instance).
>Advice is most appreciated!
