[R] Reading in 9.6GB .DAT File - OK with 64-bit R?

RHelpPlease rrumple at trghcsolutions.com
Thu Mar 8 19:19:18 CET 2012


Hi there,
I wish to read a 9.6GB .DAT file into R (64-bit R on a 64-bit Windows
machine), delete a substantial number of rows, and then write the result out
as a .csv file. On the first attempt the computer crashed (at some point
last night).
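
Roughly what I'm attempting is the following (a sketch only; the file name,
field widths, column names, and row filter are placeholders for my actual
layout):

    # read the whole fixed-width file, drop unwanted rows, write out a .csv
    dat <- read.fwf("big.dat",
                    widths = c(8, 12, 4),                  # placeholder field widths
                    col.names = c("id", "value", "code"))  # placeholder names
    dat <- dat[dat$code != 0, ]                            # placeholder row filter
    write.csv(dat, "big.csv", row.names = FALSE)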

I'm rerunning it now and am closely monitoring CPU and memory usage.

Setting aside the possibility that the crash was purely a hardware issue:
is R equipped to handle this much data?  I read on the FAQ page that 64-bit
R can handle larger data sets than 32-bit R, but I realize the whole data
frame still has to fit in RAM, so presumably I need considerably more than
9.6GB of memory to hold this file.
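
For what it's worth, I'm also checking what R itself reports about memory
before rerunning (both calls below are Windows-only):

    memory.limit()   # maximum amount of memory R may use, in MB
    memory.size()    # memory currently in use, in MB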

I'm using the read.fwf function to read in the data.  I don't have access to
a database program (an SQL database, for instance).
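
If reading everything at once is simply too much, would processing the file
in chunks be the way to go?  Something like this sketch is what I had in
mind (the field positions, chunk size, and filter are again placeholders):

    con <- file("big.dat", open = "rt")
    first <- TRUE
    repeat {
      lines <- readLines(con, n = 100000)        # next 100,000 records
      if (length(lines) == 0) break
      id    <- substr(lines, 1, 8)               # placeholder field positions
      value <- as.numeric(substr(lines, 9, 20))
      code  <- as.integer(substr(lines, 21, 24))
      keep  <- code != 0                         # placeholder row filter
      chunk <- data.frame(id, value, code)[keep, ]
      write.table(chunk, "big.csv", sep = ",", row.names = FALSE,
                  col.names = first, append = !first)
      first <- FALSE
    }
    close(con)

That way only one chunk would ever be held in memory at a time.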

Advice is most appreciated!


