[R] How to import BIG csv files with separate "map"?

giusto giusto at uoregon.edu
Tue Jul 14 19:53:42 CEST 2009


Hi all,

I am having problems importing a VERY large dataset into R. I have looked into
the package ff, and it seems to suit me, but in all the examples I have seen it
either requires manual creation of the database or needs a read.table kind of
step. Being survey data, the file is big (roughly 20,000 rows by 50,000
columns, about 1.2 GB in plain text), so the memory I have isn't enough for a
read.table and my computer freezes every time :(
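
One way the read.table step can be avoided is to read the file from an open connection in fixed-size chunks, marking unneeded columns as "NULL" in colClasses so they are never parsed and memory stays bounded by the chunk size. A minimal sketch in base R, assuming a headered, comma-separated file; the function name, file name, and column names are all hypothetical:

```r
## Sketch: chunked csv reading, keeping only the columns named in `keep`.
## Columns marked "NULL" in colClasses are skipped entirely, so peak
## memory is bounded by chunk_rows rather than by the file size.
read_big_csv <- function(path, keep, chunk_rows = 10000) {
  con <- file(path, open = "r")
  on.exit(close(con))

  ## Read the header line once, then build a colClasses vector that
  ## drops every column we do not need.
  header  <- strsplit(readLines(con, n = 1), ",", fixed = TRUE)[[1]]
  classes <- ifelse(header %in% keep, NA, "NULL")

  chunks <- list()
  repeat {
    chunk <- tryCatch(
      read.csv(con, header = FALSE, nrows = chunk_rows,
               col.names = header, colClasses = classes),
      error = function(e) NULL)   # connection exhausted
    if (is.null(chunk) || nrow(chunk) == 0) break
    chunks[[length(chunks) + 1L]] <- chunk
  }
  do.call(rbind, chunks)
}

## Hypothetical usage:
## dat <- read_big_csv("survey.csv", keep = c("id", "age", "income"))
```

The ff package also provides read.csv.ffdf, which does this kind of chunked reading for you into an on-disk ffdf object, without ever holding the whole table in RAM.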

So far I have managed to import the required subset of the data by using a
"cheat": I used GRETL to read an equivalent Stata file (released by the same
source that offered the csv file), manipulated it, and exported it in a format
that R can read into memory. Easy! But I am wondering: how is it possible to
do this in R entirely from scratch?

Thanks
-- 
View this message in context: http://www.nabble.com/How-to-import-BIG-csv-files-with-separate-%22map%22--tp24484588p24484588.html
Sent from the R help mailing list archive at Nabble.com.



