[R] read.csv size limits

andy1983 andy1983 at excite.com
Tue Feb 27 17:38:00 CET 2007


I have been using the read.csv function for a while now without any problems.
My files are usually 20-50 MB and take up to a minute to import. They have
all been under 50,000 rows and under 100 columns.

Recently, I tried importing a file of a similar size (i.e., about the same
amount of data), but with ~500,000 columns and ~20 rows. The process is
taking forever (~1 hour so far). In Task Manager, the CPU is maxed out, but
memory usage stalls at around 50 MB (far below the memory limit).

Is this normal? Is there a way to optimize this operation or at least check
the progress? Will this take 2 hours or 200 hours?
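
In case it helps to see what I mean by optimizing: I was planning to try
giving read.csv hints so it skips per-column type detection and name
mangling. This is just a sketch under my own assumptions (every column
numeric, one header row; "wide.csv" stands in for my real file):

  dat <- read.csv("wide.csv",
                  nrows = 20,              # tell R how many rows to expect
                  colClasses = "numeric",  # skip per-column type guessing
                  comment.char = "",       # disable comment scanning
                  check.names = FALSE)     # skip make.names() on ~500,000 headers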

All I was trying to do was transpose my extra-wide table, a task I assumed
would take 5 minutes. Maybe R is not the solution I am looking for?
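
For the transpose itself, I gather I could sidestep data frames entirely
with scan(), which returns one flat vector that can be reshaped into a
matrix and flipped with t(). A rough sketch, again assuming an all-numeric,
comma-separated file with a single header line (file names are placeholders):

  header <- scan("wide.csv", what = "", sep = ",", nlines = 1, quiet = TRUE)
  vals   <- scan("wide.csv", what = 0,  sep = ",", skip = 1,   quiet = TRUE)
  m      <- matrix(vals, ncol = length(header), byrow = TRUE)  # rows as read
  tall   <- t(m)                           # ~500,000 rows by ~20 columns
  rownames(tall) <- header
  write.csv(tall, "tall.csv")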

Thanks.



