[R] Another big data size problem
ozric at web.de
Wed Jul 28 13:40:19 CEST 2004
I'm working with a ~250,000 * 150 data.frame and share your problems. Last
weekend I upgraded my notebook from 512MB to 1024MB, and it is really better,
especially for load, write.table, mysqlReadTable and mysqlWriteTable, because
the machine starts swapping once RAM is full. One example:
with 512MB I had no success writing a table to MySQL even after several hours;
with 1024MB it finishes in a few minutes.
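For the writing side, one way to keep memory pressure down regardless of RAM size is to push the data out in slices rather than as one object, so the whole frame is never duplicated during the write. A rough sketch (the helper name and the 10,000-row chunk size are my own assumptions, not from the original mail):

```r
# Sketch: write a large data.frame to disk in chunks with write.table(),
# appending after the first slice so only one chunk is formatted at a time.
# The chunk size is an arbitrary assumption; tune it to the available RAM.
write_in_chunks <- function(df, file, chunk = 10000) {
  n <- nrow(df)
  for (s in seq(1, n, by = chunk)) {
    rows <- s:min(s + chunk - 1, n)
    write.table(df[rows, , drop = FALSE], file,
                append    = s > 1,   # first chunk creates/overwrites the file
                col.names = s == 1,  # header only once
                row.names = FALSE, sep = "\t", quote = FALSE)
  }
}
```

The same slicing idea applies to a database load: sending the frame to MySQL in pieces keeps each insert small enough to avoid swapping.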
On Wednesday, 28 July 2004 04:10, Federico Gherardini wrote:
> Hi all,
> I'm trying to read a 1220 * 20000 table in R but I'm having a lot of
> problems. Basically what happens is that R.bin starts eating all my
> memory until it gets to about 90%. At that point it locks itself in an
> uninterruptible sleep state (at least that's what top says) where it just
> sits there barely using the CPU at all but keeping its tons of memory. I've
> tried with read.table and scan but neither of them did the trick. I've also
> tried some horrible hacks like reading one line at a time and gradually
> combining everything into a matrix using rbind... nope! It seems I can read
> up to 500 lines in a *decent* time but nothing more. The machine is a 3 GHz
> P4 with HT and 512 MB RAM running R-1.8.1. Will I have to write a little
> C program myself to handle this thing, or am I missing something?
> Thanks in advance for your help,
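Growing a matrix with rbind() inside a loop, as tried in the quoted mail, copies the entire object on every iteration, which is why it slows to a crawl after a few hundred rows. A minimal sketch of the usual workaround, pre-allocating the full matrix once and filling it line by line with scan() (the helper name, file layout, and whitespace-separated numeric format are my assumptions, not from the post):

```r
# Sketch: fill a pre-allocated matrix one line at a time from an open
# connection, avoiding the quadratic copying of repeated rbind() calls.
fill_prealloc <- function(con, n_rows, n_cols) {
  m <- matrix(0, nrow = n_rows, ncol = n_cols)  # allocated once, up front
  for (i in seq_len(n_rows)) {
    # read exactly one line of whitespace-separated numbers
    m[i, ] <- scan(con, what = double(), nlines = 1, quiet = TRUE)
  }
  m
}
```

Giving read.table() an explicit colClasses and an nrows hint also cuts its memory overhead considerably, since it no longer has to guess column types or grow its internal buffers while reading.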
> R-help at stat.math.ethz.ch mailing list