Fw: [R] Another big data size problem
f.gherardini at pigrecodata.net
Wed Jul 28 13:40:58 CEST 2004
On Wed, 28 Jul 2004 09:53:08 +0200
Uwe Ligges <ligges at statistik.uni-dortmund.de> wrote:
> If your data is numeric, you will need roughly
> 1220 * 20000 * 8 / 1024 / 1024 ≈ 200 MB
> just to store one copy in memory. If you need more than two copies, your
> machine with its 512 MB will start to use swap space...
> Hence either use a machine with more memory, or don't use all the data
> at once in memory, e.g. by making use of a database.
> Uwe Ligges
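Uwe's back-of-the-envelope figure can be reproduced in R itself, using the dimensions mentioned in the thread:

```r
## One numeric (double) value takes 8 bytes, so a single in-memory
## copy of a 1220 x 20000 numeric data set needs:
n.rows <- 1220
n.cols <- 20000
mb <- n.rows * n.cols * 8 / 1024^2
mb   # about 186 MB -- roughly the 200 MB quoted above
```

Since R can hold two or more copies of an object during assignment, that estimate easily exceeds 512 MB of RAM in practice.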
Well, I'd be happy if it used swap space instead of locking itself up! In any case, I don't think the problem is entirely one of memory consumption. I wrote a little function that reads the data row by row and prints a line each time, to monitor its progress. Everything slows to a horrible crawl long before my memory is exhausted -- after about 100 lines. It seems as if R has trouble managing very large objects per se. I'll also try upgrading to 1.9 and see what happens...
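One common cause of that kind of slowdown is growing the result object on every iteration (e.g. with rbind()), which forces R to copy everything read so far each time -- quadratic work in the number of rows. Reading into a preallocated matrix avoids this. A minimal sketch, with placeholder dimensions and a textConnection() standing in for the real file:

```r
## Placeholder dimensions; the real data is 1220 x 20000.
n.rows <- 100
n.cols <- 20

## Fake "file": n.rows lines of n.cols whitespace-separated numbers.
txt <- paste(rep(paste(1:n.cols, collapse = " "), n.rows), collapse = "\n")
con <- textConnection(txt)

## Slow pattern (commented out): res grows, so each rbind() copies
## all rows read so far.
# res <- NULL
# for (i in 1:n.rows) res <- rbind(res, scan(con, n = n.cols, quiet = TRUE))

## Faster: preallocate once, then fill in place.
res <- matrix(NA_real_, nrow = n.rows, ncol = n.cols)
for (i in 1:n.rows)
  res[i, ] <- scan(con, n = n.cols, quiet = TRUE)
close(con)

dim(res)   # 100 x 20, read with no repeated copying
```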
Ernesto Jardim wrote:
>It looks like you're running linux !? if so it will be quite easy to
>create a table in MySQL, upload all the data into the database and
>access the data with RMySQL (it's _very_ fast). Probably there will be
>some operations that you can do in MySQL instead of "eating" memory in R.
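A hedged sketch of Ernesto's suggestion using the DBI/RMySQL interface. The connection details, table, and column names are placeholders, and a running MySQL server is assumed, so this is illustrative rather than runnable as-is:

```r
library(DBI)
library(RMySQL)

## Connect to the server (dbname/user/host are placeholders).
con <- dbConnect(MySQL(), dbname = "mydb", user = "me", host = "localhost")

## Let MySQL do the aggregation, so only the (small) result set
## ever occupies R's memory -- not all 1220 x 20000 values.
res <- dbGetQuery(con,
  "SELECT col1, AVG(col2) AS mean2 FROM bigtable GROUP BY col1")

dbDisconnect(con)
```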
I'll give that a try.
Thanks to everybody for their time.