Fw: [R] Another big data size problem
ligges at statistik.uni-dortmund.de
Wed Jul 28 12:45:57 CEST 2004
Federico Gherardini wrote:
> On Wed, 28 Jul 2004 09:53:08 +0200
> Uwe Ligges <ligges at statistik.uni-dortmund.de> wrote:
>>If your data is numeric, you will need roughly
>>1220 * 20000 * 8 / 1024 / 1024 ~~ 200 MB
>>just to store one copy in memory. If you need more than two copies, your
>>machine with its 512MB will start to use swap space .....
>>Hence either use a machine with more memory, or don't use all the data
>>at once in memory, e.g. by making use of a database.
> Well, I'd be happy if it used swap space instead of locking itself up! By the way, I don't think the problem is entirely related to memory consumption. I have written a little function that reads the data row by row and prints each time, to monitor its progress. Everything starts to crawl to a horrible slowness long before my memory is exhausted, i.e. after about 100 lines. It seems like R has problems managing very large objects per se? By the way, I'll try to upgrade to 1.9 and see what happens...
Well, using swap space takes a lot of time, and what looks like a hang is
quite probably the use of swap space - you will see your hard disc LED
flashing all the time!
Are you sure that your memory was not exhausted?
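For a rough check (a minimal sketch, reusing the 1220 x 20000 dimensions
from the quoted message), you can redo the size estimate in R and ask gc()
how much memory the session has actually used:

## One numeric copy of a 1220 x 20000 matrix, in MB
1220 * 20000 * 8 / 1024^2      # roughly 186 MB

## gc() reports current and maximum memory used by R (the "Mb" columns)
gc()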
Note that it is better to initialize the object to its full size before
inserting values -- rather than using rbind() and friends, which are indeed
slow since they need to re-allocate memory at each step.
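For example (a minimal sketch -- the file name, format and dimensions are
assumptions, adjust them to your data), filling a pre-allocated matrix with
scan() avoids the repeated copying that rbind() causes:

nr <- 1220                                # dimensions assumed from the thread
nc <- 20000

## Pre-allocate the full matrix once ...
x <- matrix(0, nrow = nr, ncol = nc)

## ... then fill it row by row from an open connection
con <- file("mydata.txt", open = "r")     # hypothetical file name
for (i in 1:nr)
    x[i, ] <- scan(con, what = double(), n = nc, quiet = TRUE)
close(con)

## Slow alternative (avoid): growing with rbind() copies the whole
## object at every step
##   x <- NULL
##   for (i in 1:nr) x <- rbind(x, scan(con, what = double(), n = nc, quiet = TRUE))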
> Ernesto Jardim wrote:
>>It looks like you're running Linux!? If so, it will be quite easy to
>>create a table in MySQL, upload all the data into the database and
>>access the data with RMySQL (it's _very_ fast). Probably there will be
>>some operations that you can do in MySQL instead of "eating" memory in R.
> I'll give that a try.
> Thanks, everybody, for your time.
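For reference, a minimal sketch of the RMySQL approach suggested above,
assuming the data has already been loaded into a MySQL table called "mydata"
(database name, table name and credentials are placeholders):

library(RMySQL)      # needs the DBI and RMySQL packages installed

## Connect to a local MySQL database (placeholder names/credentials)
con <- dbConnect(MySQL(), dbname = "bigdata", user = "user",
                 password = "pass", host = "localhost")

## Let MySQL do the filtering/aggregation and pull only the result into R
res <- dbGetQuery(con, "SELECT col1, AVG(col2) AS m FROM mydata GROUP BY col1")

dbDisconnect(con)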