[BioC] Memory Problems

James MacDonald jmacdon at med.umich.edu
Thu Dec 18 16:15:44 MET 2003

Probably. I did a little study of the amount of memory required to read
in a given number of either U95A or U133A chips on WinXP and SuSE Linux
7.3, and Linux did a better job. 

By 'better job', I mean that I didn't have to kill R after every run to
free memory back up, and it appeared to use less memory per chip. Of
course this was using R-1.7.1 and Affy 1.3.1 and 1.3.3 (back in the dark
ages ;-D). Ben Bolstad did a more comprehensive study that you can see


AFAIK, the new malloc is supposed to address some of the problems with
memory allocation under win32, but I think the Linux memory allocation
is still likely to be superior.
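If you want to see what R is actually holding between runs, gc() reports
usage directly, and on Windows the memory.size()/memory.limit() pair does
the same. A minimal sketch (not specific to any affy workflow):

```r
## Report R's memory usage and force a collection.
## gc() is standard R; memory.size()/memory.limit() are Windows-only.
gc(verbose = TRUE)  # prints cells/Mb in use and triggers a collection
if (.Platform$OS.type == "windows") {
    cat("allocated (Mb):", memory.size(), "\n")   # memory currently held by R
    cat("ceiling   (Mb):", memory.limit(), "\n")  # current allocation limit
}
```

Comparing the gc() numbers before and after a batch run is a quick way to
see whether memory is actually being returned.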



James W. MacDonald
Affymetrix and cDNA Microarray Core
University of Michigan Cancer Center
1500 E. Medical Center Drive
7410 CCGC
Ann Arbor MI 48109

>>> "david neil hayes" <davidneilhayes at hotmail.com> 12/18/03 09:47AM
Thanks for the insight, I will try this.  You are correct in that I am on
a Windows machine, R 1.8.1.

Would this problem (and many related problems) be substantially improved
if I switched to a Linux system?


>From: "James MacDonald" <jmacdon at med.umich.edu>
>To: <davidneilhayes at hotmail.com>,<Bioconductor at stat.math.ethz.ch>
>Subject: Re: [BioC] Memory Problems
>Date: Thu, 18 Dec 2003 08:45:25 -0500
>You don't mention what version of R you are using, nor your OS.
>since you are having memory re-allocation problems, I have to assume you
>are on win32 and that you are using R < 1.9.0 or 1.8.1-patched.
>My understanding of memory issues in win32 with earlier versions of R
>is that the memory allocation process is sort of one-way, so you can run
>out of memory even if you are running the garbage collector to reclaim
>it. I am sure this is not technically correct, and if BDR were
>subscribed to this list he would correct me, but the effect remains; if
>you allocate too much memory to big objects you will eventually run out,
>even if you try to reclaim it.
>The patched version of R and R-1.9.0 have a different malloc that is
>supposed to be better at reclaiming memory, so you might go to Duncan
>Murdoch's website and get one or the other.
>James W. MacDonald
>Affymetrix and cDNA Microarray Core
>University of Michigan Cancer Center
>1500 E. Medical Center Drive
>7410 CCGC
>Ann Arbor MI 48109
> >>> "david neil hayes" <davidneilhayes at hotmail.com> 12/17/03 04:15PM >>>
>Thanks to Dr. Huber for the response to my earlier question.  Another
>matchprobes question that may have more general interest in terms of
>usage (which in my experience has been a bigger problem than
>I have a folder of files, each file representing one affybatch object
>(which is a single array).  I am using the "load" command to read these in
>batches of 10, then I perform a "combine" function.  I save the results
>to a file, then move on to the next batch of 10.
>I find that my page file usage continues to increase, even though I have
>"removed" the original 10 affybatch objects and all references to them.
>As you might expect, I quickly exhaust my RAM.  I have been unable to
>solve this on my own.  In talking with some of the Bioconductor staff, I
>understand this may relate to the environments used in the affy package.
>To reduce my memory usage I have tried:
>   affybatch <- 0
>   gc()
>   rm(affybatch)
>   putting the entire batching process in a separate function, which I
>   exit before moving to the next batch
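For reference, the loop described above might be sketched as follows (the
directory and file names are hypothetical; combine() is the Biobase/affy
generic mentioned in the thread). One detail worth noting: rm() has to come
before gc(), since the collector can only reclaim objects that nothing
references any more:

```r
## Sketch of the batch workflow described above; paths are hypothetical.
files  <- list.files("affybatches", full.names = TRUE)
nbatch <- ceiling(length(files) / 10)

for (i in 1:nbatch) {
    chunk  <- files[((i - 1) * 10 + 1):min(i * 10, length(files))]
    merged <- NULL
    for (f in chunk) {
        e <- new.env()
        load(f, envir = e)            # each file holds one AffyBatch
        ab <- get(ls(e)[1], envir = e)
        merged <- if (is.null(merged)) ab else combine(merged, ab)
        rm(ab, e)
    }
    save(merged, file = paste("merged_", i, ".rda", sep = ""))
    rm(merged)  # drop the reference first...
    gc()        # ...then ask R to return the memory
}
```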
>Bioconductor mailing list
>Bioconductor at stat.math.ethz.ch 
