[BioC] affy & problems with memory
xiaofan.mlist at gmail.com
Sat Jan 14 10:22:02 CET 2006
I have run into similar problems (memory allocation limit) with affy. On
Windows XP the maximum memory allocation for R defaults to the size of your
physical memory; in my case that is 768M. Reading some 40 U133A CEL files is
enough to trigger the error. I managed to *avoid* this error by *loading the
files in batches*: for example, I load 20 files at a time, save the image,
close R, re-open it and reload the image. In this way I have successfully
loaded 1.4G of CEL files (121 chips) on my 768M laptop, producing a 450M
RData image.
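The batch approach can be sketched as follows (a minimal sketch, assuming the CEL files sit in the working directory; the batch size of 20 and the file naming are illustrative):

```r
library(affy)  # Bioconductor package providing list.celfiles() and ReadAffy()

cel.files <- list.celfiles()  # all CEL files in the working directory
batches   <- split(cel.files, ceiling(seq_along(cel.files) / 20))

# Read one batch per R session: set i, read, save, quit R, restart, repeat,
# then load() the saved pieces back and combine them.
i <- 1
abatch <- ReadAffy(filenames = batches[[i]])
save(abatch, file = paste("batch", i, ".RData", sep = ""))
```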
However, there are further problems with the various data pre-processing
methods. Beyond a certain number of chips (about 20 in my hands) it is very
easy to hit a memory allocation fault in rma(), expresso() and normalize(),
under both Windows XP and Linux. This is no longer the typical "768M limit"
problem but is related to the size of the object being built: since R does
not compress objects in memory, you run into the real memory limit (in
Windows: physical memory + virtual memory size; in Linux: physical memory +
swap size).
R does have garbage collection (see gc()), but it only reclaims objects you
have already removed, and it does not compress live data; so if the working
objects themselves are too large, adding more RAM to your computer is the
only real solution (alternatively you can increase the swap/virtual memory
size up to 4GB minus physical memory on a 32-bit system, at the cost of
extremely poor system performance).
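That said, some memory can often be reclaimed between pre-processing steps by removing intermediate objects and forcing a collection (a generic sketch; the matrix is just a stand-in for a large AffyBatch or expression object):

```r
x <- matrix(0, nrow = 1000, ncol = 1000)  # stand-in for a large object
rm(x)  # remove the binding ...
gc()   # ... and force garbage collection so R actually returns the memory
```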
DAMTP, University of Cambridge, CB3 0WA, UK Tel +44 7886 614030, Email
xl252 at cam.ac.uk
From: bioconductor-bounces at stat.math.ethz.ch
[mailto:bioconductor-bounces at stat.math.ethz.ch] On Behalf Of kfbargad at ehu.es
Sent: 12 January 2006 13:40
To: bioconductor at stat.math.ethz.ch
Subject: Re: [BioC] affy & problems with memory
Would it be a solution to use RMAexpress to obtain expression values for a
good number of arrays and then import them into your R session?
I haven't got much experience with this program, but the RMAexpress webpage
says it can process up to 250 arrays simultaneously (I guess not U133plus),
and the history section states that the 0.4alpha3 version can run 200
arrays.
Any information on U133plus?
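For the import step, RMAexpress can write its expression values to a tab-delimited text file, which read.delim() handles. A sketch, assuming probeset IDs in the first column and one column per array; a tiny stand-in file is created here so the example is self-contained, so check the header of your real output file:

```r
# Two-line stand-in for an RMAexpress export (layout assumed)
writeLines(c("probeset\tchip1\tchip2",
             "1007_s_at\t7.3\t7.1"), "rmaexpress_output.txt")

# Read it back as a numeric expression matrix for downstream analysis
expr <- as.matrix(read.delim("rmaexpress_output.txt", row.names = 1,
                             check.names = FALSE))
```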
> My apologies for the previous (empty) mail, it slipped me :)
> > I'm currently working on that chip type myself on a system with
> > and had to increase the memory assigned to R too. However
> > "--max-mem-size=1024Mb" seems to be ok in my case. Besides the
> > how much memory is necessary for the hgu133plus2 maybe somebody
> > could answer me this two related questions:
> > * I'm running R under WinXP AND under Linux on the same machine.
> > There has never been any memory problem in Linux. So is the memory
> > allocation or the assignment of the max. memory size that can be
> > used by R different in the two systems?
> This is due to the much more flexible memory management of Linux as
> opposed to Windows. The theoretical limit on a 32-bit Windows machine
> is 2Gb; 4Gb under Linux. Linux also has a more flexible way of
> (finding memory on your local harddrive) than Windows (the /swp
> > * With my 512MB RAM system the old maximum memory value under
> > 512MB. Is this just a coincidence or does this maximum value rise
> > if I upgrade my system to 1024MB RAM?
> No, this is not a coincidence: the maximum value is tied to the
> amount of memory available on your system.
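For reference, the two knobs discussed above look like this on Windows (memory.limit() is Windows-only and reports/sets the cap in Mb; under Linux the limit is handled by the OS):

```r
# Start R with a larger cap (command line, 32-bit Windows builds):
#   Rgui.exe --max-mem-size=1024M
# Or query / raise the cap from within a running session:
memory.limit()             # current maximum in Mb
memory.limit(size = 1024)  # request a 1024 Mb maximum
```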
Bioconductor mailing list
Bioconductor at stat.math.ethz.ch