[R] Rather open question
bates at stat.wisc.edu
Thu Jul 17 18:11:28 CEST 2003
"Steve Moore" <ebrington at hotmail.com> writes:
> I have a question that I was hoping someone may be able to shed some
> light on. If I wanted to analyse 1000-2000 arrays in R, how much
> computing power would I need in order for the program to run O.K.?
> Any help that anybody could give me would be greatly appreciated.
Are the arrays you speak of microarrays, such as those manufactured by
Affymetrix? If so, it may be a good idea also to send your question to
the bioconductor mailing list <bioconductor at stat.math.ethz.ch>.
I expect that any answers to your question will require you to be more
specific about how you plan to analyze your data. Because R works "in
memory", the primary bottleneck in using R on large datasets is the
amount of memory you have available on the computer. A typical
Linux or Windows workstation or server can address up to 4GB of memory
and you could expect to buy computers with 3GB or 4GB memory for a
small fraction of what 1000-2000 Affymetrix chips and the preparation
of your samples will cost. More than 4GB will require switching to
processors other than the Pentium and Athlon. One interesting
possibility is the newly introduced AMD Opteron, a 64-bit processor
that uses an extension of the x86 instruction set.
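As a rough illustration of why memory dominates, here is a
back-of-envelope sketch in R of how much RAM a single matrix of
expression values might occupy. The probe-set count below is a
hypothetical placeholder, not a figure from the original question;
substitute the dimensions of your own data.

```r
# Back-of-envelope RAM estimate for one matrix of expression values in R.
# All counts below are hypothetical placeholders -- substitute your own.
n_arrays    <- 2000   # upper end of the range in the question
n_probesets <- 12625  # hypothetical number of values per chip
bytes_each  <- 8      # a numeric (double) value in R takes 8 bytes

total_bytes <- n_arrays * n_probesets * bytes_each
total_gb    <- total_bytes / 2^30
total_gb    # roughly 0.19 GB for this one matrix
```

Keep in mind that intermediate copies made during an analysis can
easily multiply such a figure several times over, which is why the
4GB addressing limit can matter even when the raw data seem to fit.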
This is not to say that memory will be the only issue. As I said
above, it will be necessary to have some idea of what you plan to do
before meaningful advice can be offered.