[BioC] large amount of slides

R.G.W. Verhaak r.verhaak at erasmusmc.nl
Mon Jun 7 08:36:24 CEST 2004

I have successfully run GCRMA on a dataset of 285 HGU133a chips, on a
machine with 8 GB RAM installed; I noticed a peak memory use of 5.5 GB
(although I have not been monitoring it continuously). I would expect 200
chips to use proportionally less memory, so around 4 GB.
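
For what it's worth, a minimal sketch of such a run, assuming all CEL files sit in a single directory (the path is a placeholder, not from my actual setup):

```
## Sketch of a GCRMA run over a directory of CEL files
## (the directory path below is hypothetical).
library(affy)
library(gcrma)

## Read every CEL file in the directory into one AffyBatch,
## then background-correct, normalize, and summarize with GCRMA.
raw  <- ReadAffy(celfile.path = "/path/to/celfiles")
eset <- gcrma(raw)

## justGCRMA() reads and processes in one step and tends to need
## less memory than ReadAffy() + gcrma() on large chip sets:
# eset <- justGCRMA(celfile.path = "/path/to/celfiles")
```

The `justGCRMA()` variant is worth trying first on a machine with less RAM, since it avoids keeping the full probe-level AffyBatch around.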

Roel Verhaak

> Message: 9
> Date: Fri, 04 Jun 2004 10:06:14 -0500
> From: "Vada Wilcox" <v_wilcox at hotmail.com>
> Subject: [BioC] large amount of slides
> To: bioconductor at stat.math.ethz.ch
> Message-ID: <BAY19-F34SDGAIXWb9D0002ec89 at hotmail.com>
> Content-Type: text/plain; format=flowed
> Dear all,
> I have been using RMA successfully for a while now, but in the past I
> have only used it on a small number of slides. I would like to do my
> study on a larger scale now, with data (series of experiments) from
> other researchers as well. My question is the following: if I want to
> study, let's say, 200 slides, do I have to read them all into R at once
> (so together, I mean, with read.affy() in the affy package), or is it
> OK to read them series by series (so all wild types and controls of one
> researcher at a time)?

> If it is really necessary to read all of them in at one time, how much
> RAM would I need (for, let's say, 200 CEL files), and how can I raise
> the memory limit? I know it's possible to raise it with 'max vsize =
> ...', but I haven't been able to do it successfully for 200 experiments.
> Can somebody help me with this?
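
On the memory-limit part of the question: the vector-heap ceiling is set when R starts, not from inside a session. A minimal sketch, assuming a Unix-like shell (the size is an example value, and the exact option spelling has varied across R versions, so check ?Memory in your installation):

```
# Start R with a larger vector-heap ceiling (example size).
R --max-vsize=4000M

# On Windows builds the analogous startup option is --max-mem-size.
Rgui --max-mem-size=4000M
```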
