[BioC] how to set R memory limit

Steve Lianoglou mailinglist.honeypot at gmail.com
Tue Jan 10 17:35:37 CET 2012


Hi,

Just adding to what Sean said and using some intuition:

On Tue, Jan 10, 2012 at 10:20 AM, Sean Davis <sdavis2 at mail.nih.gov> wrote:
[snip]
> If you still cannot do what you need, let us know what you
> are trying to do by including code, errors, and the output of
> sessionInfo().  Perhaps there is a way to do things with less memory
> use.  In the end, you may need more memory, though.

I'm guessing you are (1) doing something w/ sequencing data; and (2)
trying to load all of your data at once.

Assuming that's true (and we all know how prudent making assumptions
is), I'd suggest doing whatever it is that you are doing by loading
the data for one chromosome at a time.
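For instance, a sketch of that pattern assuming your reads are in a sorted, indexed BAM file (the filename is made up; Rsamtools' ScanBamParam lets you restrict scanBam to one chromosome at a time):

```r
library(Rsamtools)  # and IRanges, loaded as a dependency

bam.file <- "my_reads.bam"  ## illustrative path; needs a .bai index

## chromosome names and lengths from the BAM header
targets <- scanBamHeader(bam.file)[[1]]$targets

for (chr in names(targets)) {
  which <- RangesList(IRanges(1L, targets[[chr]]))
  names(which) <- chr
  param <- ScanBamParam(which = which,
                        what = c("pos", "qwidth", "strand"))
  ## only this chromosome's reads are in memory at once
  reads <- scanBam(bam.file, param = param)[[1]]
  ## ... compute over `reads`, keep only the summary ...
  rm(reads); gc()
}
```
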

As an aside -- since this usage pattern is quite frequent, i.e. "get
all reads from chromosome X and do Y", I took a stab some time ago at
implementing "iterators" for the foreach package to abstract this idea
(get a batch of reads and do Y), like:

xxx <- foreach(reads=getReadsByChromosome(bam.file), ...) %dopar% {
  ## compute over `reads`, and profit
}
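
A minimal sketch of what such an iterator could look like, following the iterators package's custom-iterator convention (`nextElem` plus the "abstractiter"/"iter" classes); `getReadsByChromosome` is the hypothetical helper from the snippet above, and the Rsamtools calls are illustrative:

```r
library(iterators)
library(Rsamtools)

## Sketch: a foreach-compatible iterator yielding one chromosome's
## worth of reads per call to nextElem().
getReadsByChromosome <- function(bam.file) {
  targets <- scanBamHeader(bam.file)[[1]]$targets
  chroms <- names(targets)
  i <- 0L
  nextEl <- function() {
    i <<- i + 1L
    if (i > length(chroms))
      stop("StopIteration")  ## signals foreach that we're done
    which <- RangesList(IRanges(1L, targets[[i]]))
    names(which) <- chroms[i]
    scanBam(bam.file, param = ScanBamParam(which = which))[[1]]
  }
  it <- list(nextElem = nextEl)
  class(it) <- c("abstractiter", "iter")
  it
}
```
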

Maybe it's worth nailing down into a `seqiterators` package, or something?

-steve

-- 
Steve Lianoglou
Graduate Student: Computational Systems Biology
 | Memorial Sloan-Kettering Cancer Center
 | Weill Medical College of Cornell University
Contact Info: http://cbio.mskcc.org/~lianos/contact


