[R] memory limit problem

Prof Brian Ripley ripley at stats.ox.ac.uk
Sun Apr 4 18:42:53 CEST 2004


What do you mean `did not work'?  Did it not start (you may need to reboot 
your machine to clear its memory tables) or did your task run out of 
memory?

Please do read the posting guide and its references, and try to give 
useful information about the problem you encounter.  Saying `did not work' 
without ever saying what happened is maximally uninformative.


On Sun, 4 Apr 2004, Yi-Xiong Sean Zhou wrote:

> I tried using --max-mem-size=1400M on the command line with 1.8.1 and it
> did not work. However, 1.9.0beta works. The OS is XP Professional on a Dell
> Inspiron 8600. 
> 
> Yi-Xiong
> 
> -----Original Message-----
> From: Prof Brian Ripley [mailto:ripley at stats.ox.ac.uk] 
> Sent: Saturday, April 03, 2004 11:20 PM
> To: Roger D. Peng
> Cc: Yi-Xiong Sean Zhou; r-help at stat.math.ethz.ch
> Subject: Re: [R] memory limit problem
> 
> That is true, but I don't see that Yi-Xiong Sean Zhou has actually yet 
> followed the instructions for 1.8.1, which are to set --max-mem-size on the 
> command line (and this is in the rw-FAQ, as people have pointed out).
> 
> The issue is that on Windows the memory address space can get fragmented, 
> and this is ameliorated by reserving memory in advance -- that is what 
> using --max-mem-size (and not memory.limit) does for you.
> 
> When used as recommended, both 1.8.1 and 1.9.0beta can handle workspaces
> of up to about 1.7Gb.  1.9.0 can go higher on suitable OSes: see its
> rw-FAQ.
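[Editor's note: the ~1.7Gb figure above is consistent with the default 2GB user-mode address space of a 32-bit Windows process. A rough back-of-the-envelope check; the 300MB overhead allowance is my assumption, not a figure from the thread:]

```python
# Rough arithmetic behind the ~1.7Gb workspace ceiling mentioned above.
# Assumption: a 32-bit Windows process gets a 2GB user-mode address space
# by default; the overhead estimate below is a guess, not a measured value.
user_space_mb = 2048   # default per-process user address space, 32-bit Windows
overhead_mb = 300      # rough allowance for the R binary, DLLs, stacks, heap gaps
workspace_mb = user_space_mb - overhead_mb
print(workspace_mb / 1024)  # about 1.7 (GB), matching the figure quoted above
```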
> 
> On Sun, 4 Apr 2004, Roger D. Peng wrote:
> 
> > In general, this is not an R problem, it is a Windows problem.  I find 
> > that these types of memory problems do not appear on Linux, for example.
> > 
> > -roger
> > 
> > Yi-Xiong Sean Zhou wrote:
> > > R1.9.0beta solves the problem for now. The memory footprint of R1.9.0 is
> > > way smaller than that of R1.8.1, at only 400M. It will be interesting to
> > > see how R1.9.0 handles the memory problem when it needs more than 700M.
> > > 
> > > Thanks for your help. 
> > > 
> > > Yi-Xiong
> > > 
> > > -----Original Message-----
> > > From: Roger D. Peng [mailto:rpeng at jhsph.edu] 
> > > Sent: Saturday, April 03, 2004 2:52 PM
> > > To: Yi-Xiong Sean Zhou
> > > Cc: 'Uwe Ligges'; r-help at stat.math.ethz.ch
> > > Subject: Re: [R] memory limit problem
> > > 
> > > You may want to try downloading the development version of R at 
> > > http://cran.us.r-project.org/bin/windows/base/rdevel.html.  This 
> > > version deals with Windows' deficiencies in memory management a 
> > > little better.
> > > 
> > > -roger
> > > 
> > > Yi-Xiong Sean Zhou wrote:
> > > 
> > > 
> > >>After memory.limit(1500), the error message still pops up:
> > >>
> > >>Error: cannot allocate vector of size 11529 Kb
> > >>
> > >>While 
> > >>
> > >>
> > >>
> > >>>memory.size()
> > >>
> > >>[1] 307446696
> > >>
> > >>
> > >>>memory.limit()
> > >>
> > >>[1] 1572864000
> > >>
> > >>And the system is only using 723MB of physical memory, while 2G is the
> > >>total.
> > > 
> > > 
> > >>Does anyone have a clue of what is going on? 
> > >>
> > >>
> > >>Yi-Xiong
> > >>
> > >>
> > >>-----Original Message-----
> > >>From: Uwe Ligges [mailto:ligges at statistik.uni-dortmund.de] 
> > >>Sent: Saturday, April 03, 2004 12:40 PM
> > >>To: Yi-Xiong Sean Zhou
> > >>Cc: r-help at stat.math.ethz.ch
> > >>Subject: Re: [R] memory limit problem
> > >>
> > >>
> > >>
> > >>Yi-Xiong Sean Zhou wrote:
> > >>
> > >>
> > >>>Could anyone advise me how to allocate 1.5Gbyte memory for R on a Dell
> > >>>laptop running XP professional with 2G memory?
> > >>
> > >>
> > >>See ?Memory or the R for Windows FAQ, which tells you:
> > >>
> > >>"2.7 There seems to be a limit on the memory it uses!
> > >>
> > >>Indeed there is. It is set by the command-line flag --max-mem-size (see
> > >>How do I install R for Windows?) and defaults to the smaller of the
> > >>amount of physical RAM in the machine and 1Gb. [...]"
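[Editor's note: the FAQ rule quoted above, "the smaller of the amount of physical RAM and 1Gb", can be sketched in one line; the 2GB RAM figure is taken from the laptop described in this thread:]

```python
# Default --max-mem-size per the rw-FAQ rule quoted above:
# the smaller of physical RAM and 1Gb.
ONE_GB = 1024 ** 3
physical_ram = 2 * 1024 ** 3          # the 2G laptop in this thread
default_limit = min(physical_ram, ONE_GB)
print(default_limit == ONE_GB)  # True: with 2G of RAM, the default caps at 1Gb
```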
> > >>
> > >>
> > >>
> > >>
> > >>>I have tried
> > >>>
> > >>>"C:\Program Files\R\rw1081\bin\Rgui.exe" --max-vsize=1400M
> > >>>
> > >>>but I actually get only 500MB for R.
> > >>>
> > >>>
> > >>>I also tried memory.limit(2^30) in R and got error of:
> > >>
> > >>
> > >>Well, you don't want to allocate 2^30 *Mega*Bytes (see ?memory.limit),
> > >>do you? 
> > >>
> > >>
> > >>Either use the command line flag --max-mem-size=1500M or within R:
> > >> memory.limit(1500)
> > >>
> > >> 
> > >>
> > >>
> > >>>Error in memory.size(size) : cannot decrease memory limit
> > >>
> > >>
> > >>Since your limit was roughly 10^6 times larger than the right one, you
> > >>got an integer overflow internally, I think.
> > >>
> > >>Uwe Ligges
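[Editor's note: Uwe's diagnosis checks out arithmetically. A quick sketch, assuming memory.limit() takes megabytes and reports bytes, as the console output earlier in the thread suggests:]

```python
# Sanity-check the figures quoted in this thread.
MB = 1024 * 1024

# memory.limit(1500) reported 1572864000, i.e. 1500MB expressed in bytes:
print(1500 * MB)               # 1572864000, matching the output above

# memory.limit(2^30) requests 2^30 MB = 2^50 bytes, far beyond what a
# signed 32-bit integer can hold (2^31 - 1), hence the internal overflow:
print(2**30 * MB > 2**31 - 1)  # True

# And 2^30 MB is indeed roughly 10^6 times the intended 1500MB:
print(round(2**30 / 1500))     # 715828, i.e. roughly 10^6
```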
> > >>
> > >>
> > >> 
> > >>
> > >>
> > >>>Yi-Xiong
> > >>>
> > >>>
> > >>>______________________________________________
> > >>>R-help at stat.math.ethz.ch mailing list
> > >>>https://www.stat.math.ethz.ch/mailman/listinfo/r-help
> > >>>PLEASE do read the posting guide!
> > >>>http://www.R-project.org/posting-guide.html
> > >>
> > 
> 
> 

-- 
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595



