[R] high RAM on Linux or Solaris platform

David Bickel dbickel at uottawa.ca
Wed Oct 31 18:58:32 CET 2007


Dr. Lumley and Prof. Ripley,

Thank you very much for your helpful responses. 

Have you found any particular distribution of Linux to work well with
64-bit R? For the cluster, I am currently considering Debian (since it
seems popular) and SUSE (since Matlab runs on it), but I remain open to
others.

Best regards,
David


-----Original Message-----
From: Prof Brian Ripley [mailto:ripley at stats.ox.ac.uk] 
Sent: Tuesday, October 30, 2007 4:51 PM
To: Thomas Lumley
Cc: David Bickel; r-help at stat.math.ethz.ch
Subject: Re: [R] high RAM on Linux or Solaris platform

On Tue, 30 Oct 2007, Thomas Lumley wrote:

> On Tue, 30 Oct 2007, David Bickel wrote:
>
>> To help me make choices regarding a platform for running high-memory R
>> processes in parallel, I would appreciate any responses to these
>> questions:
>>
>> 1. Does the amount of RAM available to an R session depend on the
>> processor (Intel vs. Sun) or on the OS (various Linux distributions vs.
>> Solaris)?
>
> Yes.
>
> It depends on whether R uses 64-bit or 32-bit pointers. For 64-bit R
> you need a 64-bit processor, an operating system that will run 64-bit
> programs, and a compiler that will produce them.
>
> I'm not sure what the current Intel offerings are, but you can compile
> and run 64-bit on AMD Opteron (Linux) and Sun (Solaris) systems.

That is both Sparc Solaris and x86_64 Solaris (although for the latter
you seem to need to use the SunStudio compilers).

As far as I know all current desktop Intel processors run x86_64, and
Xeons seem to have a price-performance edge at the moment. We have
several boxes with dual quad-core Xeons and lots of RAM.  (Not all for
use with R, some Linux, some Windows.)  Core 2 Duos do, and are
commonplace in quite low-end systems.
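Whichever platform is chosen, it is easy to confirm from within R that the build is actually 64-bit. A small sketch using only base R (the example values in the comments are illustrative and will vary by system):

```r
## Quick sanity checks that this R session is a 64-bit build.
.Machine$sizeof.pointer   # 8 on a 64-bit build, 4 on a 32-bit build
R.version$arch            # e.g. "x86_64" or "sparc"
R.version$os              # e.g. "linux-gnu" or "solaris2.10"
```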


>> 2. Does R have any built-in limitations of RAM available to a session?
>> For example, could it make use of 16 GB in one session given the right
>> processor/OS platform?
>
> R does have built-in limitations even in a 64-bit system, but they are
> large. It is certainly possible to use more than 16Gb of memory.
>
> The main limit is that the length of a vector is stored in a C int,
> and so is no more than 2^31-1, or about two billion. A numeric vector
> of that length would take up 16Gb on its own.

?"Memory-limits" documents them.
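The vector-length limit described above can be verified directly in R; a short sketch (the arithmetic assumes 8 bytes per numeric element, as on any 64-bit build):

```r
## The maximum vector length is that of a C int, 2^31 - 1.
.Machine$integer.max      # 2147483647, i.e. 2^31 - 1

## A numeric (double) vector of that length, at 8 bytes per element,
## occupies about 16 GiB on its own.
2^31 * 8 / 2^30           # 16 (GiB)
```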

>> 3. Is there anything else I should consider before choosing a
>> processor and OS?
>
> I don't think there is anything else R-specific.

-- 
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595


