[R] CPU or memory

Prof Brian Ripley ripley at stats.ox.ac.uk
Wed Nov 8 19:21:23 CET 2006


On Wed, 8 Nov 2006, Christos Hatzis wrote:

> Prof. Ripley,
>
> Do you mind providing some pointers on how "coarse-grained parallelism"
> could be implemented in a Windows environment?  Would it be as simple as
> running two R-console sessions and then (manually) combining the results of
> these simulations?  Or would it be better to run them as batch processes?

That is what I would do in any environment (I don't do such things under 
Windows since all my fast machines run Linux/Unix).

Suppose you want to do 10000 simulations.  Set up two batch scripts that 
each run 5000, set a different seed at the top of each, and save() the 
results as a list or matrix under different names.  Then run both 
simultaneously via R CMD BATCH.  When both have finished, use an 
interactive session to load() the two sets of results and merge them.
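
For concreteness, a minimal sketch of what the two scripts and the merging 
step might look like; the file names, the seeds and the toy function 
one.sim() are only placeholders for whatever your simulation actually does.

## sim1.R  (sim2.R is identical apart from the seed and the names)
set.seed(101)                       # sim2.R would use a different seed, e.g. 102
one.sim <- function(i) mean(rnorm(100))  # stand-in for one real simulation run
res1 <- sapply(1:5000, one.sim)     # first 5000 of the 10000 runs
save(res1, file = "res1.rda")       # sim2.R saves res2 in "res2.rda"

Run both at once, e.g. from two command windows:

R CMD BATCH sim1.R
R CMD BATCH sim2.R

and when both have finished, merge in an interactive session:

load("res1.rda")
load("res2.rda")
res <- c(res1, res2)                # or rbind() if each script saved a matrix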

> RSiteSearch('coarse grained') did not produce any hits, so this topic might
> not have been discussed on this list.
>
> I am not really familiar with running R in any mode other than the default
> (R-console in Windows) so I might be missing something really obvious. I am
> interested in running Monte-Carlo cross-validation in some sort of a
> parallel mode on a dual core (Pentium D) Windows XP machine.
>
> Thank you.
> -Christos
>
> Christos Hatzis, Ph.D.
> Nuvera Biosciences, Inc.
> 400 West Cummings Park
> Suite 5350
> Woburn, MA 01801
> Tel: 781-938-3830
> www.nuverabio.com
>
>
>
> -----Original Message-----
> From: r-help-bounces at stat.math.ethz.ch
> [mailto:r-help-bounces at stat.math.ethz.ch] On Behalf Of Prof Brian Ripley
> Sent: Wednesday, November 08, 2006 5:29 AM
> To: Stefan Grosse
> Cc: r-help at stat.math.ethz.ch; Taka Matzmoto
> Subject: Re: [R] CPU or memory
>
> On Wed, 8 Nov 2006, Stefan Grosse wrote:
>
>> 64-bit does not make anything faster.  It is only of use if you want to
>> use more than 4 GB of RAM or if you need higher precision for your
>> variables.
>>
>> The dual-core question: dual core is faster if programs are able to
>> use it.  What is sure is that R cannot (as yet) make use of the two
>> cores if you are stuck on Windows.  It works excellently if you use
>> Linux.  So if you want dual core you should work with Linux (and then
>> it is faster, of course).
>
> Not necessarily.  We have seen several examples in which using a
> multithreaded BLAS (the only easy way to make use of multiple CPUs under
> Linux for a single R process) makes things many times slower.  For tasks
> that do not make heavy use of linear algebra, the advantage of a
> multithreaded BLAS is small, and even for those which do the speed-up is
> rarely close to double for a dual-CPU system.
>
> John mentioned simulations.  Often by far the most effective way to use a
> multi-CPU platform (and I have had one as my desktop for over a decade) is
> to use coarse-grained parallelism: run two or more processes each doing some
> of the simulation runs.
>
>> The Core 2 Duo is the fastest processor at the moment, however.
>>
>> (The E6600 has a good price/performance ratio.)
>>
>> What I already told Taka is that it is probably always a good idea to
>> improve your code, for which you could ask on this mailing
>> list... (and I am very sure that there is a lot of potential there).
>> Another possibility for speeding things up is e.g. using the ATLAS
>> library... (though I am not sure whether you already use it).
>>
>> Stefan
>>
>> John C Frain schrieb:
>>> Can I extend Taka's question?
>>>
>>> Many of my programs (mainly simulations in R, which are CPU bound)
>>> on a year-old PC (Intel(R) Pentium(R) M processor 1.73GHz or Dell
>>> GX380 with 2.8GHz Pentium) are taking hours and perhaps days to
>>> complete.  I am looking at an upgrade, but the variety of CPUs
>>> available is confusing, to say the least.  Does anyone know of
>>> comparisons of the Pentium 9x0, Pentium(r) Extreme/Core 2 Duo,
>>> AMD(r) Athlon(r) 64, AMD(r) Athlon(r) 64 FX/Dual Core AM2 and
>>> similar chips when used for this kind of work?  Does anyone have
>>> any advice on (1) the use of a single-core or dual-core CPU, or
>>> (2) the use of a 32-bit vs a 64-bit CPU?  This question is now much
>>> more difficult as the numbers on the various chips do not
>>> necessarily indicate the relative speed of the chips.
>>>
>>> John
>>>
>>> On 06/11/06, Taka Matzmoto <sell_mirage_ne at hotmail.com> wrote:
>>>
>>>
>>>> Hi R users
>>>>
>>>> Having both a faster CPU and more memory will boost computing power.
>>>> I was wondering whether adding more memory alone (1GB -> 2GB) would
>>>> significantly reduce R computation time?
>>>>
>>>> Taka,
>>>>
>>
>
>

-- 
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595


