[R] Hardware for R: cpu 64 vs 32, dual vs quad

Henrik Bengtsson hb at stat.berkeley.edu
Tue Sep 9 20:54:59 CEST 2008


On Tue, Sep 9, 2008 at 6:31 AM, Nic Larson <niklar at gmail.com> wrote:
> Need to buy a fast computer for running R on. Today we use a 2.8 GHz Intel D CPU
> and the calculations take around 15 days. Is it possible to get the same
> calculations down to minutes/hours by only changing the hardware?
> Should I go for a really fast dual 32-bit CPU and run R on Linux or XP, or
> go for a quad-core / 64-bit CPU?
> Is it effective to run R on 64 bit (and problem free
> (running/installing))???
> Have around 2000-3000 euro to spend

Faster machines won't do that much.  Without knowing what methods and
algorithms you are running, I bet you a beer that it can be made twice
as fast just by optimizing the code.  My claim applies recursively.
In other words, by optimizing the algorithms/code you can speed things
up quite a bit.  From experience, it is not uncommon to find
bottlenecks in generic algorithms that can be made 10-100 times
faster.  Here is *one* example illustrating that even when you think
the code is "fully optimized" you can still squeeze out more:

  http://wiki.r-project.org/rwiki/doku.php?id=tips:programming:code_optim2
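A classic case of this kind of speedup is replacing an element-by-element loop with a vectorized call.  A minimal sketch (the function and sizes are made up for illustration, not taken from the thread):

```r
# Hypothetical example: the same computation written as an explicit
# loop and as a single vectorized expression.
f_loop <- function(x) {
  y <- numeric(length(x))        # preallocate the result
  for (i in seq_along(x)) {
    y[i] <- x[i]^2 + 1           # one element at a time
  }
  y
}

f_vec <- function(x) x^2 + 1     # one vectorized call, same result

x <- runif(1e6)
stopifnot(isTRUE(all.equal(f_loop(x), f_vec(x))))

# Compare timings; the vectorized version is typically much faster:
print(system.time(f_loop(x)))
print(system.time(f_vec(x)))
```

On a loop this simple the vectorized form usually wins by a wide margin, which is exactly the kind of gain no hardware upgrade will give you.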

So, start profiling your code to narrow down the parts that take most
of the CPU time.  help(Rprof) is a start.  There is also a Section
'Profiling R code for speed' in 'Writing R Extensions'.  Good old
verbose printouts of system.time() also help.
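The profiling workflow above can be sketched as follows; `slow_fn` is a made-up placeholder standing in for whatever dominates your 15-day run:

```r
# Placeholder for the expensive computation you want to profile:
slow_fn <- function(n) {
  m <- matrix(rnorm(n * n), n, n)
  solve(m %*% t(m))              # deliberately heavy linear algebra
}

Rprof("profile.out")             # start sampling profiler, write to file
invisible(slow_fn(200))
Rprof(NULL)                      # stop profiling

# Time spent in each function, attributed to the function itself:
print(summaryRprof("profile.out")$by.self)

# Coarse user/system/elapsed timing of a single call:
print(system.time(slow_fn(200)))
```

The $by.self table is usually the quickest way to spot the one or two functions worth optimizing first.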

My $.02 ...or 2000-3000 USD if it were a bounty?! ;)

/Henrik

> Thanx for any tip
>
>
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>
