performance of apply

Douglas Bates
29 May 1998 10:11:04 -0500

Andreas Weingessel <> writes:

> If I just want to make such a computation once, it might not be
> important whether I wait a couple of seconds or 2 minutes or 5
> minutes (as it needs time for 15000 rows).  But if I do some
> simulation where these computations are repeated a hundred times,
> there is a difference between 200 seconds and 200 minutes.

I should probably have been clearer about what I was saying.  I was
indicating that some computations can be re-phrased in non-obvious
ways that are substantially faster in R.  To a numerical analyst it
would be ridiculous to use a matrix multiplication to collect row sums
because you are doing all those multiplications by 1.  However, in R/S
it is substantially faster because the looping overhead is in C, not
in interpreted code.
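
As a concrete sketch of that trick (the matrix size here is
illustrative, not from the original exchange), compare the obvious
apply() loop with the multiply-by-ones formulation:

```r
## Row sums of a 15000-row matrix, two ways.
## Sizes and names are illustrative only.
x <- matrix(rnorm(15000 * 10), nrow = 15000)

## Obvious approach: interpreted loop over rows
s1 <- apply(x, 1, sum)

## Matrix-multiplication trick: every entry is multiplied by 1,
## which is wasteful arithmetic, but the loop runs in compiled code
s2 <- x %*% rep(1, ncol(x))

## Timings can be compared with system.time()
system.time(apply(x, 1, sum))
system.time(x %*% rep(1, ncol(x)))

## The two results agree up to rounding
all.equal(s1, as.vector(s2))
```

(In later versions of R the dedicated rowSums() function makes this
particular example moot, but the general point about moving loops into
compiled code stands.)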

The other issue I was trying to indicate is that efficiency has to be
measured in how long it takes to get the work done.  This includes
both thinking (and often debugging) time and computing time.  The
trade-offs between the cost of thinking and the cost of computing have
changed remarkably over the years.  I often characterize the way I see
people computing now as using a computer as a blunt instrument with
which to bludgeon the problem to death (can you say "Markov Chain
Monte Carlo"?).  In real world terms it may be more efficient to do
that than to think deeply about the computations involved and how to
speed them up.

I have probably said too much about this.  I should go back to writing
my code, which, ironically, is C code designed to speed up simulations
for a particular type of statistical model :-)
Douglas Bates                  
Statistics Department                    608/262-2598
University of Wisconsin - Madison