# [Rd] reason for odd timings

J C Nash profjcnash at gmail.com
Sat Jan 22 02:51:08 CET 2022

Occasionally I run some rather trivial timings to get an idea of what might
be the best way to compute some quantities.

The program below gave timings for sums of squares of 100 elements much greater
than those for 1000, which seems surprising. Does anyone know the cause of this?

This isn't holding up my work. Just causing some head scratching.

JN

```
> source("sstimer.R")
n  	  t(forloop) : ratio 	  t(sum) : ratio 	 t(crossprod) 	 all.equal
100 	 38719.15  :  1.766851 	 13421.12  :  0.6124391 	 21914.21 	 TRUE
1000 	 44722.71  :  20.98348 	 3093.94  :  1.451648 	 2131.33 	 TRUE
10000 	 420149.9  :  42.10269 	 27341.6  :  2.739867 	 9979.17 	 TRUE
1e+05 	 4070469  :  39.89473 	 343293.5  :  3.364625 	 102030.2 	 TRUE
1e+06 	 42293696  :  33.27684 	 3605866  :  2.837109 	 1270965 	 TRUE
1e+07 	 408123066  :  29.20882 	 35415106  :  2.534612 	 13972596 	 TRUE
>
```

```r
# sstimer.R -- compare three ways of computing a sum of squares:
# an explicit for loop, vectorized sum(vv^2), and crossprod(vv)
library(microbenchmark)

suml <- function(vv) {   # explicit loop
  ss <- 0.0
  for (i in seq_along(vv)) {   # seq_along() is safe for zero-length vectors
    ss <- ss + vv[i]^2
  }
  ss
}
sums <- function(vv) {   # vectorized sum
  sum(vv^2)
}
sumc <- function(vv) {   # crossprod() returns a 1x1 matrix; coerce to scalar
  as.numeric(crossprod(vv))
}

ll <- c(100, 1000, 10000, 100000, 1000000, 10000000)
cat(" n  \t  t(forloop) : ratio \t  t(sum) : ratio \t t(crossprod) \t all.equal \n")
for (nn in ll) {
  set.seed(1234)
  vv <- runif(nn)
  tsuml <- microbenchmark(sl <- suml(vv), unit = "us")
  tsums <- microbenchmark(ss <- sums(vv), unit = "us")
  tsumc <- microbenchmark(sc <- sumc(vv), unit = "us")
  ml <- mean(tsuml$time)   # $time is always in nanoseconds
  ms <- mean(tsums$time)
  mc <- mean(tsumc$time)
  # all.equal() compares only two objects; its third positional argument is
  # the tolerance, so all.equal(sl, ss, sc) would silently use sc as the
  # tolerance. Check the three results pairwise instead.
  ok <- isTRUE(all.equal(sl, ss)) && isTRUE(all.equal(ss, sc))
  cat(nn, "\t", ml, " : ", ml/mc, "\t", ms, " : ", ms/mc, "\t", mc, "\t", ok, "\n")
}
```
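One pitfall worth flagging in the script above: `all.equal()` compares exactly two objects, and its third positional argument is the numerical `tolerance`, not a third value to compare. A call like `all.equal(sl, ss, sc)` therefore tests `sl` against `ss` with the (large) value `sc` as the tolerance, which all but guarantees `TRUE`. A minimal sketch of the distinction (the helper `agree3` is an illustrative name, not part of the original script):

```r
# all.equal()'s third positional argument is the tolerance
# (default about 1.5e-8 relative), not a third object to compare.
all.equal(1, 2)        # "Mean relative difference: 1" -- a string, not TRUE
all.equal(1, 2, 10)    # TRUE: the 10 is taken as the tolerance!

# To check that three sums of squares agree, compare pairwise:
agree3 <- function(x, y, z) {
  isTRUE(all.equal(x, y)) && isTRUE(all.equal(y, z))
}
agree3(1, 1, 1)        # TRUE
agree3(1, 2, 3)        # FALSE
```

Wrapping each comparison in `isTRUE()` matters because `all.equal()` returns a character string, not `FALSE`, on mismatch.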
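Two methodology points may bear on the odd n = 100 numbers. First, `microbenchmark`'s `$time` column is always in nanoseconds; `unit = "us"` only affects how results are printed, so any summary computed from `$time` must be converted by hand. Second, the mean is sensitive to one-off spikes in the first few samples, and one plausible source of such spikes is R's default JIT byte-compiling a closure like `suml` on an early call. A hedged sketch of two easy controls, a warm-up call (or explicit `compiler::cmpfun()`) before timing and the median alongside the mean (`suml_c`, `med_us`, `mean_us` are illustrative names):

```r
library(microbenchmark)
library(compiler)

suml <- function(vv) {
  ss <- 0.0
  for (i in seq_along(vv)) ss <- ss + vv[i]^2
  ss
}

vv <- runif(100)

suml(vv)                 # warm-up call, so any JIT compilation cost is paid here
suml_c <- cmpfun(suml)   # or byte-compile explicitly up front

tb <- microbenchmark(suml_c(vv), unit = "us")
# $time is in nanoseconds regardless of unit=; convert manually
med_us  <- median(tb$time) / 1000   # median is robust to one-off spikes
mean_us <- mean(tb$time) / 1000
cat("median:", med_us, "us  mean:", mean_us, "us\n")
```

If the n = 100 anomaly disappears after a warm-up run, first-call overhead rather than the arithmetic itself was being measured.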