[R] ar.ols() behaviour when time series variance is zero

Aman Verma aman.verma.mtl at gmail.com
Sat Apr 14 20:37:35 CEST 2012


Hello,
When the ar.ols function (in package stats) is called on a series whose 
variance is zero, it fails with an error:

ar.ols(c(1,1,1))
Error in qr.default(x) : NA/NaN/Inf in foreign function call (arg 1)

I believe the reason is that the time series is automatically rescaled 
by its standard deviation in these lines of the function:

sc <- sqrt(drop(apply(x, 2L, var))) # standard deviation; zero for a constant series
x <- x/rep.int(sc, rep.int(n.used, nser)) # x has already been demeaned, so this is 0/0 and x becomes c(NaN, NaN, NaN)

There is no argument that forces the function not to rescale. A simple 
fix would be to verify that the variance is non-zero before rescaling, 
as sketched below.
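
Something along the following lines would make the failure explicit. This 
is only a sketch that reuses the variable names from inside ar.ols(), not 
a tested patch:

sc <- sqrt(drop(apply(x, 2L, var)))
if (any(sc == 0))  # constant series: rescaling would give 0/0
    stop("zero-variance series passed to ar.ols(); cannot rescale")
x <- x/rep.int(sc, rep.int(n.used, nser))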

I understand that fitting an autoregressive model to such a time series 
would not yield meaningful results. However, ar.ols() is often used inside 
other functions that may not be aware that it will fail when the variance 
of the time series is zero. If desired, I can post an example of how using 
a sandwich estimator can lead to ar.ols() being called on a time series 
with zero variance.

Is this a bug? If not, would it be appropriate to document this 
behaviour so that other functions can check their time series before 
passing them to ar.ols()? Perhaps another solution would be to allow 
users to force the function not to rescale the time series.
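
In the meantime, callers can guard against the problem themselves. A 
minimal sketch (safe.ar.ols is just an illustrative name, not an existing 
function):

safe.ar.ols <- function(x, ...) {
    # refuse constant series up front, before ar.ols() tries to rescale them
    if (any(apply(as.matrix(x), 2L, var) == 0))
        stop("at least one series has zero variance; ar.ols() cannot rescale it")
    ar.ols(x, ...)
}
safe.ar.ols(c(1, 1, 1)) # informative error instead of the one from qr.default()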

My sessionInfo():
R version 2.15.0 (2012-03-30)
Platform: i386-pc-mingw32/i386 (32-bit)

locale:
[1] LC_COLLATE=English_Canada.1252  LC_CTYPE=English_Canada.1252
[3] LC_MONETARY=English_Canada.1252 LC_NUMERIC=C
[5] LC_TIME=English_Canada.1252

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base

loaded via a namespace (and not attached):
[1] tools_2.15.0

Thanks,
Aman Verma
