# [Rd] Seeking opinions on possible change to nls() code

Martin Maechler  maechler at stat.math.ethz.ch
Fri Aug 20 18:14:51 CEST 2021

```
>>>>> J C Nash
>>>>>     on Fri, 20 Aug 2021 11:41:26 -0400 writes:

> Thanks Martin. I'd missed the intention of that option,
> but re-reading it now it is obvious.

> FWIW, this problem is quite nasty, and so far I've found
> no method that reveals the underlying dangers well. And
> one of the issues with nonlinear models is that they
> reveal how slippery the concept of inference can be when
> applied to parameters in such models.

> JN

Indeed.

Just for the public (and those reading the archives in the future).

When Doug Bates and his PhD student José Pinheiro wrote
"the NLME book"  (<==> Recommended R package {nlme}
https://cran.R-project.org/package=nlme )

José C. Pinheiro and  Douglas M. Bates
Mixed-Effects Models in S and S-PLUS
Springer-Verlag (January 2000)

They teach quite a bit about non-linear regression there, much of
which seems little known or taught nowadays.

NOTABLY, they teach self-starting models, something fantastic that is
available in R together with nls() but unfortunately *also* little
known and rarely taught!
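
As a small hedged illustration of what "self-starting" means, here is
the builtin DNase assay data fitted with the SSlogis() logistic model;
getInitial() shows the starting values that nls() computes
automatically from the data, so no 'start' argument is needed:

## self-starting logistic fit, run 1 of the builtin DNase data:
DN1 <- subset(DNase, Run == 1)
## the starting values a self-starting model derives on its own:
getInitial(density ~ SSlogis(log(conc), Asym, xmid, scal), data = DN1)
## the same initial-value computation happens inside nls() itself:
fmDN <- nls(density ~ SSlogis(log(conc), Asym, xmid, scal), data = DN1)
coef(fmDN)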

I vaguely remember having improved the help pages, notably the
examples for these, in the distant past.

Your present 9-point example can indeed also be solved beautifully
by R's builtin  SSbiexp()  [self-starting bi-exponential model]:
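
For reference: SSbiexp(input, A1, lrc1, A2, lrc2) encodes the
bi-exponential mean function

    A1*exp(-exp(lrc1)*input) + A2*exp(-exp(lrc2)*input) ,

i.e., both rate constants are parametrized on the log scale.  A quick
check, with arbitrary illustrative parameter values:

tch <- c(1, 2, 4)
all.equal(c(SSbiexp(tch, 1, -1, 2, 0.5)),  # c() drops the gradient attribute
          1*exp(-exp(-1)*tch) + 2*exp(-exp(0.5)*tch))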

NLSdata <- data.frame(
  time = c(  1,   2,   3,   4,   6,   8,  10,  12,  16),
  conc = c(0.7, 1.2, 1.4, 1.4, 1.1, 0.8, 0.6, 0.5, 0.3))

## Once you realize that the above is the "simple"  bi-exponential model,
## you should remember  SSbiexp(),  and then

"everything is easy"

try4 <- nls(conc ~ SSbiexp(time, A1, lrc1, A2, lrc2), data = NLSdata,
            trace = TRUE, control = nls.control(warnOnly = TRUE))
## --> converges nicely and starts much better anyway:
## 0.1369091   (2.52e+00): par = (-0.7623289 -2.116174 -2.339856 2.602446)
## 0.01160784  (4.97e-01): par = (-0.1988961 -1.974059 -3.523139 2.565856)
## 0.01016776  (1.35e-01): par = (-0.3653394 -1.897649 -3.547569 2.862685)
## 0.01005199  (3.22e-02): par = (-0.3253514 -1.909544 -3.55429 2.798951)
## 0.01004574  (8.13e-03): par = (-0.336659 -1.904219 -3.559615 2.821439)
## 0.01004534  (2.08e-03): par = (-0.3338447 -1.905399 -3.558815 2.816159)
## 0.01004532  (5.30e-04): par = (-0.3345701 -1.905083 -3.559067 2.817548)
## 0.01004531  (1.36e-04): par = (-0.3343852 -1.905162 -3.559006 2.817195)
## 0.01004531  (3.46e-05): par = (-0.3344325 -1.905142 -3.559022 2.817286)
## 0.01004531  (8.82e-06): par = (-0.3344204 -1.905147 -3.559018 2.817263)
## 0.01004531  (7.90e-06): par = (-3.559018 -0.3344204 2.817263 -1.905147)

## even adding central differences and 'scaleOffset' does not make a big difference:
try5 <- nls(conc ~ SSbiexp(time, A1, lrc1, A2, lrc2), data = NLSdata,
            trace = TRUE,
            control = nls.control(warnOnly = TRUE, nDcentral = TRUE, scaleOffset = 1))
## 0.1369091     (1.43e-01): par = (-0.7623289 -2.116174 -2.339856 2.602446)
## ....
## 0.01004531    (5.43e-06): par = (-3.559006 -0.3343852 2.817195 -1.905162)
fitted(try5)
##  0.6880142 1.2416734 1.3871354 1.3503718 1.1051246 0.8451185 0.6334280 0.4717800
##  0.2604932

all.equal(  coef(try4),   coef(try5)) # "Mean relative difference: 1.502088e-05"
all.equal(fitted(try4), fitted(try5)) # "Mean relative difference: 2.983784e-06"
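
Given JN's remark above about how slippery inference on such
parameters can be, the asymptotic standard errors from only 9
observations are also worth a look (with no claim that they are
trustworthy here):

summary(try4)$coefficients  ## 4 x 4 matrix: Estimate, Std. Error, t value, Pr(>|t|)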

## and a nice plot:
plot(NLSdata, ylim = c(0, 1.5), pch=21, bg="red")
abline(h=0, lty=3, col="gray")
lines(NLSdata$time, fitted(try5), lty = 2, lwd = 1/2, col = "orange")
tt <- seq(0, 17, by=1/8)
str(pp <- predict(try5, newdata = list(time = tt)))
## num [1:137] -0.7418 -0.4891 -0.2615 -0.0569 0.1269 ...
## - attr(*, "gradient")= num [1:137, 1:4] 1 0.914 0.836 0.765 0.699 ...
##  ..- attr(*, "dimnames")=List of 2
##  .. ..$ : NULL
##  .. ..$ : chr [1:4] "A1" "lrc1" "A2" "lrc2"
lines(tt, pp, col=4)

```