[R] Obtain gradient at multiple values for exponential decay model

J C Nash profjcnash at gmail.com
Sat Apr 7 14:37:41 CEST 2018


R users may want to note that there are packages extending R's symbolic derivative capabilities.

In particular, Duncan Murdoch added some "all in R" symbolic-derivative tools to the package nlsr, which I
maintain. nlsr provides a substitute for the nls() function, which relies on a fairly unsatisfactory forward
difference derivative approximation. Moreover, the solver in nlsr is a variant of the Marquardt-stabilized
Gauss-Newton approach, rather than the plain Gauss-Newton that often gives a "singular gradient" error. Nothing
is free, of course, and the Marquardt approach often uses more iterations when the problem is uncomplicated.
On the other hand, it is rather like a pit bull in trying to find a solution.
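
For illustration, here is a minimal sketch (assuming nlsr is installed; the starting values are my own guesses,
not anything from this thread) fitting an exponential decay to the data quoted further down, with the Jacobian
obtained symbolically by nlxb():

library(nlsr)
graphdta <- data.frame( t = seq( 0, 400, by = 40 ),
                        c = c( 100, 78, 59, 38, 25, 21, 16, 12, 10, 9, 7 ) )
# nlxb() builds analytic derivatives of the model expression and then runs a
# Marquardt-stabilized Gauss-Newton solver
fitnlsr <- nlxb( c ~ c0 * exp( -k * t ), data = graphdta,
                 start = c( c0 = 100, k = 0.01 ) )
fitnlsr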

There is also the Deriv package, likewise written entirely in R. Both it and nlsr allow extensions to the
derivatives table, though I think most users will need to do some homework to become comfortable with doing
that.
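
As a quick sketch of the Deriv package (assuming it is installed), the derivative of the fitted curve
exp(a + b*t) with respect to t can be generated as an ordinary R function:

library(Deriv)
# Deriv() returns a new function that computes b * exp( a + b * t )
dfdt <- Deriv( function( a, b, t ) exp( a + b * t ), "t" )
dfdt
# evaluate at the example points; the coefficients here are illustrative round
# numbers, not the values fitted in the thread below
dfdt( a = 4.6, b = -0.0068, t = c( 100, 250, 300 ) )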

There is also a Google Summer of Code proposal this year to wrap the Julia automatic differentiation tools
for use from R. That would allow derivatives of coded functions to be computed without resorting to
approximations. I believe it will be at least partly successful in achieving its goals.
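
Purely to illustrate the idea (this is not the GSoC project itself), something similar can already be pieced
together with the JuliaCall package, assuming a working Julia installation with ForwardDiff available:

library(JuliaCall)
julia_setup()                      # start an embedded Julia session
julia_library( "ForwardDiff" )
# an arbitrary decay function with an illustrative rate constant
julia_command( "f(t) = 100.0 * exp(-0.007 * t)" )
# exact derivative at t = 100 via forward-mode automatic differentiation,
# with no finite-difference approximation involved
julia_eval( "ForwardDiff.derivative(f, 100.0)" )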

Unfortunately (and I would be happy for someone eventually to prove me wrong), I believe no single tool is
universally applicable.

JN

On 2018-04-07 02:19 AM, Jeff Newmiller wrote:
> I have never found R's symbolic differentiation helpful because my functions are typically quite complicated, but I was
> prompted by Steve Ellison's suggestion to try it out in this case:
> 
> ################# reprex (see reprex package)
> graphdta <- read.csv( text =
> "t,c
> 0,100
> 40,78
> 80,59
> 120,38
> 160,25
> 200,21
> 240,16
> 280,12
> 320,10
> 360,9
> 400,7
> ", header = TRUE )
> 
> nd <- c( 100, 250, 300 )
> graphmodeld <- lm( log(c) ~ t, data = graphdta )
> graphmodelplin <- predict( graphmodeld
>                          , newdata = data.frame( t = nd )
>                          )
> graphmodelp <- exp(graphmodelplin)
> graphmodelp
> #>        1        2        3
> #> 46.13085 16.58317 11.79125
> # derivative of exp( a + b*t ) is b * exp( a + b*t )
> graphmodeldpdt <- coef( graphmodeld )[ 2 ] * graphmodelp
> graphmodeldpdt
> #>           1           2           3
> #> -0.31464113 -0.11310757 -0.08042364
> 
> # Ellison suggestion - fancy, only works for simple functions
> dc <- deriv( expression( exp( a + b * t ) )
>            , namevec = "t"
>            )
> dcf <- function( t ) {
>   cgm <- coef( graphmodeld )
>   a <- cgm[ 1 ]
>   b <- cgm[ 2 ]
>   eval(dc)
> }
> result <- dcf( nd )
> result
> #> [1] 46.13085 16.58317 11.79125
> #> attr(,"gradient")
> #>                t
> #> [1,] -0.31464113
> #> [2,] -0.11310757
> #> [3,] -0.08042364
> attr( result, "gradient" )[ , 1 ]
> #> [1] -0.31464113 -0.11310757 -0.08042364
> #################
> 
> On Fri, 6 Apr 2018, David Winsemius wrote:
> 
>>
>>> On Apr 6, 2018, at 8:03 AM, David Winsemius <dwinsemius using comcast.net> wrote:
>>>
>>>
>>>> On Apr 6, 2018, at 3:43 AM, g l <gnulinux using gmx.com> wrote:
>>>>
>>>>> Sent: Friday, April 06, 2018 at 5:55 AM
>>>>> From: "David Winsemius" <dwinsemius using comcast.net>
>>>>>
>>>>>
>>>>> Not correct. You already have `predict`. It is capable of using the `newdata` values to do interpolation with the
>>>>> values of the coefficients in the model. See:
>>>>>
>>>>> ?predict
>>>>>
>>>>
>>>> The ? details did not mention interpolation explicitly; thanks.
>>>>
>>>>> The original question asked for a derivative (i.e. a "gradient"), but so far it's not clear that you understand the
>>>>> mathematical definition of that term. We also remain unclear whether this is homework.
>>>>>
>>>>
>>>> The motivation for this post was having differentiated manually to get the slope (dy/dx) at a tangent point, then
>>>> wondering how to re-think that in modern-day computing terms. Hence the original question asked for the appropriate
>>>> functions/syntax to read further ("curiosity"), not for the answer itself (which would indeed be "homework"). :)
>>>>
>>>> Personal curiosity should be considered "homework".
>>>
>>> Besides symbolic differentiation, there is also the option of numeric differentiation. Here's an amateurish attempt:
>>>
>>> myNumDeriv <- function(x, model = graphmodeld){
>>>   # forward-difference approximation of the derivative of exp(predict(model)) at t = x
>>>   ( exp( predict( model, newdata = data.frame( t = x + .0001 ) ) ) -
>>>     exp( predict( model, newdata = data.frame( t = x ) ) ) ) / .0001
>>> }
>>> myNumDeriv( c( 100, 250, 350 ) )
>>
>> I realized that this would not work in the context of your construction. I had earlier made a more symbolic version
>> using R formulae:
>>
>> graphdata<-read.csv(text='t,c
>> 0,100
>> 40,78
>> 80,59
>> 120,38
>> 160,25
>> 200,21
>> 240,16
>> 280,12
>> 320,10
>> 360,9
>> 400,7')
>> graphmodeld<-lm(log(c)~t, graphdata)
>> graphmodelp<-exp(predict(graphmodeld))
>> plot(c~t, graphdata)
>> lines(graphdata[,1],graphmodelp)
>> myNumDeriv(c(100, 250, 350), graphmodeld )
>> #----------------------------------------------
>>          1           2           3
>> -0.31464102 -0.11310753 -0.05718414
>>
>>
>>>
>>>
>>>
>>> David Winsemius
>>> Alameda, CA, USA
>>>
>>> 'Any technology distinguishable from magic is insufficiently advanced.'   -Gehm's Corollary to Clarke's Third Law
>>>
>>
>> David Winsemius
>> Alameda, CA, USA
>>
>> 'Any technology distinguishable from magic is insufficiently advanced.'   -Gehm's Corollary to Clarke's Third Law
>>
> 
> ---------------------------------------------------------------------------
> Jeff Newmiller                        The     .....       .....  Go Live...
> DCN:<jdnewmil using dcn.davis.ca.us>        Basics: ##.#.       ##.#.  Live Go...
>                                       Live:   OO#.. Dead: OO#..  Playing
> Research Engineer (Solar/Batteries            O.O#.       #.O#.  with
> /Software/Embedded Controllers)               .OO#.       .OO#.  rocks...1k
> 