[R] Safeguarded Newton method for function minimization

J C Nash profjcnash at gmail.com
Tue Apr 18 19:32:52 CEST 2017


Recently, Marie Boehnstedt reported a bug in the nlm() function for function minimization
that appears when both the gradient and the hessian are provided. She has supplied a
work-around for some cases, and it seems this will eventually be incorporated into the
R function.

However, despite the great number of packages on CRAN, there does not appear to be a
straightforward Newton approach to function minimization. This may be because writing
the code for a hessian (the matrix of second partial derivatives) is laborious and
error-prone. (R could also use some good tools for Automatic Differentiation.) I have
also noticed that a number of researchers try to implement textbook methods and run into
trouble when the mathematics and the computation are not quite in sync. Therefore, I
wrote a simple safeguarded Newton method and put a small package on R-forge at

https://r-forge.r-project.org/R/?group_id=395

Note that Newton's method is, strictly speaking, a method for solving nonlinear
equations. For function minimization, we apply it to solve g(x) = 0, where g is the
gradient of the objective f and x is the vector of parameters. The safeguards, in part,
ensure that f(x) is reduced at each step, which avoids some of the difficulties that
can arise when the hessian is not positive definite.
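
To make the idea concrete, one safeguarded step looks roughly like the sketch below.
This is a simplified illustration, not the code in the package: the function names and
the crude step-halving line search are assumptions made for the example.

newton_step <- function(f, gr, hess, x) {
  g <- gr(x)
  H <- hess(x)
  ## Newton direction solves H d = -g; this can fail, or give an uphill
  ## direction, when H is not positive definite
  d <- tryCatch(solve(H, -g), error = function(e) -g)
  if (sum(d * g) >= 0) d <- -g  # fall back to steepest descent
  ## safeguard: halve the step until the objective actually decreases
  step <- 1
  f0 <- f(x)
  while (f(x + step * d) >= f0 && step > 1e-10) step <- step / 2
  x + step * d
}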

In the package, I have a very simple quadratic test, the Rosenbrock test function, and
the Wood test function. The method fails on the Wood function: the hessian is not
positive definite at the point where it stops.
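
For reference, the Rosenbrock function with its analytic gradient and hessian can be
coded as below. The names f.rosen, g.rosen, and h.rosen are just for this illustration;
the package documentation gives the actual calling convention.

f.rosen <- function(x) {
  100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
}
g.rosen <- function(x) {
  c(-400 * x[1] * (x[2] - x[1]^2) - 2 * (1 - x[1]),
     200 * (x[2] - x[1]^2))
}
h.rosen <- function(x) {
  matrix(c(1200 * x[1]^2 - 400 * x[2] + 2, -400 * x[1],
           -400 * x[1],                     200),
         nrow = 2, byrow = TRUE)
}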

Before submitting this package to CRAN, I would like to see its behaviour on other
test problems, but I am lazy enough to want to avoid writing the hessian code myself.
If anyone has such code, it would be very welcome; please contact me off-list. If I get
some workable examples that are open for public view, I'll report back here.
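
Anyone preparing such code can cross-check analytic derivatives against numerical
approximations, for example with the numDeriv package. Using the Rosenbrock functions
above (again, just an illustration):

library(numDeriv)
x0 <- c(-1.2, 1)  # the standard Rosenbrock starting point
max(abs(g.rosen(x0) - grad(f.rosen, x0)))     # should be essentially zero
max(abs(h.rosen(x0) - hessian(f.rosen, x0)))  # likewise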

John Nash


