[R] Obtaining SE from the hessian matrix

Spencer Graves spencer.graves at pdf.com
Thu Feb 19 20:21:39 CET 2004


      Minor correction:  Most likely, Prof. Lumley's statement is
correct.  However, as I'm sure he knows, it depends on what you are
maximizing or minimizing:  If you are maximizing the log(likelihood),
then the NEGATIVE of the hessian is the "observed information".  The
observed information should be positive semi-definite, and if it is
nonsingular, its inverse is the covariance matrix of the large-sample
normal approximation to the distribution of the estimates.
Alternatively, if you MINIMIZE a "deviance" = (-2)*log(likelihood),
then HALF of the hessian is the observed information.  In the unlikely
event that you are maximizing the likelihood itself, you need to divide
the negative of the hessian by the likelihood to get the observed
information (this works at the maximum, because the gradient of the
likelihood vanishes there).
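
      A worked sketch of the first two cases (not from the original
post: the exponential toy data, the parameter lambda, and the calls to
optim() below are illustrative assumptions only):

set.seed(1)
x <- rexp(100, rate = 2)        # toy data, true rate = 2

negloglik <- function(lambda) -sum(dexp(x, rate = lambda, log = TRUE))
deviance  <- function(lambda) 2 * negloglik(lambda)

## Case 1: minimize the negative log(likelihood); the hessian that
## optim() returns is then itself the observed information.
fit1 <- optim(1, negloglik, method = "BFGS", hessian = TRUE)
se1  <- sqrt(diag(solve(fit1$hessian)))

## Case 2: minimize the deviance = (-2)*log(likelihood); now HALF of
## the returned hessian is the observed information.
fit2 <- optim(1, deviance, method = "BFGS", hessian = TRUE)
se2  <- sqrt(diag(solve(fit2$hessian / 2)))

c(se1, se2)    # the two standard errors should agree up to numerical error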

      hope this helps.  spencer graves

Thomas Lumley wrote:

>On Thu, 19 Feb 2004, Timur Elzhov wrote:
>  
>
>>So, what is the _right_ way of obtaining SE?  Why do those two
>>formulas above differ?
>
>If you are maximising a likelihood then the covariance matrix of the
>estimates is (asymptotically) the inverse of the negative of the Hessian.
>
>The standard errors are the square roots of the diagonal elements of the
>covariance matrix.
>
>So if you have the Hessian you need to invert it; if you have the
>covariance matrix, you don't.
>
>	-thomas
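
      A similarly hedged sketch of the recipe in Prof. Lumley's reply
(again a toy example, not from the thread: the normal data, the
log(sigma) parameterization, and the object names are illustrative
only):

set.seed(2)
y <- rnorm(50, mean = 10, sd = 3)

## negative log(likelihood) in (mu, log sigma); working with log(sigma)
## keeps the optimization unconstrained
negll <- function(p) -sum(dnorm(y, mean = p[1], sd = exp(p[2]), log = TRUE))

fit <- optim(c(mean(y), log(sd(y))), negll, method = "BFGS", hessian = TRUE)

## fit$hessian is the hessian of the NEGATIVE log(likelihood), i.e. it is
## already "the negative of the Hessian" referred to above, so:
vcov <- solve(fit$hessian)      # asymptotic covariance matrix
se   <- sqrt(diag(vcov))        # SEs for mu and log(sigma)

se[1]                           # SE of the mean ...
sd(y)/sqrt(length(y))           # ... close to the textbook sd/sqrt(n)

Note that because the standard deviation was parameterized on the log
scale, the second standard error is for log(sigma), not for sigma
itself.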



