[R] nls() vs lm() estimates

Janne Huttunen jmhuttun at stat.berkeley.edu
Fri Jun 13 20:21:52 CEST 2008


Héctor Villalobos wrote:
> Hi,
> 
> I'm trying to understand why the coefficients "a" and "b" for the model W = a*L^b estimated
> via nls() differ from those obtained for the log-transformed model log(W) = log(a) + b*log(L)
> estimated via lm(). Also, if I didn't make a mistake, R-squared suggests a "better" fit
> for the model using the coefficients estimated by lm(). Perhaps I'm doing something wrong in
> nls()?

I haven't tried your code, but in general these estimates are different:
in the former you minimize the norm of the difference W - a*L^b
(where W are the observed values), and in the latter you minimize the
norm of the difference log(W) - (log(a) + b*log(L)). The solutions to
these two problems are, in general, not equal. Which approach you should
choose depends on the error structure: for an additive error model the
former is the better choice, whereas for a multiplicative (log-normal)
error the log-transformed model fitted with lm() is more appropriate.
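For illustration, a minimal sketch with simulated data (the values and
starting estimates below are made up, not taken from your data) showing
that the two approaches generally return different coefficients:

set.seed(1)
L <- runif(100, 10, 50)                          # simulated lengths
W <- 0.01 * L^3 * exp(rnorm(100, sd = 0.2))      # weights with multiplicative error

## Nonlinear least squares on the original scale
fit.nls <- nls(W ~ a * L^b, start = list(a = 0.01, b = 3))
coef(fit.nls)

## Linear regression on the log-log scale
fit.lm <- lm(log(W) ~ log(L))
c(a = exp(coef(fit.lm)[1]), b = coef(fit.lm)[2])

With multiplicative error, as here, the lm() fit on the log scale is the
natural choice; with additive error the nls() fit would be preferred.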



-- 
Janne Huttunen
University of California
Department of Statistics
367 Evans Hall, Berkeley, CA 94720-3860
email: jmhuttun at stat.berkeley.edu
phone: +1-510-502-5205
office room: 449 Evans Hall
