[R] nls() vs lm() estimates

Janne Huttunen jmhuttun at stat.berkeley.edu
Fri Jun 13 20:29:32 CEST 2008

Janne Huttunen wrote:
> Héctor Villalobos wrote:
>> Hi,
>> I'm trying to understand why the coefficients "a" and "b" for the
>> model W = a*L^b estimated via nls() differ from those obtained for the
>> log-transformed model log(W) = log(a) + b*log(L) estimated via lm().
>> Also, if I didn't make a mistake, R-squared suggests a "better" fit
>> for the model using the coefficients estimated by lm(). Perhaps I'm
>> doing something wrong in nls()?
> I didn't try your code, but in general these estimates are different:
> the former minimizes the norm of the difference W - a*L^b (where W are
> the observed weights) and the latter minimizes the norm of the
> difference log(W) - (log(a) + b*log(L)). The solutions of these
> problems are equal. Which approach you should choose depends on the
> error structure: for an additive error model, the former is the better
> choice.

I should read what I have written before sending my message. I meant
that the solutions of these problems are NOT equal (in general), and
therefore the estimates differ.
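A minimal sketch (not from the original thread, using simulated data)
showing that the two approaches yield different (a, b) estimates, since
nls() minimizes residuals on the original scale while lm() minimizes
them on the log-log scale:

```r
# Simulated length-weight data with additive error
# (true a = 0.01, b = 3; values chosen for illustration only)
set.seed(1)
L <- seq(10, 50, length.out = 40)
W <- 0.01 * L^3 + rnorm(40, sd = 2)

# Nonlinear least squares on the original scale:
# minimizes sum((W - a*L^b)^2)
fit_nls <- nls(W ~ a * L^b, start = list(a = 0.01, b = 3))

# Linear least squares on the log-log scale:
# minimizes sum((log(W) - (log(a) + b*log(L)))^2)
fit_lm <- lm(log(W) ~ log(L))
a_lm <- exp(coef(fit_lm)[1])
b_lm <- coef(fit_lm)[2]

# The two (a, b) pairs differ, because different norms are minimized
coef(fit_nls)
c(a = a_lm, b = b_lm)
```

Both fits recover estimates near the true values here, but they are not
identical; with additive error (as simulated above) the nls() fit is
the statistically appropriate one.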

Janne Huttunen
University of California
Department of Statistics
367 Evans Hall Berkeley, CA 94720-3860
email: jmhuttun at stat.berkeley.edu
phone: +1-510-502-5205
office room: 449 Evans Hall
