[R] regression coefficients

Spencer Graves spencer.graves at pdf.com
Tue May 20 17:55:27 CEST 2003


Dear Prof. Ripley:  Of course, you are correct on both counts.  Thanks 
for the correction and elaboration.  spencer graves

Prof Brian Ripley wrote:
> Why is s assumed known and common to the k groups?  I doubt if that is 
> what was meant (although it was too imprecise to be at all sure).
> 
> If `common' is a viable assumption, you can just fit a model with by-group
> regressions vs one with a common regression (which seems to be what you
> are testing) and use anova().
> 
> If not, the case k=2 encompasses the Welch t-test so exact distribution
> theory is not going to be possible, but by fitting a common model and
> three separate models and summing the -2log-lik for the latter you can
> easily get the LR test and refer it to its `standard' (asymptotic)
> Chi-squared distribution.
> 
> On Tue, 20 May 2003, Spencer Graves wrote:
> 
> 
>>	  I don't know of a simple function to do what you want, but I can give 
>>you part of the standard log(likelihood ratio) theory:
>>
>>	  Suppose b[i]|s ~ N.r(b, s^2*W[i]), i = 1, ..., k.  Then the 
>>log(likelihood) is a sum of k terms of the following form:
>>
>>	  l[i] = (-0.5)*(r*log(2*pi*s^2) + log(det(W[i]))
>>	      + (s^-2)*t(b[i]-b)%*%solve(W[i])%*%(b[i]-b))
>>
>>By differentiating with respect to b and setting to 0, we get the 
>>maximum likelihood estimate for b as follows:
>>
>>	  b.hat = solve(sum(solve(W[i]))) %*% sum(solve(W[i])%*%b[i])
>>
>>In words:  b.hat is a weighted average with weights inversely proportional 
>>to the variances.  Then -2*log(likelihood ratio) is as follows:
>>
>>	  LR.stat = sum((s^-2)*t(b[i]-b.hat)%*%solve(W[i])%*%(b[i]-b.hat))
>>
>>This problem should be in most good books on multivariate analysis.  I 
>>would guess that LR.stat probably has an F distribution with numerator 
>>degrees of freedom = r*(k-1) and with denominator degrees of freedom = 
>>degrees of freedom in the estimate of s.  However, I don't remember for 
>>sure.  It's vaguely possible that this is an "unsolved" problem.  In the 
>>latter case, you should have all the pieces here to run a Monte 
>>Carlo simulation.
> 
> 
> You have assumed s is known, in which case it is a Chi-squared 
> distribution.  If s is unknown, you need to maximize over it to get an LR 
> test (separately under the null and the alternative).
> 
> 
>>lamack lamack wrote:
>>
>>>Dear all, how can I compare regression coefficients across three (or 
>>>more) groups?
>>
>
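
For concreteness, Prof. Ripley's first suggestion (by-group regressions vs. a 
common regression, compared with anova() under a common error variance) might 
look roughly like this in R.  This is only an illustrative sketch: the data 
frame d, with response y, predictor x, and grouping factor g, is assumed and 
does not come from the original posts.

    fit.common  <- lm(y ~ x, data = d)    # one regression for all groups
    fit.bygroup <- lm(y ~ g/x, data = d)  # separate intercept and slope per group
    anova(fit.common, fit.bygroup)        # F test for equal coefficients across groups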

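If a common error variance cannot be assumed, the likelihood-ratio comparison 
he describes (a common fit vs. separate per-group fits, with the difference in 
-2*log-likelihood referred to its asymptotic Chi-squared distribution) could be 
computed along the following lines, again with the names d, y, x, and g assumed 
for illustration.

    fit.common <- lm(y ~ x, data = d)
    fits.sep   <- lapply(split(d, d$g), function(di) lm(y ~ x, data = di))
    LR <- -2 * (as.numeric(logLik(fit.common)) - sum(sapply(fits.sep, logLik)))
    k  <- length(fits.sep)
    df <- (k - 1) * (length(coef(fit.common)) + 1)  # extra intercepts, slopes, variances
    pchisq(LR, df, lower.tail = FALSE)              # asymptotic p-value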

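Finally, when only the per-group coefficient vectors and their scaled 
covariance matrices are available (with s treated as known, as in the quoted 
derivation), the weighted-average estimate b.hat and the Chi-squared statistic 
can be computed directly.  The names b.list (a list of k coefficient vectors 
of length r), W.list (the matching r x r matrices), and the scalar s are 
assumed for illustration.

    Winv  <- lapply(W.list, solve)   # the W[i]^-1 weight matrices
    b.hat <- solve(Reduce(`+`, Winv)) %*% Reduce(`+`, Map(`%*%`, Winv, b.list))
    stat  <- sum(mapply(function(bi, Wi) drop(t(bi - b.hat) %*% Wi %*% (bi - b.hat)),
                        b.list, Winv)) / s^2
    pchisq(stat, df = (length(b.list) - 1) * length(b.hat), lower.tail = FALSE)

The same pieces could be reused for a Monte Carlo check of the null 
distribution, as suggested in the quoted message.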

