[R] residuals in logistic regression model

P Ehlers ehlers at math.ucalgary.ca
Fri Nov 25 16:07:44 CET 2005


Urania,

I'm not very fond of "putting additive residuals on the right-hand side".
This practice tends to obscure the fact that we're fitting a conditional
mean function:

  E(Y|x) = function(x; parameters)

We then need to assess the model fit and the uncertainties of the
parameter estimates. We may want to consider an additive error structure
as part of that assessment (e.g. least-squares regression). In linear
regression this is often written as an error term on the RHS of the model
equation, probably because it is convenient to do so.
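
A minimal sketch of that point (simulated data; names purely illustrative):

  ## in ordinary least-squares regression, the additive-error form
  ##   y = b0 + b1*x + e
  ## and the conditional-mean form E(Y|x) = b0 + b1*x are the same fit;
  ## the "errors" written on the RHS are just the response residuals
  set.seed(1)
  x  <- rnorm(50)
  y  <- 2 + 3 * x + rnorm(50)
  fm <- lm(y ~ x)
  all.equal(as.numeric(residuals(fm)), as.numeric(y - fitted(fm)))  # TRUE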

For logistic regression, model fit may be assessed by the deviance,
which can be viewed as a sum of squared deviance residuals. But the
model also assumes a dispersion parameter of 1. This assumption is
assessed (in R) with the Pearson residuals. Further, the fitting
method is iterative (IWLS), so R also gives us the "working" residuals
of the final fit.
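
A quick illustration of these residual types for a glm fit (hypothetical
simulated data):

  ## simulate binary data and fit a logistic regression
  set.seed(1)
  x   <- rnorm(100)
  p   <- plogis(-0.5 + 1.2 * x)                # true P(Y = 1 | x)
  y   <- rbinom(100, size = 1, prob = p)
  fit <- glm(y ~ x, family = binomial)

  ## the residual types discussed above (see ?residuals.glm)
  r.dev  <- residuals(fit, type = "deviance")  # default; sum(r.dev^2) equals the deviance
  r.pear <- residuals(fit, type = "pearson")   # used to check the dispersion assumption
  r.work <- residuals(fit, type = "working")   # residuals from the final IWLS iteration
  r.resp <- residuals(fit, type = "response")  # simply y - fitted(fit)
  all.equal(sum(r.dev^2), deviance(fit))       # TRUE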

Peter Ehlers

Urania Sun wrote:

>  Thanks a lot, Professor.
> 
> Now I know that if I put additive residuals on the right-hand side of the
> logistic regression equation, they are different from any of the residuals
> returned by glm.
> 
> But is it ever OK or legitimate to put additive residuals on the right-hand
> side of the logistic equation, whatever they are, when I write the left-hand
> side as the log odds-ratio of the proportion instead of the probability?
> 
> I am quite confused about this. Thanks a lot. Your book in the library is
> currently checked out by someone else. I may try to get a copy of my own.
> 
> 
> On 11/24/05, John Fox <jfox at mcmaster.ca> wrote:
> 
>>Dear Urania,
>>
>>
>>>-----Original Message-----
>>>From: Urania Sun [mailto: suncertain at gmail.com]
>>>Sent: Thursday, November 24, 2005 8:52 PM
>>>To: John Fox
>>>Cc: r-help at stat.math.ethz.ch
>>>Subject: Re: [R] residuals in logistic regression model
>>>
>>>Thanks, Professor.
>>>
>>>But is it OK to write residuals on the right-hand side of the
>>>logistic regression formula? Some people said I cannot, since
>>>a generalized linear model uses a function to link the
>>>expectation to a linear predictor, so there should not be
>>>residuals on the right-hand side.
>>>
>>>My question is: if residuals do exist (as in the glm
>>>model output), why not put them in the formula (for example,
>>>if I write the left-hand side as the estimated odds-ratio)?
>>>
>>
>>There are several kinds of residuals for generalized linear models, as I
>>mentioned (see ?residuals.glm). The residuals in the glm output are
>>deviance
>>residuals, which are the casewise (signed) components of the residual
>>deviance; differences between y and fitted-y are called response residuals
>>(and aren't generally as useful). The left-hand side of a logit model
>>transformed with the logit-link is the log-odds, not the odds or
>>odds-ratio.
>>The form of the model to which the response residuals apply has the
>>proportion, not the logit, on the left-hand side.
>>
>>These matters are discussed in the references given in ?residuals.glm, and
>>in many other places, such as Sec. 6.6 of my R and S-PLUS Companion to
>>Applied Regression.
>>
>>
>>>Many thanks!
>>>
>>>Happy Thanksgiving!
>>
>>Unfortunately we celebrate Thanksgiving in Canada in October, probably
>>because the weather here in late November leaves little to be thankful
>>for.
>>
>>Regards,
>>John
>>
>>
>>>On 11/24/05, John Fox <jfox at mcmaster.ca> wrote:
>>>
>>>      Dear Urania,
>>>
>>>      The residuals method for glm objects can compute
>>>several kinds of residuals;
>>>      the default is deviance residuals. See ?residuals.glm
>>>for details and
>>>      references.
>>>
>>>      I hope this helps.
>>>      John
>>>
>>>      --------------------------------
>>>      John Fox
>>>      Department of Sociology
>>>      McMaster University
>>>      Hamilton, Ontario
>>>      Canada L8S 4M4
>>>      905-525-9140x23604
>>>      http://socserv.mcmaster.ca/jfox
>>>      --------------------------------
>>>
>>>      > -----Original Message-----
>>>      > From: r-help-bounces at stat.math.ethz.ch
>>>      > [mailto: r-help-bounces at stat.math.ethz.ch] On Behalf
>>>Of Urania Sun
>>>      > Sent: Thursday, November 24, 2005 1:36 PM
>>>      > To: r-help at stat.math.ethz.ch
>>>      > Subject: [R] residuals in logistic regression model
>>>      >
>>>      > In the logistic regression model, there is no residual term:
>>>      >
>>>      > log (pi/(1-pi)) = beta_0 + beta_1*X_1 + .....
>>>      >
>>>      > But the glm model will return
>>>      >
>>>      > residuals
>>>      >
>>>      > What are they?
>>>      >
>>>      > How should I understand this? Can we put a residual in the
>>>      > logistic regression model by replacing pi with pi' (the
>>>      > estimated pi)?
>>>      >
>>>      >  log (pi'/(1-pi')) = beta_0 + beta_1*X_1 + ..... + ei
>>>      >
>>>      > Thanks!
>>>      >
> 
> 
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html



