[R] logistic regression: model revision

Dennis Murphy djmuser at gmail.com
Mon Nov 7 19:30:39 CET 2011


Since you didn't provide a reproducible example, here are a couple of
possibilities to check, though I have no idea whether they apply to
your problem:

     * does costdis1 consist of 0's and 1's?
     * is costdis1 a factor?
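A quick way to check both (assuming the data frame is called glm_1, as
in your first model call):

str(glm_1$costdis1)               # class and a preview of the values
is.factor(glm_1$costdis1)         # TRUE if costdis1 is a factor
all(glm_1$costdis1 %in% c(0, 1))  # TRUE if it contains only 0's and 1's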

In the first model you intended costdis1 to enter as a pure quadratic,
whereas in the second it enters as a linear term, so the two models are
not nested. Modeling a term as a pure quadratic is a very strong
assumption; the more usual practice is to fit both a linear and a
quadratic term in costdis1, which allows more flexibility in the fitted
surface, but that would require costdis1 to be numeric.

HTH,
Dennis

On Mon, Nov 7, 2011 at 7:58 AM, Sally Ann Sims <sallysims at earthlink.net> wrote:
> Hello,
>
> I am working on fitting a logistic regression model to my dataset.  I removed the squared term in the second version of the model, but my model output is exactly the same.
>
> Model version 1:  GRP_GLM<-glm(HB_NHB~elev+costdis1^2,data=glm_1,family=binomial(link=logit))
> summary(GRP_GLM)
>
>
> Model version 2:  QM_1<-glm(HB_NHB~elev+costdis1,data=glm_2,family=binomial(link=logit))
> summary(QM_1)
>
>
> The call in version 2 has changed:
> Call:
> glm(formula = HB_NHB ~ elev + costdis1, family = binomial(link = logit),
>    data = glm_2)
> But I’m getting the exact same results as I did in the model where costdis1 is squared.
>
> Any ideas what I might do to correct this?  Thank you.
>
> Sally