[R] Percentage effects in logistic regression

Roberto Patuelli roberto.patuelli at usi.ch
Mon Nov 9 19:54:27 CET 2009


Dear Daniel,

Thanks for your prompt reply.
Indeed, I was aware of the possibility of computing at mean(x) or taking the 
mean afterwards.
But what you suggest are marginal effects, right? Isn't that the effect on y 
of a 1-unit increase in x (which is not what I'm interested in)? I'm interested 
in the effect on y of a 1% increase in x (the "percentage effect", or elasticity, right?).
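
To make concrete what I'm after, here is a rough sketch based on your example 
quoted below (reg and x are the objects from that code; I'm assuming the 
elasticity is defined as (dp/dx)*(x/p) and, as in your second way, averaged 
over the observations):

#sketch only: average elasticity of Pr(y=1) with respect to x
#for the logit, dp/dx = p*(1-p)*beta, so the elasticity (dp/dx)*(x/p) is beta*x*(1-p)
elas=coefficients(reg)[2]*x*(1-fitted(reg))
mean(elas)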

Could you please clarify?

Thanks
Roberto


----- Original Message ----- 
From: "Daniel Malter" <daniel at umd.edu>
To: "Patuelli Roberto" <roberto.patuelli at usi.ch>; <r-help at r-project.org>
Sent: Monday, November 09, 2009 7:44 PM
Subject: AW: [R] Percentage effects in logistic regression


Somebody might have done this, but in fact it's not difficult to compute the
marginal effects yourself (which is the beauty of R). For a univariate
logistic regression, I illustrate two ways to compute the marginal effects
(one corresponds to Stata's mfx command, the other to its margeff command).
With the first you compute the marginal effect at the mean fitted value; with
the second you compute the marginal effect at the fitted value of each
observation and then average over the individual marginal effects. The second
way is often considered better. You can easily extend the R code below to a
multivariate regression.

#####
#####Simulate data and run regression
#####

set.seed(343)
x=rnorm(100,0,1)      #linear predictor
lp=exp(x)/(1+exp(x)) #probability
y=rbinom(100,1,lp) #Bernoulli draws with probability lp

#Run logistic regression
reg=glm(y~x,binomial)
summary(reg)

#####
#####Regression output
#####

Coefficients:
            Estimate Std. Error z value Pr(>|z|)
(Intercept)   0.1921     0.2175   0.883 0.377133
x             0.9442     0.2824   3.343 0.000829 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

(Dispersion parameter for binomial family taken to be 1)

    Null deviance: 138.47  on 99  degrees of freedom
Residual deviance: 125.01  on 98  degrees of freedom
AIC: 129.01

#####
#####Compute marginal effects
#####

#Way 1: marginal effect evaluated at the mean fitted probability (cf. Stata's mfx)
mean(fitted(reg))*mean(1-fitted(reg))*coefficients(reg)[2]

0.2356697

#Way 2: average of the observation-level marginal effects (cf. Stata's margeff)
mean(fitted(reg)*(1-fitted(reg))*coefficients(reg)[2])

0.2057041
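
As a sketch of the multivariate extension mentioned above (x2 is a
hypothetical second covariate added purely for illustration; the same
average-marginal-effect formula is applied coefficient by coefficient):

#Hypothetical second covariate (illustration only)
x2=rnorm(100,0,1)
reg2=glm(y~x+x2,binomial)

#Way 2 for each covariate: mean over observations of p*(1-p)*beta_j
pme=fitted(reg2)*(1-fitted(reg2))
sapply(coefficients(reg2)[-1],function(b) mean(pme*b))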


#####
#####Check with Stata
#####

Logistic regression                               Number of obs   =        100
                                                  LR chi2(1)      =      13.46
                                                  Prob > chi2     =     0.0002
Log likelihood = -62.506426                       Pseudo R2       =     0.0972

------------------------------------------------------------------------------
           y |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
           x |   .9441896   .2824403     3.34   0.001     .3906167    1.497762
       _cons |   .1920529   .2174531     0.88   0.377    -.2341474    .6182532
------------------------------------------------------------------------------

#####
#####Compute marginal effects in Stata
#####

#Way 1
Marginal effects after logit
      y  = Pr(y) (predict)
         =  .52354297
------------------------------------------------------------------------------
variable |      dy/dx    Std. Err.     z    P>|z|  [    95% C.I.   ]      X
---------+--------------------------------------------------------------------
       x |   .2355241      .07041    3.35   0.001   .097532  .373516  -.103593
------------------------------------------------------------------------------

#Way 2
Average marginal effects on Prob(y==1) after logit

------------------------------------------------------------------------------
           y |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
           x |   .2057041   .0473328     4.35   0.000     .1129334    .2984747
------------------------------------------------------------------------------


HTH,
Daniel



-------------------------
cuncta stricte discussurus
-------------------------

-----Original Message-----
From: r-help-bounces at r-project.org [mailto:r-help-bounces at r-project.org] On
Behalf Of Roberto Patuelli
Sent: Monday, November 09, 2009 12:04 PM
To: r-help at r-project.org
Subject: [R] Percentage effects in logistic regression

Dear ALL,

I'm trying to figure out how to obtain percentage effects in a logistic
regression. To be clear, I'm not interested in the effect on y of a
1-unit increase in x, but in the percentage effect on y of a 1% increase in
x (in economics this is often called an "elasticity").
For example, if my independent variables are in logs, the betas can be
directly interpreted as percentage effects in both OLS and Poisson
regression. What about logistic regression?
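
My own sketch of the univariate case, which is what makes me doubt that the
logit betas can be read this way (please correct me if I have this wrong):

p          = exp(a + b*log(x)) / (1 + exp(a + b*log(x)))
dp/dx      = p*(1-p)*b/x
elasticity = (dp/dx)*(x/p) = b*(1-p)

so, unlike in OLS or Poisson with logged regressors, the coefficient b is not
itself the elasticity.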

Is there a package (maybe effects?) that can compute these automatically?

Thanks and best regards,
Roberto Patuelli



********************
Roberto Patuelli, Ph.D.
Istituto Ricerche Economiche (IRE) (Institute for Economic Research)
Università della Svizzera Italiana (University of Lugano)
via Maderno 24, CP 4361
CH-6904 Lugano
Switzerland
Phone: +41-(0)58-666-4166
Fax: +39-02-700419665

______________________________________________
R-help at r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.



