
I am running a Poisson regression on some data and I have to interpret the marginal effects on the dependent variable when one of the regressors decreases by 45 units.

I understand the marginal effects of a Poisson model are as follows:

$\frac{\partial E(y_{i}|x_{i})}{\partial x_{ji}} = \beta_{j} \cdot E(y_{i}|x_{i})$

This, however, is the mfx for a change of one unit in the regressor. What about 45 units? Would I just have to multiply this computed effect by -45?

The regression I am running includes both the normal regressor and a squared term, too.

Kamyen

1 Answer


There are at least two possible solutions.

The simplest one is to define a new variable $\tilde x = \frac{x}{45}$, scaled so that a one-unit change in $\tilde x$ corresponds to a 45-unit change in $x$. You can fit the model with $\tilde x$ in place of $x$ and calculate the derivative. This is algebraically equivalent to the calculation you have, scaling the derivative by $\Delta x = 45$ (though the standard errors are arguably easier to obtain with the variable transformation).
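To see the algebraic equivalence, here is a minimal numeric sketch (the coefficients `a` and `b` are hypothetical, not from any fitted model): rescaling the regressor by 45 multiplies its coefficient, and hence the derivative, by 45, while leaving the fitted mean unchanged.

```python
import numpy as np

# Hypothetical Poisson coefficients for E[y|x] = exp(a + b*x)
a, b = 0.5, -0.02
x = 30.0

mu = np.exp(a + b * x)   # E[y | x]
dmu_dx = b * mu          # marginal effect of a one-unit change in x

# Rescale: x_tilde = x / 45, so its coefficient becomes 45 * b
b_tilde = 45 * b
mu_tilde = np.exp(a + b_tilde * (x / 45))   # identical fitted mean
dmu_dxtilde = b_tilde * mu_tilde            # effect of a one-unit change in x_tilde

assert np.isclose(mu, mu_tilde)
assert np.isclose(dmu_dxtilde, 45 * dmu_dx)
```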

The second solution is to calculate a finite difference (FD) in predictions, varying $x$. In a Poisson model, the expected value of outcome $y$ given $x$ and $w$ would be $$E[y \vert x,w] = \exp(\alpha + \beta \cdot x +\gamma \cdot w ) = \exp(\alpha) \cdot \exp(\beta \cdot x) \cdot \exp(\gamma \cdot w ).$$

Similarly, $$E[y \vert x'=x+45,w] = \exp(\alpha) \cdot \exp(\beta \cdot x)\cdot \exp(\beta \cdot45) \cdot \exp(\gamma \cdot w ).$$

This means that $$\Delta = E[y \vert x'=x+45,w]- E[y \vert x,w] = E[y\vert x,w] \cdot \left[ \exp(\beta \cdot 45)-1 \right].$$
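This factorization can be checked numerically; a quick sketch with hypothetical coefficients (`a`, `b`, `g` are made up for illustration):

```python
import numpy as np

# Hypothetical coefficients for E[y|x,w] = exp(a + b*x + g*w)
a, b, g = 0.5, -0.02, 0.1
x, w = 30.0, 1.0

mu0 = np.exp(a + b * x + g * w)          # E[y | x, w]
mu1 = np.exp(a + b * (x + 45) + g * w)   # E[y | x + 45, w]

delta = mu1 - mu0
# Delta = E[y|x,w] * (exp(45*b) - 1), regardless of w
assert np.isclose(delta, mu0 * (np.exp(b * 45) - 1))
```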

These two methods will not agree exactly for larger changes: the derivative approximates the effect of an infinitesimal change in $x$, while the finite difference gives the exact change in the prediction. You may recall from calculus that the derivative is the slope of the line tangent to the curve; extrapolating along that tangent line for 45 units ignores the curvature. The FD is preferred here since it is exact.
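The gap between the two is easy to see numerically. A sketch with hypothetical coefficients (again, `a` and `b` are made up): for a 45-unit change, the tangent-line extrapolation and the exact finite difference diverge because $\exp(t) - 1 \ne t$ away from zero.

```python
import numpy as np

a, b = 0.5, -0.02   # hypothetical coefficients
x, dx = 30.0, 45.0

mu = np.exp(a + b * x)                          # E[y | x]
scaled_derivative = b * dx * mu                 # tangent-line extrapolation
finite_difference = mu * (np.exp(b * dx) - 1)   # exact change in E[y]

# exp(t) - 1 >= t with equality only at t = 0, so for b < 0 the
# scaled derivative overshoots the exact finite difference.
print(scaled_derivative, finite_difference)
```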

Here is an example using Stata and the auto dataset, with price as the outcome and a 5 unit increase in mpg:

. sysuse auto, clear
(1978 Automobile Data)

. /* (1) Variable Transformation */
. gen mpg_fiver = mpg/5

. poisson price i.foreign c.mpg_fiver, nolog

Poisson regression                              Number of obs     =         74
                                                LR chi2(2)        =   31019.35
                                                Prob > chi2       =     0.0000
Log likelihood = -28478.503                     Pseudo R2         =     0.3526


------------------------------------------------------------------------------
       price |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
     foreign |
    Foreign  |   .2849739   .0034436    82.75   0.000     .2782245    .2917233
   mpg_fiver |  -.2624522   .0015504  -169.28   0.000    -.2654909   -.2594135
       _cons |   9.723688    .006087  1597.46   0.000     9.711758    9.735618
------------------------------------------------------------------------------


. margins, dydx(mpg_fiver)

Average marginal effects                        Number of obs     =         74
Model VCE    : OIM

Expression   : Predicted number of events, predict()
dy/dx w.r.t. : mpg_fiver


------------------------------------------------------------------------------
             |            Delta-method
             |      dy/dx   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
   mpg_fiver |  -1618.085   9.854164  -164.20   0.000    -1637.399   -1598.771
------------------------------------------------------------------------------


. /* (2) Finite Difference */
. poisson price i.foreign c.mpg, nolog

Poisson regression                              Number of obs     =         74
                                                LR chi2(2)        =   31019.35
                                                Prob > chi2       =     0.0000
Log likelihood = -28478.503                     Pseudo R2         =     0.3526


------------------------------------------------------------------------------
       price |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
     foreign |
    Foreign  |   .2849739   .0034436    82.75   0.000     .2782245    .2917233
         mpg |  -.0524904   .0003101  -169.28   0.000    -.0530982   -.0518827
       _cons |   9.723688    .006087  1597.46   0.000     9.711758    9.735618
------------------------------------------------------------------------------


. margins, at(mpg == generate(mpg)) at(mpg == generate(mpg+5)) contrast(atcontrast(ar._at))

Contrasts of predictive margins                 Number of obs     =         74
Model VCE    : OIM

Expression : Predicted number of events, predict()

1._at : mpg = mpg

2._at : mpg = mpg+5


------------------------------------------------
             |         df        chi2     P>chi2
-------------+----------------------------------
         _at |          1    34626.79     0.0000
------------------------------------------------



--------------------------------------------------------------
             |            Delta-method
             |   Contrast   Std. Err.     [95% Conf. Interval]
-------------+------------------------------------------------
         _at |
    (2 vs 1) |  -1423.169   7.648041     -1438.158   -1408.179
--------------------------------------------------------------


. margins, expression(exp(predict(xb))*(exp(5*_b[mpg])-1)) // same as above, but you have to do the math yourself

Predictive margins                              Number of obs     =         74
Model VCE    : OIM

Expression   : exp(predict(xb))*(exp(5*_b[mpg])-1)


------------------------------------------------------------------------------
             |            Delta-method
             |     Margin   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
       _cons |  -1423.169   7.648041  -186.08   0.000    -1438.158   -1408.179
------------------------------------------------------------------------------


. /* (3) Rescale the derivative, same as (1) */
. margins, expression(exp(predict(xb))*_b[mpg]*5)

Predictive margins                              Number of obs     =         74
Model VCE    : OIM

Expression   : exp(predict(xb))*_b[mpg]*5


------------------------------------------------------------------------------
             |            Delta-method
             |     Margin   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
       _cons |  -1618.085   9.854164  -164.20   0.000    -1637.399   -1598.771
------------------------------------------------------------------------------


Stata calculates the Average Marginal Effects above, which is the average of the derivative or the finite difference over the estimation sample. I think the code is pretty digestible for a non-Stata user, but please ask for clarifications if that's not the case.

As you can see the variable transformation in (1) and the scaled derivative in (3) are the same, but larger than the finite difference because of the approximation/extrapolation error.

The extension to the quadratic specification is pretty straightforward, though the derivative will be slightly more complicated. When $$E[y \vert x,w] = \exp(\alpha + \beta \cdot x + \eta x^2 + \gamma \cdot w ),$$ then, using the chain rule, $$\frac{\partial E[y \vert x,w]}{\partial x} =E[y \vert x,w]\cdot(\beta + 2 \cdot \eta \cdot x) $$
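A quick sanity check of that chain-rule derivative, again with made-up coefficients (`a`, `b`, `eta` are hypothetical) and a central finite difference with a tiny step standing in for the infinitesimal change:

```python
import numpy as np

a, b, eta = 0.5, -0.02, 0.0003   # hypothetical coefficients
x = 30.0

def mean(x):
    # E[y|x] = exp(a + b*x + eta*x^2), suppressing the other covariate
    return np.exp(a + b * x + eta * x**2)

# Chain rule: dE/dx = E[y|x] * (b + 2*eta*x)
analytic = mean(x) * (b + 2 * eta * x)

# Numerical derivative with a tiny step agrees
h = 1e-6
numeric = (mean(x + h) - mean(x - h)) / (2 * h)

assert np.isclose(analytic, numeric)
```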

Here is the Stata version of the AME (canned and by-hand):

. poisson price i.foreign c.mpg##c.mpg, nolog

Poisson regression                              Number of obs     =         74
                                                LR chi2(3)        =   39564.07
                                                Prob > chi2       =     0.0000
Log likelihood = -24206.141                     Pseudo R2         =     0.4497


------------------------------------------------------------------------------
       price |      Coef.   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
     foreign |
    Foreign  |    .282981    .003498    80.90   0.000     .2761251    .2898369
         mpg |  -.1883174   .0014391  -130.85   0.000    -.1911381   -.1854967
             |
 c.mpg#c.mpg |   .0029567   .0000302    98.05   0.000     .0028976    .0030158
             |
       _cons |   11.17249   .0161772   690.63   0.000     11.14078    11.20419
------------------------------------------------------------------------------


. margins, dydx(mpg)

Average marginal effects                        Number of obs     =         74
Model VCE    : OIM

Expression   : Predicted number of events, predict()
dy/dx w.r.t. : mpg


------------------------------------------------------------------------------
             |            Delta-method
             |      dy/dx   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
         mpg |  -431.2165   2.476789  -174.10   0.000    -436.0709   -426.3621
------------------------------------------------------------------------------


. margins, expression(exp(predict(xb))*(_b[mpg] + 2*_b[c.mpg#c.mpg]*mpg))

Predictive margins                              Number of obs     =         74
Model VCE    : OIM

Expression   : exp(predict(xb))*(_b[mpg] + 2*_b[c.mpg#c.mpg]*mpg)


------------------------------------------------------------------------------
             |            Delta-method
             |     Margin   Std. Err.      z    P>|z|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
       _cons |  -431.2165   2.476789  -174.10   0.000    -436.0709   -426.3621
------------------------------------------------------------------------------


The finite difference approach is the same as before:

. margins, at(mpg == generate(mpg)) at(mpg == generate(mpg+5)) contrast(atcontrast(ar._at))

Contrasts of predictive margins                 Number of obs     =         74
Model VCE    : OIM

Expression : Predicted number of events, predict()

1._at : mpg = mpg

2._at : mpg = mpg+5


------------------------------------------------
             |         df        chi2     P>chi2
-------------+----------------------------------
         _at |          1    29391.03     0.0000
------------------------------------------------



--------------------------------------------------------------
             |            Delta-method
             |   Contrast   Std. Err.     [95% Conf. Interval]
-------------+------------------------------------------------
         _at |
    (2 vs 1) |   -1414.86   8.252891     -1431.035   -1398.685
--------------------------------------------------------------


The difference is now larger since the quadratic spec here is "curvier," making the approximation much worse.

dimitriy
  • Sorry for the late answer, but this was perfect. Thank you so much! Ended up using the second approach for simplicity and given that in the context it was required, it was more fitting. – Kamyen Dec 28 '20 at 01:00