
I have a question regarding time fixed effects and how they are defined in empirical papers.

Authors often talk about (1) estimating an OLS regression and employing time and country fixed effects. What I don't understand is that sometimes the same authors, in the same paper, run another regression and say they implement a (2) fixed effects model and include year fixed effects.

Can someone please explain the difference between the two specifications? Is (1) just a linear regression estimated by OLS with time dummy variables (e.g., using the lm function in R) to capture variation specific to each year? If so, are they calling those time dummy variables "fixed effects", or are they using a fixed effects model as you would code in R with the plm package?

Here is an example to clarify what I mean: see Hibbeln (2020), "Simple is Simply not Enough – Features versus Labels of Complex Financial Securities": https://deliverypdf.ssrn.com/delivery.php?ID=668089000120105019080011005080098106020020059065037078000115088006104097006096109071022118037001014005040068007075003124112077052021093009085106072094127106109067069026001016083102112105027067085026119088074066127095086014078029071080118011106003021118&EXT=pdf&INDEX=TRUE

P. 14–15: "we run the following pooled OLS regression (...) and other macroeconomic factors, we implement trading day fixed effects. Additionally, we control for unobservable differences in originator characteristics using originator fixed effects. With these fixed effects, we also control for country fixed effects."

P. 18: "As we are interested in the within-tranche effect of receiving the STS labels, we implement a fixed effects model on tranche-level (Equation 2). By including tranche fixed effects λ_i, we control for all time-constant tranche-specific characteristics in general and for the security design features in particular."

If an author describes a linear regression (OLS) and then uses fixed effects as above, is he estimating a fixed effects model, or an OLS regression with time dummies in which he simply calls those dummies "fixed effects"?

Other authors use the term "OLS regression and time dummies", while others say "OLS regression and fixed effects"; that is what makes it difficult for me to understand what is actually being done.

  • Please do not simply delete and repost a closed question. Instead, please follow our guidance to edit and improve the closed post. Any edit will automatically put it up for a vote on reopening, and if enough voters agree that the post now is acceptable, it will be reopened. – Stephan Kolassa Nov 07 '23 at 15:46
  • I'm not sure why you are trying to make a distinction between fixed effects and categorical dummy variables; usually 'fixed' means 'not random', i.e. they are 'interesting in themselves' (specifically, marginal estimates or predictions at specific values of those effects). See also here and here for more explanation on fixed vs. random. – PBulls Nov 07 '23 at 19:54
  • @PBulls I'm examining the effect of complex asset-backed securities deals on the credit spread demanded by investors. For my thesis I've collected a few papers with a research question similar to mine. That's why I want to understand the distinction the authors seem to make, so I can follow what each author is doing with the regression and apply it to my own regression to see if I get similar results. Because of the issues described above, I have trouble identifying whether a linear regression with time dummy variables or a fixed effects model is used. – VivaFalastin Nov 07 '23 at 21:18
  • @StephanKolassa Sorry about that, I thought the question had been deleted. – VivaFalastin Nov 07 '23 at 21:18

1 Answer


This is an economics-flavored answer, which is appropriate given your research question. Other fields have their own jargon that is fairly different.

There are typically two ways to estimate fixed effects:

  1. add dummies to the regression
  2. wipe them out with a transformation like demeaning or first-differencing.

You can also do both. For example, with large-N, small-T panels (aka short panels), people frequently wipe out the entity effects with the demeaning transformation but still include time dummies. Your setting sounds more like a long panel (relatively small N, large T), where you may have to take a more time-series approach.
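Since your question mentions R, here is a minimal base-R sketch of why (1) and (2) are interchangeable for the slope estimates (a consequence of the Frisch–Waugh–Lovell theorem). The data here are simulated, and all object names (d, b_dummies, b_within) are just illustrative:

# Toy panel: 4 entities observed over 5 periods (simulated data)
set.seed(1)
d <- data.frame(id = rep(1:4, each = 5), t = rep(1:5, times = 4))
d$x <- rnorm(20)
d$y <- 2 * d$x + rep(rnorm(4), each = 5) + rnorm(20)

# (1) Entity fixed effects entered as dummy variables in plain OLS
b_dummies <- coef(lm(y ~ x + factor(id), data = d))["x"]

# (2) Entity fixed effects wiped out by demeaning, then plain OLS
d$y_dm <- d$y - ave(d$y, d$id)   # subtract entity means of y
d$x_dm <- d$x - ave(d$x, d$id)   # subtract entity means of x
b_within <- coef(lm(y_dm ~ x_dm, data = d))["x_dm"]

b_dummies  # identical point estimate for x...
b_within   # ...from both approaches

The point estimates are identical; the naive standard errors from the demeaned regression differ slightly because it ignores the degrees of freedom absorbed by the entity means, which is one reason dedicated panel estimators exist.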

Here's an example in Stata of (1) and (2), with (2) done by demeaning, but without covariates. Note that the point estimates on the week dummies are identical across the two outputs; only the bookkeeping for the entity effects differs:

. webuse pig, clear
(Longitudinal analysis of pig weights)

. xtset id week

Panel variable: id (strongly balanced)
 Time variable: week, 1 to 9
         Delta: 1 unit

. xtreg weight i.week, fe vce(cluster id)

Fixed-effects (within) regression               Number of obs     =        432
Group variable: id                              Number of groups  =         48

R-squared:                                      Obs per group:
     Within  = 0.9857                                         min =          9
     Between = 0.0000                                         avg =        9.0
     Overall = 0.9311                                         max =          9

                                                F(8,47)           =     755.48
corr(u_i, Xb) = 0.0000                          Prob > F          =     0.0000

                                (Std. err. adjusted for 48 clusters in id)
------------------------------------------------------------------------------
             |               Robust
      weight | Coefficient  std. err.      t    P>|t|     [95% conf. interval]
-------------+----------------------------------------------------------------
        week |
          2  |   6.760417   .1639224    41.24   0.000     6.430647    7.090186
          3  |   13.84375   .3134776    44.16   0.000     13.21311    14.47439
          4  |     19.375   .3375316    57.40   0.000     18.69597    20.05403
          5  |   25.13542   .4579635    54.89   0.000     24.21411    26.05672
          6  |   31.42708   .4699493    66.87   0.000     30.48167     32.3725
          7  |    37.4375   .5593798    66.93   0.000     36.31217    38.56283
          8  |   44.28125   .6311341    70.16   0.000     43.01157    45.55093
          9  |   50.19792   .7815318    64.23   0.000     48.62568    51.77016
             |
       _cons |   25.02083    .374458    66.82   0.000     24.26752    25.77415
-------------+----------------------------------------------------------------
     sigma_u |  3.9534984
     sigma_e |  2.0729165
         rho |  .78436522   (fraction of variance due to u_i)
------------------------------------------------------------------------------


. regress weight i.week i.id, vce(cluster id)

Linear regression                               Number of obs     =        432
                                                F(7, 47)          =          .
                                                Prob > F          =          .
                                                R-squared         =     0.9865
                                                Root MSE          =     2.0729

                                (Std. err. adjusted for 48 clusters in id)
------------------------------------------------------------------------------
             |               Robust
      weight | Coefficient  std. err.      t    P>|t|     [95% conf. interval]
-------------+----------------------------------------------------------------
        week |
          2  |   6.760417    .173866    38.88   0.000     6.410643     7.11019
          3  |   13.84375   .3324932    41.64   0.000     13.17486    14.51264
          4  |     19.375   .3580064    54.12   0.000     18.65478    20.09522
          5  |   25.13542   .4857437    51.75   0.000     24.15823    26.11261
          6  |   31.42708   .4984565    63.05   0.000     30.42432    32.42985
          7  |    37.4375   .5933118    63.10   0.000     36.24391    38.63109
          8  |   44.28125   .6694188    66.15   0.000     42.93455    45.62795
          9  |   50.19792   .8289396    60.56   0.000     48.53031    51.86553
             |
          id |
          2  |   2.666667   1.44e-13  1.9e+13   0.000     2.666667    2.666667
          3  |  -.2777778   1.44e-13 -1.9e+12   0.000    -.2777778   -.2777778
          4  |  -.1111111   1.44e-13 -7.7e+11   0.000    -.1111111   -.1111111
          5  |  -1.555556   1.44e-13 -1.1e+13   0.000    -1.555556   -1.555556
          6  |  -2.166667   1.44e-13 -1.5e+13   0.000    -2.166667   -2.166667
          7  |  -.7222222   1.44e-13 -5.0e+12   0.000    -.7222222   -.7222222
          8  |  -.2777778   1.44e-13 -1.9e+12   0.000    -.2777778   -.2777778
          9  |  -5.222222   1.44e-13 -3.6e+13   0.000    -5.222222   -5.222222
         10  |   2.944444   1.44e-13  2.1e+13   0.000     2.944444    2.944444
         11  |   2.277778   1.44e-13  1.6e+13   0.000     2.277778    2.277778
         12  |   .9444444   1.44e-13  6.6e+12   0.000     .9444444    .9444444
         13  |  -.6666667   1.44e-13 -4.6e+12   0.000    -.6666667   -.6666667
         14  |  -1.666667   1.44e-13 -1.2e+13   0.000    -1.666667   -1.666667
         15  |   4.333333   1.44e-13  3.0e+13   0.000     4.333333    4.333333
         16  |  -2.611111   1.44e-13 -1.8e+13   0.000    -2.611111   -2.611111
         17  |          9   1.44e-13  6.3e+13   0.000            9           9
         18  |  -1.777778   1.44e-13 -1.2e+13   0.000    -1.777778   -1.777778
         19  |   7.666667   1.44e-13  5.3e+13   0.000     7.666667    7.666667
         20  |   2.222222   1.44e-13  1.5e+13   0.000     2.222222    2.222222
         21  |   .3888889   1.44e-13  2.7e+12   0.000     .3888889    .3888889
         22  |   4.611111   1.44e-13  3.2e+13   0.000     4.611111    4.611111
         23  |   6.722222   1.44e-13  4.7e+13   0.000     6.722222    6.722222
         24  |          3   1.44e-13  2.1e+13   0.000            3           3
         25  |  -4.277778   1.44e-13 -3.0e+13   0.000    -4.277778   -4.277778
         26  |         -4   1.44e-13 -2.8e+13   0.000           -4          -4
         27  |  -3.111111   1.44e-13 -2.2e+13   0.000    -3.111111   -3.111111
         28  |  -.7222222   1.44e-13 -5.0e+12   0.000    -.7222222   -.7222222
         29  |   7.333333   1.44e-13  5.1e+13   0.000     7.333333    7.333333
         30  |  -5.666667   1.44e-13 -3.9e+13   0.000    -5.666667   -5.666667
         31  |   1.055556   1.44e-13  7.4e+12   0.000     1.055556    1.055556
         32  |   5.388889   1.44e-13  3.8e+13   0.000     5.388889    5.388889
         33  |   1.666667   1.44e-13  1.2e+13   0.000     1.666667    1.666667
         34  |   5.111111   1.44e-13  3.6e+13   0.000     5.111111    5.111111
         35  |   2.722222   1.44e-13  1.9e+13   0.000     2.722222    2.722222
         36  |   1.555556   1.44e-13  1.1e+13   0.000     1.555556    1.555556
         37  |   1.777778   1.44e-13  1.2e+13   0.000     1.777778    1.777778
         38  |   5.611111   1.44e-13  3.9e+13   0.000     5.611111    5.611111
         39  |   2.944444   1.44e-13  2.1e+13   0.000     2.944444    2.944444
         40  |   1.111111   1.44e-13  7.7e+12   0.000     1.111111    1.111111
         41  |  -3.722222   1.44e-13 -2.6e+13   0.000    -3.722222   -3.722222
         42  |   1.333333   1.44e-13  9.3e+12   0.000     1.333333    1.333333
         43  |   .2222222   1.44e-13  1.5e+12   0.000     .2222222    .2222222
         44  |   4.777778   1.44e-13  3.3e+13   0.000     4.777778    4.777778
         45  |   8.888889   1.44e-13  6.2e+13   0.000     8.888889    8.888889
         46  |   4.555556   1.44e-13  3.2e+13   0.000     4.555556    4.555556
         47  |   11.11111   1.44e-13  7.7e+13   0.000     11.11111    11.11111
         48  |   8.055556   1.44e-13  5.6e+13   0.000     8.055556    8.055556
             |
       _cons |   23.28241   .3971727    58.62   0.000      22.4834    24.08142
------------------------------------------------------------------------------

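If you want the package route in R that your question asks about, here is a rough analogue of the same comparison, sketched with the plm package on its bundled Grunfeld data rather than the pig data above; treat it as illustrative, not a template for your exact specification:

# install.packages("plm")  # if not already installed
library(plm)
data("Grunfeld", package = "plm")

# (1) Pooled OLS via lm(), with firm and year dummies spelled out
m_ols <- lm(inv ~ value + capital + factor(firm) + factor(year),
            data = Grunfeld)

# (2) "Fixed effects model": firm effects removed by demeaning (the
#     within estimator), year effects still entered as dummies
m_fe <- plm(inv ~ value + capital + factor(year),
            data = Grunfeld, index = c("firm", "year"), model = "within")

# The slope estimates coincide; only the bookkeeping differs
coef(m_ols)[c("value", "capital")]
coef(m_fe)[c("value", "capital")]

This is the same pattern as in the Stata output: the "fixed effects model" and the dummy-variable OLS regression are the same estimator for the slopes, and authors simply describe the implementation differently.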

dimitriy
  • Hello dimitriy, thanks for your reply! I will consider your points. On my dataset: I have 500 tranches (3–7 tranches belong to one deal) over the last 10 years, where each tranche of a deal is observed only once, at the time the deal is launched. From my understanding, because each tranche is observed only once and its data is not collected again in a later period, my dataset is pooled cross-sectional data. So I'd say that I have a large N (>500 observations) and a large T (10 years). Would your approach still work here? I have also read that an FE model isn't possible with my dataset. – VivaFalastin Nov 08 '23 at 08:39
  • This is not my field and I don't know your research question, so it's hard for me to say; that should be a separate question. There are some new techniques for high-dimensional fixed effects that you can use, which are variants of (2). Just search for that phrase. – dimitriy Nov 08 '23 at 15:28