I have three log-transformed time series variables, all of which are I(1) and not cointegrated. A lecture slide suggested that I could rethink my model by adding a lagged dependent variable (LDV). Adding the LDV does fix my autocorrelation problem, but the effect of the external regressor drops substantially. I am trying out the following combinations to see how the estimates vary:
(1) Y(t) = X_1(t) + error
(2) Y(t) = X_1(t) + X_1(t-1) + Y(t-1) + error
(3) Y(t) = X_1(t-4) + X_2(t) + error
(4) Y(t) = X_1(t-4) + X_2(t) + Y(t-1) + error
(I used a sequential test to identify the lag order of 4 for the variable X_1.)
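For reference, this is roughly how I fit specifications (3) and (4) with dynlm (just a sketch; the object names modelx1x2 and modelx1x2.ar1 are placeholders, and x2 is the second regressor from the models above):

library(dynlm)

# Equation (3): x1 at lag 4 plus contemporaneous x2
modelx1x2 <- dynlm(y ~ L(x1, 4) + x2)

# Equation (4): same as (3) plus the lagged dependent variable
modelx1x2.ar1 <- dynlm(y ~ L(x1, 4) + x2 + L(y, 1))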
For equation (2) I find that the sign of the coefficient changes, and for equation (4) the coefficients on both variables drop substantially in magnitude compared to equation (3).
I went through this question, which partly answers my question about using an LDV in my model, but I wanted to understand why the sign changes in equation (2).
I have moderate knowledge of statistics, so please bear with me if the question does not make sense.
modelx1 <- dynlm(y ~ x1)   # equation (1): static regression
Estimate Std. Error t value Pr(>|t|)
(Intercept) 1.8868 0.0306 61.6351 0
x1 -0.3724 0.0091 -41.0770 0
modelx1.ar1 <- dynlm(y ~ L(y, 1) + L(x1, 0:1))   # equation (2): LDV plus current and lagged x1
Estimate Std. Error t value Pr(>|t|)
(Intercept) 0.17596 0.15926 1.10489 0.27581
L(y, 1) 0.91166 0.09068 10.05375 0.00000
L(x1, 0:1)0 -0.41712 0.14231 -2.93118 0.00556
L(x1, 0:1)1 0.38155 0.16092 2.37113 0.02264
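To check that the LDV really removes the residual autocorrelation, and to compare the short-run and long-run effects of x1 across models (1) and (2), this is roughly what I run (a sketch only; the Ljung-Box test and the usual long-run multiplier (b0 + b1)/(1 - phi) for an ADL model like equation (2) are standard tools, and lr_x1 is just a placeholder name):

# Ljung-Box test on the residuals of models (1) and (2)
Box.test(residuals(modelx1),     lag = 4, type = "Ljung-Box")
Box.test(residuals(modelx1.ar1), lag = 4, type = "Ljung-Box")

# Long-run effect of x1 implied by equation (2):
# Y(t) = c + phi*Y(t-1) + b0*X_1(t) + b1*X_1(t-1) + error
# long-run multiplier = (b0 + b1) / (1 - phi)
b <- coef(modelx1.ar1)
lr_x1 <- (b["L(x1, 0:1)0"] + b["L(x1, 0:1)1"]) / (1 - b["L(y, 1)"])
lr_x1   # comparable to the x1 coefficient in the static model (1)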

