I'm experimenting with some data in R and I've found that although the relationship between two variables is statistically significant in levels, their changes are much less significant.
I first ran a standard regression of revenue on price, adding a quadratic term to account for diminishing returns from price increases. This gives the formula:
$$y_{Revenue}=\beta_0+\beta_1Price+\beta_2Price^2$$
The results produced are:
> summary(lm(Revenue ~ Price + I(Price^2)))
Call:
lm(formula = Revenue ~ Price + I(Price^2))
Residuals:
Min 1Q Median 3Q Max
-131.87 -87.77 -27.60 44.15 244.66
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) -1.650e+03 2.645e+02 -6.238 5.44e-06 ***
Price 3.640e-01 3.640e-02 9.999 5.28e-09 ***
I(Price^2) -1.026e-05 1.129e-06 -9.086 2.41e-08 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error: 116.9 on 19 degrees of freedom
(7 observations deleted due to missingness)
Multiple R-squared: 0.8816, Adjusted R-squared: 0.8691
F-statistic: 70.72 on 2 and 19 DF, p-value: 1.577e-09
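As a side note on constructing the differenced regressors used below: `diff(Price^2)` is the first difference of the squared series, which is not the same as squaring the differenced series. A minimal sketch with hypothetical prices (the numbers are made up, not from my data):

```r
# Hypothetical price series to illustrate differencing a quadratic term
price <- c(100, 150, 210, 260, 300)

# First difference of the squared series: Delta(Price^2)
diff(price^2)
# 12500 21600 23500 22400

# Square of the differenced series: (Delta Price)^2 -- a different quantity
diff(price)^2
# 2500 3600 2500 1600
```

The second regression below uses the former, so its quadratic term is Δ(Price²), not (ΔPrice)².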
The second regression I ran regresses the change in revenue on the change in price and the change in squared price, giving the formula: $$\Delta y_{Revenue}=\alpha_0+\alpha_1 \Delta Price+\alpha_2 \Delta (Price^2)$$
> summary(lm(diff(Revenue) ~ diff(Price) + diff(I(Price^2))))
Call:
lm(formula = diff(Revenue) ~ diff(Price) + diff(I(Price^2)))
Residuals:
Min 1Q Median 3Q Max
-82.52 -42.55 -11.98 19.20 142.36
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) 5.093e+01 2.649e+01 1.923 0.07046 .
diff(Price) 1.343e-01 7.165e-02 1.874 0.07727 .
diff(I(Price^2)) -4.987e-06 1.691e-06 -2.950 0.00857 **
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Residual standard error: 62.29 on 18 degrees of freedom
(7 observations deleted due to missingness)
Multiple R-squared: 0.4521, Adjusted R-squared: 0.3912
F-statistic: 7.426 on 2 and 18 DF, p-value: 0.004449
Why do these variables lose their statistical significance in the differenced regression when they are significant at the 1% level in the regular regression, and how should such results be interpreted economically?
