For consistency of OLS estimator for linear model
$$ y_i = \beta^T x_i + \epsilon_i, \; i = 1,\cdots, n, $$
the model assumptions are usually stated (at least the ones I am familiar with) as:
- The sequence of random vectors $\{ (x_i, \epsilon_i) \}$ is (jointly) strictly stationary and ergodic.
- $\mathbb{E}[x_i \epsilon_i] = 0$ for all $i$.
An LLN then gives consistency. Now suppose one of the regressors, say $(x_i)_1$, equals $i$ itself --- a deterministic linear time trend when $i = t$ indexes a time series. Strict stationarity of $\{ (x_i, \epsilon_i) \}$ then fails, so the argument above no longer applies: is consistency still possible? I suppose one trick is to run the regression on the first differences
$$ y_i - y_{i-1} = \beta^T (x_i - x_{i-1}) + (\epsilon_i - \epsilon_{i-1}), \; i = 2,\cdots, n, $$
and assume the additional orthogonality condition $\mathbb{E}[\epsilon_i x_{i \pm 1}] = 0$. (Note that differencing turns the linear trend into a constant regressor.) Are there standard or more sophisticated ways to deal with this? What about more general time trends --- cyclic, exponential, etc.?
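To make the question concrete, here is a small Monte Carlo sketch (under simplifying assumptions I chose for illustration: a scalar trend model $y_t = \beta_0 + \beta_1 t + \epsilon_t$ with i.i.d. errors and hypothetical values $\beta_0 = 1$, $\beta_1 = 0.5$) comparing OLS in levels with OLS on first differences:

```python
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1 = 1.0, 0.5  # hypothetical true coefficients

for n in (100, 10_000):
    t = np.arange(1, n + 1, dtype=float)
    eps = rng.standard_normal(n)          # i.i.d. errors: a simple special case
    y = beta0 + beta1 * t + eps

    # Levels regression: y_t = b0 + b1 * t + e_t
    X = np.column_stack([np.ones(n), t])
    b_levels = np.linalg.lstsq(X, y, rcond=None)[0]

    # First-differenced regression: dy_t = b1 * dt + de_t, and dt == 1,
    # so OLS on the constant regressor is just the sample mean of dy.
    dy = np.diff(y)
    b1_diff = dy.mean()

    print(f"n={n}: levels b1={b_levels[1]:.4f}, differenced b1={b1_diff:.4f}")
```

In simulations like this, the levels estimate of the trend coefficient concentrates around the true value very quickly as $n$ grows (faster than the usual $\sqrt{n}$ rate), while the differenced estimate also converges but more slowly, since differencing the trend leaves only a constant regressor.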