
Suppose we have a realization $\mathbf{X}=(\mathbf{X}_1,\dots, \mathbf{X}_n)$, where $\mathbf{X}_i \in \mathbb{R}^d$, from a $d$-variate Gaussian process.

Suppose also that $E(\mathbf{X}_i)= \mathbf{0}_d$ and $\mathrm{Cov}(\mathbf{X}_i)= \boldsymbol{\Sigma}$.

If I denote by $C_{ij}$ the block of the covariance matrix of $\mathbf{X}$ that governs the dependence between $\mathbf{X}_i$ and $\mathbf{X}_j$, and I assume that this block depends only on the distance $|i-j|$, what are the necessary conditions for the following to hold? $$ f(\mathbf{X}_i|\mathbf{X}_{i-1},\mathbf{X}_{i-2},\dots,\mathbf{X}_1) = f(\mathbf{X}_i|\mathbf{X}_{i-1}), $$ where $f(\cdot)$ is the normal density, i.e. the process is Markovian.

A reference with a proof is highly appreciated.

Thanks

niandra82

2 Answers


Since the process is Gaussian and zero-mean, the formulas for conditional means and variances of the multivariate normal distribution imply that you can write each $\mathbf{X}_t$ as a linear combination of $\mathbf{X}_{t-1}$, $\mathbf{X}_{t-2}$, ... plus an independent Gaussian error term. But since the process is also Markovian, only the first term has a non-zero coefficient matrix $\boldsymbol{\phi}_1$. Thus, we can write $$ \mathbf{X}_t = \boldsymbol{\phi}_1\mathbf{X}_{t-1}+\mathbf{w}_t. \tag{1} $$

The Markov property also implies that the error terms $\mathbf{w}_1, \mathbf{w}_2, \dots$ must all be independent, and covariance stationarity implies that $\boldsymbol{\phi}_1$ and the variance matrix of $\mathbf{w}_t$ remain constant across time. Hence, the process you describe is a vector AR(1) process. Its autocovariance matrix function is $$ \boldsymbol\Gamma(k)=\mbox{Cov}(\mathbf{X}_{t-k},\mathbf{X}_{t})=\boldsymbol\Gamma(0) (\boldsymbol{\phi}_1')^k $$ for $k\ge 1$; see Wei, 2007, section 16.3.1.

A further restriction is that $\boldsymbol\phi_1$ must have eigenvalues with modulus smaller than 1, otherwise the process is not stationary. Note also that $\boldsymbol\Gamma(-k)=\boldsymbol\Gamma(k)'$ by definition of the autocovariance matrix function, so the autocovariance matrix function cannot depend only on $|i-j|$.
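To make this concrete, here is a small numerical sketch (my own illustration, with an arbitrary choice of $\boldsymbol\phi_1$ and innovation covariance $\Sigma_w$, not values from the question). It checks the stationarity condition on the eigenvalues, computes $\boldsymbol\Gamma(0)$ from the discrete Lyapunov equation $\boldsymbol\Gamma(0)=\boldsymbol\phi_1\boldsymbol\Gamma(0)\boldsymbol\phi_1'+\Sigma_w$ implied by $(1)$, and shows that the lag-1 autocovariance $\boldsymbol\Gamma(1)=\boldsymbol\Gamma(0)\boldsymbol\phi_1'$ is not symmetric in general, which is why the autocovariance cannot depend only on $|i-j|$:

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Arbitrary example coefficient matrix and innovation covariance
phi1 = np.array([[0.5, 0.1],
                 [0.2, 0.3]])
Sigma_w = np.array([[1.0, 0.3],
                    [0.3, 2.0]])

# Stationarity: all eigenvalues of phi1 must lie inside the unit circle
assert np.all(np.abs(np.linalg.eigvals(phi1)) < 1)

# Gamma(0) solves the discrete Lyapunov equation
# Gamma(0) = phi1 Gamma(0) phi1' + Sigma_w, obtained by taking the
# variance of both sides of equation (1)
Gamma0 = solve_discrete_lyapunov(phi1, Sigma_w)

# Gamma(k) = Gamma(0) (phi1')^k for k >= 1 (Wei 2007, sec. 16.3.1)
Gamma1 = Gamma0 @ phi1.T

# Gamma(-1) = Gamma(1)' differs from Gamma(1), since Gamma(1) is not
# symmetric in general, so Cov(X_i, X_j) is not a function of |i-j| alone
print(np.allclose(Gamma1, Gamma1.T))  # False for this phi1
```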

Jarle Tufto

A necessary and sufficient condition for a Gaussian process to also be Markovian is that it has a triangular covariance function, i.e., $$ \mathrm{Cov}[X_{t_{1}}, X_{t_2}] = r_1(\min(t_1, t_2))\,r_2(\max(t_1, t_2)), $$ for unique (up to multiplicative constants) functions $r_1$ and $r_2$. The first full proof of this equivalence was provided in this paper by I. S. Borisov.
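As a quick numerical illustration (my own example, not part of the answer): the stationary Ornstein–Uhlenbeck process, a standard Gaussian Markov process, has covariance $e^{-|t_1-t_2|}$, which factors triangularly with $r_1(u)=e^{u}$ and $r_2(u)=e^{-u}$, since $e^{-|t_1-t_2|}=e^{\min(t_1,t_2)}e^{-\max(t_1,t_2)}$:

```python
import numpy as np

# Arbitrary grid of time points
t = np.array([0.3, 0.7, 1.2, 2.5])
T1, T2 = np.meshgrid(t, t, indexing="ij")

# Stationary Ornstein-Uhlenbeck covariance: exp(-|t1 - t2|)
cov_ou = np.exp(-np.abs(T1 - T2))

# Triangular factorization r1(min(t1,t2)) * r2(max(t1,t2))
r1 = lambda u: np.exp(u)    # increasing factor
r2 = lambda u: np.exp(-u)   # decreasing factor
cov_tri = r1(np.minimum(T1, T2)) * r2(np.maximum(T1, T2))

# The two constructions agree on the whole grid
print(np.allclose(cov_ou, cov_tri))  # True
```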

Aguazz