
In this problem we consider a model of stochastic growth. In particular, consider the following system of SDEs:

\begin{align} dX_t &= Y_t dt + \sigma_X dZ_{1t}\\ dY_t &= -\lambda Y_t dt + \sigma_Y \rho dZ_{1t} + \sigma_Y\sqrt{1-\rho^2}dZ_{2t}\\ X_0 &= 0\\ Y_0 &= 0 \end{align}

where $Z_{1t}$ and $Z_{2t}$ are independent Brownian motions.

Compute $E_t[(X_{t+T}-X_t)]$ and $E_t[(X_{t+T} - X_t)^2]$ as functions of $T$ and $Y_t$. To illustrate the difference between short-run and long-run risk, compute:
$$\lim_{T \to 0}\frac{E_t[(X_{t+T} - X_t)^2]}{T} \quad \text{and} \quad \lim_{T \to \infty}\frac{E_t[(X_{t+T} - X_t)^2]}{T}$$

user48018

2 Answers

  1. A linear combination $\rho Z_{1t} + \sqrt{1-\rho^2} Z_{2t}$ of two independent Brownian motions, with coefficients whose squares sum to one, is again a standard Brownian motion (this can be proved formally via Lévy's characterisation).

  2. First we solve for $Y_t$. Notice that $d\bigl( Y_t e^{\lambda t} \bigr) = e^{\lambda t} dY_t + \lambda Y_t e^{\lambda t} dt$.

\begin{align} dY_t &= -\lambda Y_t dt + \sigma_Y \rho dZ_{1t} + \sigma_Y \sqrt{1 - \rho^2} dZ_{2t} \\ &= -\lambda Y_t dt + \sigma_Y dW_t \\ dY_t + \lambda Y_t dt &= \sigma_Y dW_t \\ d\bigl( Y_t e^{\lambda t} \bigr) &= e^{\lambda t} \sigma_Y dW_t \\ \Bigl[ Y_t e^{\lambda t} \Bigr]^T_0 &= \sigma_Y \int^T_0 e^{\lambda t} dW_t \\ Y_T &= \sigma_Y e^{-\lambda T}\int^T_0 e^{\lambda t} dW_t \end{align}

where I've defined $W_t = \rho Z_{1t} + \sqrt{1 - \rho^2} Z_{2t}$, which by point 1 is a standard Brownian motion. $Y_T$ is therefore Gaussian with mean 0 and variance ${\frac {\sigma^2_Y} {2 \lambda}} \bigl( 1 - e^{-2\lambda T} \bigr)$, by the Itô isometry.
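This variance can be checked symbolically. A quick sketch (assuming sympy is available) that evaluates the Itô-isometry integral:

```python
import sympy as sp

t, T, lam, sig_y = sp.symbols('t T lambda sigma_Y', positive=True)

# Ito isometry: Var(Y_T) = sigma_Y^2 * e^{-2*lam*T} * Int_0^T e^{2*lam*t} dt
var_Y_T = sig_y**2 * sp.exp(-2*lam*T) * sp.integrate(sp.exp(2*lam*t), (t, 0, T))

# Compare with the closed form sigma_Y^2/(2*lam) * (1 - e^{-2*lam*T})
closed_form = sig_y**2 / (2*lam) * (1 - sp.exp(-2*lam*T))
assert sp.simplify(var_Y_T - closed_form) == 0
```

The same check also confirms the stationary variance $\sigma_Y^2 / (2\lambda)$ as $T \to \infty$.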

Now plugging this in and solving for $X_t$:

\begin{align} dX_t &= Y_t dt + \sigma_X dZ_{1t} \\ &= \bigl( \sigma_Y e^{-\lambda t}\int^t_0 e^{\lambda s} dW_s \bigr) dt + \sigma_X dZ_{1t} \\ \Bigl[ X_t \Bigr]^T_0 &= \sigma_Y \int^T_0 e^{-\lambda t} \bigl( \int^t_0 e^{\lambda s} dW_s \bigr) dt + \sigma_X Z_{1T} \end{align}

We can solve $\int^T_0 e^{-\lambda t} \bigl( \int^t_0 e^{\lambda s} dW_s \bigr) dt$ using stochastic integration by parts: taking $A_t = \int^t_0 e^{-\lambda s} ds$ and $B_t = \int^t_0 e^{\lambda s} dW_s$ gives

\begin{align} \Bigl[ A_t \cdot B_t \Bigr]^T_0 &= \int^T_0 e^{-\lambda t} \bigl( \int^t_0 e^{\lambda s} dW_s \bigr) dt + \int^T_0 \bigl( \int^t_0 e^{-\lambda s} ds \bigr) e^{\lambda t} dW_t \\ \int^T_0 e^{-\lambda t} \bigl( \int^t_0 e^{\lambda s} dW_s \bigr) dt &= -\int^T_0 \bigl( \int^t_0 e^{-\lambda s} ds \bigr) e^{\lambda t} dW_t + \bigl( \int^T_0 e^{-\lambda s} ds \bigr) \cdot \bigl( \int^T_0 e^{\lambda t} dW_t \bigr) \\ &= -{\frac {1} \lambda}\int^T_0 (e^{\lambda t} - 1) dW_t + {\frac {1} \lambda} (1 - e^{-\lambda T}) \int^T_0 e^{\lambda t} dW_t \\ &= -{\frac {1} \lambda}\int^T_0 \Bigl[ (e^{\lambda t} - 1) - e^{\lambda t}(1 - e^{-\lambda T}) \Bigr] dW_t\\ &= {\frac {1} \lambda}\int^T_0 \bigl(1 - e^{-\lambda (T-t)} \bigr) dW_t \end{align}
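The last two lines are pure algebra on the $dW_t$ integrand; a small sympy sketch (sympy assumed available) confirming the simplification:

```python
import sympy as sp

t, T, lam = sp.symbols('t T lambda', positive=True)

# Integrand of the combined dW integral before simplification
combined = -((sp.exp(lam*t) - 1) - sp.exp(lam*t)*(1 - sp.exp(-lam*T))) / lam

# It should collapse to (1 - e^{-lam*(T-t)}) / lam
assert sp.simplify(combined - (1 - sp.exp(-lam*(T - t)))/lam) == 0
```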

And substituting this in above, we have \begin{align} X_T &= {\frac {\sigma_Y} \lambda}\int^T_0 \bigl(1 - e^{-\lambda (T-t)} \bigr) dW_t + \sigma_X Z_{1T} \end{align}
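As a numerical sanity check on this representation, one can simulate the original system with Euler-Maruyama and compare the sample variance of $X_T$ against the variance implied by the formula, via the Itô isometry and $d[W, Z_1]_t = \rho\, dt$. A sketch with illustrative parameter values, numpy assumed available:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, sig_x, sig_y, rho, T = 1.0, 0.2, 0.3, 0.5, 2.0
n_steps, n_paths = 400, 50_000
dt = T / n_steps

# Euler-Maruyama simulation of the joint system from X_0 = Y_0 = 0
X = np.zeros(n_paths)
Y = np.zeros(n_paths)
for _ in range(n_steps):
    dZ1 = rng.normal(0.0, np.sqrt(dt), n_paths)
    dZ2 = rng.normal(0.0, np.sqrt(dt), n_paths)
    X += Y * dt + sig_x * dZ1
    Y += -lam * Y * dt + sig_y * (rho * dZ1 + np.sqrt(1 - rho**2) * dZ2)

# Variance implied by X_T = (sig_y/lam) Int (1 - e^{-lam(T-t)}) dW_t + sig_x Z_{1T},
# with kernel w(t) = (1 - e^{-lam(T-t)})/lam; both kernel integrals are closed-form
I1 = (T - (1 - np.exp(-lam * T)) / lam) / lam                # Int_0^T w(t) dt
I2 = (T - 2 * (1 - np.exp(-lam * T)) / lam
      + (1 - np.exp(-2 * lam * T)) / (2 * lam)) / lam**2     # Int_0^T w(t)^2 dt
var_closed = sig_y**2 * I2 + sig_x**2 * T + 2 * rho * sig_x * sig_y * I1

print(X.var(), var_closed)  # should agree up to Monte Carlo / discretisation error
```

The cross term uses the quadratic covariation $d[W, Z_1]_t = \rho\, dt$ between the two stochastic integrals.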

This is the sum of two jointly Gaussian random variables, so $X_T$ is also Gaussian, as required.

  3. From above, we have

\begin{align} \Bigl[ Y_s e^{\lambda s} \Bigr]^{T+t}_t &= \sigma_Y \int^{T+t}_t e^{\lambda s} dW_s \\ Y_{T+t} &= e^{-\lambda T} Y_t + e^{-\lambda (T+t)} \sigma_Y \int^{T+t}_t e^{\lambda s} dW_s \end{align}

Conditioning on $Y_t$, we can now compute the moments of $X_{T+t} - X_t$ as above.

\begin{align} {\mathbb E}_t\bigl[ X_{T+t} - X_t \bigr] &= {\mathbb E}_t\bigl[ \int_t^{T+t} dX_s \bigr] \\ &= {\mathbb E}_t\bigl[\int^{T+t}_t Y_s ds + \int^{T+t}_t \sigma_X dZ_{1s} \bigr]\\ &= {\mathbb E}_t\bigl[\int^{T+t}_t Y_s ds\bigr]\\ &= {\mathbb E}_t\bigl[\int^{T}_0 Y_{u+t} du\bigr] \\ &= {\mathbb E}_t\bigl[\int^{T}_0 \Bigl( e^{-\lambda u} Y_t + e^{-\lambda (u+t)} \sigma_Y \int^{u+t}_t e^{\lambda s} dW_s \Bigr) du \bigr] \\ &= {\mathbb E}_t\bigl[\int^{T}_0 e^{-\lambda u} Y_t du \bigr]\\ &= {\frac 1 {\lambda}} Y_t \bigl( 1 - e^{-\lambda T} \bigr) \end{align}

(where I've changed variables from $s$ to $u = s - t$), which makes sense: $Y$ is mean-reverting, so we expect future values to be closer to zero than the current value.
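A Monte Carlo sketch of this conditional mean (numpy assumed; the parameter values are illustrative, and we set $t = 0$ with a nonzero starting value standing in for $Y_t$):

```python
import numpy as np

rng = np.random.default_rng(1)
lam, sig_x, sig_y, rho, T, y0 = 1.0, 0.2, 0.3, 0.5, 2.0, 0.7
n_steps, n_paths = 400, 50_000
dt = T / n_steps

# Simulate the increment X_{t+T} - X_t conditional on Y_t = y0 (take t = 0 WLOG)
dX = np.zeros(n_paths)
Y = np.full(n_paths, y0)
for _ in range(n_steps):
    dZ1 = rng.normal(0.0, np.sqrt(dt), n_paths)
    dZ2 = rng.normal(0.0, np.sqrt(dt), n_paths)
    dX += Y * dt + sig_x * dZ1
    Y += -lam * Y * dt + sig_y * (rho * dZ1 + np.sqrt(1 - rho**2) * dZ2)

# Closed-form conditional mean: Y_t (1 - e^{-lam T}) / lam
mean_closed = y0 / lam * (1 - np.exp(-lam * T))
print(dX.mean(), mean_closed)  # should agree up to Monte Carlo error
```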

\begin{align} {\mathbb E}_t\bigl[ (X_{T+t} - X_t)^2 \bigr] &= {\mathbb E}_t\Bigl[ \Bigl( \int_t^{T+t} dX_s \Bigr)^2 \Bigr] \\ &= {\mathbb E}_t\Bigl[\Bigl(\int^{T+t}_t Y_s ds + \int^{T+t}_t \sigma_X dZ_{1s} \Bigr)^2 \Bigr]\\ &= {\mathbb E}_t\Bigl[\Bigl( \int^{T+t}_t Y_s ds \Bigr)^2\Bigr] + \sigma_X^2 T + 2\, {\mathbb E}_t\Bigl[ \int^{T+t}_t Y_s ds \int^{T+t}_t \sigma_X dZ_{1s} \Bigr] \end{align}

The first term is the square of the conditional mean plus the conditional variance. Integrating the conditional covariance $\mathrm{Cov}_t(Y_s, Y_r) = {\frac {\sigma_Y^2} {2\lambda}} \bigl( e^{-\lambda |s-r|} - e^{-\lambda (s+r-2t)} \bigr)$ over $s, r \in [t, T+t]$ gives

\begin{align} {\mathbb E}_t\Bigl[\Bigl( \int^{T+t}_t Y_s ds \Bigr)^2\Bigr] &= {\frac {Y_t^2} {\lambda^2}} \bigl( 1 - e^{-\lambda T} \bigr)^2 + {\frac {\sigma_Y^2} {\lambda^2}} \Bigl( T - {\frac {2 \bigl( 1 - e^{-\lambda T} \bigr)} {\lambda}} + {\frac {1 - e^{-2\lambda T}} {2\lambda}} \Bigr) \end{align}

For clarity I break out the cross term separately. Substituting $Y_{u+t} = e^{-\lambda u} Y_t + e^{-\lambda (u+t)} \sigma_Y \int^{u+t}_t e^{\lambda s} dW_s$ and noting that the $Y_t$ term is uncorrelated with the future increments of $Z_1$: \begin{align} {\mathbb E}_t\Bigl[ \int^{T+t}_t Y_s ds \int^{T+t}_t \sigma_X dZ_{1s} \Bigr] &= \sigma_X \int^T_0 {\mathbb E}_t\Bigl[ Y_{u+t} \bigl( Z_{1,T+t} - Z_{1t} \bigr) \Bigr] du \\ &= \sigma_X \sigma_Y \int^T_0 e^{-\lambda (u+t)}\, {\mathbb E}_t\Bigl[ \int^{u+t}_t e^{\lambda s} dW_s \int^{T+t}_t dZ_{1v} \Bigr] du \\ &= \sigma_X \sigma_Y \rho \int^T_0 e^{-\lambda (u+t)} \int^{u+t}_t e^{\lambda s} ds\, du \\ &= {\frac {\rho \sigma_X \sigma_Y} {\lambda}} \int^T_0 \bigl( 1 - e^{-\lambda u} \bigr) du \\ &= {\frac {\rho \sigma_X \sigma_Y} {\lambda}} \Bigl( T - {\frac {1 - e^{-\lambda T}} {\lambda}} \Bigr) \end{align} where the middle step uses $d[W, Z_1]_s = \rho\, ds$ and the fact that $u + t \leq T + t$.

and plugging these back in to the block above we have \begin{align} {\mathbb E}_t\bigl[ (X_{T+t} - X_t)^2 \bigr] &= {\frac {Y_t^2} {\lambda^2}} \bigl( 1 - e^{-\lambda T} \bigr)^2 + {\frac {\sigma_Y^2} {\lambda^2}} \Bigl( T - {\frac {2 \bigl( 1 - e^{-\lambda T} \bigr)} {\lambda}} + {\frac {1 - e^{-2\lambda T}} {2\lambda}} \Bigr) + \sigma_X^2 T + {\frac {2 \rho \sigma_X \sigma_Y} {\lambda}} \Bigl( T - {\frac {1 - e^{-\lambda T}} {\lambda}} \Bigr) \end{align}

Thinking about the behaviour of this process as $T \to \infty$: the $Y_t^2$ term stays bounded while the terms proportional to $T$ dominate, so \begin{align} \lim_{T \to \infty} {\frac 1 T} {\mathbb E}_t\bigl[ (X_{T+t} - X_t)^2 \bigr] &= \sigma_X^2 + {\frac {2 \rho \sigma_X \sigma_Y} {\lambda}} + {\frac {\sigma_Y^2} {\lambda^2}} \end{align} In the long run the persistent component $Y$ contributes an extra $\sigma_Y^2 / \lambda^2$ of variance per unit time on top of the direct diffusion variance $\sigma_X^2$ (plus a covariance term): this is the long-run risk.

As $T \to 0$, using $1 - e^{-\lambda T} = \lambda T - {\frac {\lambda^2 T^2} 2} + O(T^3)$, the $\lambda$s cancel out and the expression becomes \begin{align} {\frac 1 T} {\mathbb E}_t\bigl[ (X_{T+t} - X_t)^2 \bigr] &= \sigma_X^2 + \bigl( Y_t^2 + \rho \sigma_X \sigma_Y \bigr) T + O(T^2) \end{align} so $\lim_{T \to 0} {\frac 1 T} {\mathbb E}_t\bigl[ (X_{T+t} - X_t)^2 \bigr] = \sigma_X^2$.

so in the short run only the direct shock $\sigma_X dZ_{1t}$ matters, and the first-order correction in $T$ increases with the level of $Y_t^2$ and with positive correlation between the two shocks.
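The second moment can be checked the same way by simulation. Note the closed form used below takes the cross term with a positive sign and includes the conditional variance of $\int_t^{T+t} Y_s\, ds$; parameters are illustrative, numpy assumed available:

```python
import numpy as np

rng = np.random.default_rng(2)
lam, sig_x, sig_y, rho, T, y0 = 1.0, 0.2, 0.3, 0.5, 2.0, 0.7
n_steps, n_paths = 400, 50_000
dt = T / n_steps

# Simulate the increment X_{t+T} - X_t conditional on Y_t = y0 (take t = 0 WLOG)
dX = np.zeros(n_paths)
Y = np.full(n_paths, y0)
for _ in range(n_steps):
    dZ1 = rng.normal(0.0, np.sqrt(dt), n_paths)
    dZ2 = rng.normal(0.0, np.sqrt(dt), n_paths)
    dX += Y * dt + sig_x * dZ1
    Y += -lam * Y * dt + sig_y * (rho * dZ1 + np.sqrt(1 - rho**2) * dZ2)

# Closed-form conditional second moment: squared mean + Var(Int Y ds)
# + diffusion variance + cross term (entering with a positive sign)
e = np.exp(-lam * T)
mean_sq = (y0 / lam * (1 - e))**2
var_int_Y = sig_y**2 / lam**2 * (T - 2*(1 - e)/lam + (1 - np.exp(-2*lam*T))/(2*lam))
cross = 2 * rho * sig_x * sig_y / lam * (T - (1 - e)/lam)
second_moment = mean_sq + var_int_Y + sig_x**2 * T + cross

print((dX**2).mean(), second_moment)  # should agree up to Monte Carlo error
```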

Wow what a question!

StackG
    Great answer! People like you contribute a lot to make this site better. – rvignolo Oct 17 '20 at 03:24
    Good answer! But a minor typo: $dX_{t} = Y_{t}\, dt + \sigma_{X}\, dZ_{t}^{(1)}$ (not $-\lambda Y_{t}\, dt$). This leads to an answer of $\mathbb{E}_{t}[X_{t+T}-X_{t}] = \frac{1}{\lambda}Y_{t}(1-e^{-\lambda T})$. – Christopher K Nov 26 '20 at 17:34
  • Thanks, good spot. Let me correct that. – StackG Nov 28 '20 at 02:45

Great problem! First recall that

$$\begin{cases} dY_{t} = -\lambda Y_{t}\, dt + \sigma_{Y} \rho \, dZ_{t}^{(1)} + \sigma_{Y}\sqrt{1-\rho^{2}}\, dZ_{t}^{(2)} \\ Y_{0} = 0. \end{cases}$$ Use Itô calculus to show that $d(e^{\lambda t}Y_{t}) = \sigma_{Y}e^{\lambda t} dW_{t},$ where $W_{t} = \rho Z_{t}^{(1)} + \sqrt{1-\rho^{2}} Z_{t}^{(2)}$ is a standard Brownian motion (keep in mind that $d[W_{t},Z_{t}^{(1)}] =\rho\, dt$), and derive $$Y_{t} = \sigma_{Y}e^{-\lambda t} \int_{0}^{t} e^{\lambda s}\, dW_{s}$$ as well as $$X_{t} = \int_{0}^{t} Y_{s}\, ds + \sigma_{X}Z_{t}^{(1)}.$$ We will first compute $\mathbb{E}_{t}[Y_{s}]$ and $\mathbb{E}_{t}[X_{t+T}-X_{t}]$. Since $e^{\lambda t}Y_{t}$ is a martingale, $$\mathbb{E}_{t}[e^{\lambda s}Y_{s}] = e^{\lambda t}Y_{t} \implies \mathbb{E}_{t}[Y_{s}] = Y_{t}e^{\lambda (t-s)},$$ and so \begin{align*} \mathbb{E}_{t}[X_{t+T}-X_{t}] &= \mathbb{E}_{t} \left [\int_{t}^{t+T} Y_{s}\, ds + \sigma_{X}(Z_{t+T}^{(1)}-Z_{t}^{(1)}) \right ] \\ &= \int_{t}^{t+T} \mathbb{E}_{t}[Y_{s}]\, ds \\ &= \int_{t}^{t+T} e^{\lambda(t-s)}Y_{t}\, ds \\ &= \frac{1}{\lambda}Y_{t}(1-e^{-\lambda T}). 
\end{align*} Now we use the Itô isometry to compute $\mathbb{E}_{t}[Y_{s}Y_{r}]$: \begin{align*} \mathbb{E}_{t} \left [e^{\lambda (s+r)}Y_{s}Y_{r} \right ] &= \mathbb{E}_{t} \left [\left (e^{\lambda t}Y_{t} + \sigma_{Y} \int_{t}^{s} e^{\lambda u}\, dW_{u} \right )\cdot \left (e^{\lambda t}Y_{t} + \sigma_{Y} \int_{t}^{r} e^{\lambda v}\, dW_{v} \right ) \right ] \\ &= e^{2\lambda t}Y_{t}^{2} + \sigma_{Y}^{2} \mathbb{E}_{t} \left [\left (\int_{t}^{\min\{s,r\}} e^{\lambda u}\, dW_{u} \right )^{2} \right ] \\ &= e^{2\lambda t}Y_{t}^{2} + \sigma_{Y}^{2} \int_{t}^{\min\{s,r\}} e^{2\lambda u}\, du \\ &= e^{2\lambda t} \left (Y_{t}^{2} + \frac{\sigma_{Y}^{2}}{2\lambda}(e^{2\lambda (\min\{s,r\}-t)}-1) \right ) \end{align*} and so $$\mathbb{E}_{t} [Y_{s}Y_{r}] = e^{-\lambda (s+r-2t)}Y_{t}^{2} + \frac{\sigma_{Y}^{2}}{2\lambda}(e^{-\lambda|s-r|}-e^{-\lambda (s+r-2t)})$$ as well as $$\mathrm{cov}(Y_{s},Y_{r}) = \frac{\sigma_{Y}^{2}}{2\lambda}(e^{-\lambda |s-r|}-e^{-\lambda (s+r-2t)}).$$ Next, calculate for $s \leq r$ \begin{align*} \mathbb{E}_{t} [e^{\lambda s}Y_{s}(Z_{r}^{(1)}-Z_{t}^{(1)})] &= \mathbb{E}_{t} \left [e^{\lambda t}Y_{t}(Z_{r}^{(1)}-Z_{t}^{(1)}) + \sigma_{Y} \int_{t}^{s} e^{\lambda u}\, dW_{u} \cdot \int_{t}^{r} dZ_{v}^{(1)} \right ] \\ &= \sigma_{Y} \rho \int_{t}^{s} e^{\lambda u} \, du \\ &= \frac{\sigma_{Y}\rho}{\lambda}(e^{\lambda s}-e^{\lambda t}) \end{align*} and $$\mathbb{E}_{t} [Y_{s}(Z_{r}^{(1)}-Z_{t}^{(1)})] = \frac{\sigma_{Y}\rho}{\lambda}(1-e^{-\lambda (s-t)}).$$ Finally, expanding the square of the sum (so the cross term enters with a positive sign), \begin{align*} &\mathbb{E}_{t}[(X_{t+T}-X_{t})^{2}] \\ &\quad = \mathbb{E}_{t} \left [\left (\int_{t}^{t+T} Y_{s}\, ds \right )^{2} + 2\sigma_{X}(Z_{t+T}^{(1)}-Z_{t}^{(1)})\left (\int_{t}^{t+T} Y_{s}\, ds \right ) + \sigma_{X}^{2}(Z_{t+T}^{(1)}-Z_{t}^{(1)})^{2} \right ] \\ &\quad = \int_{t}^{t+T}\int_{t}^{t+T} \mathbb{E}_{t}[Y_{s}Y_{r}]\, ds\, dr + 2\sigma_{X}\int_{t}^{t+T} \mathbb{E}_{t}[Y_{s}(Z_{t+T}^{(1)}-Z_{t}^{(1)})]\, ds + \sigma_{X}^{2}T \\ &\quad = \int_{t}^{t+T}\int_{t}^{t+T} e^{-\lambda (s+r-2t)}Y_{t}^{2} + \frac{\sigma_{Y}^{2}}{2\lambda}(e^{-\lambda|s-r|}-e^{-\lambda (s+r-2t)})\, ds\, dr \\ &\qquad + \frac{2\sigma_{X}\sigma_{Y}\rho}{\lambda} \int_{t}^{t+T} (1-e^{-\lambda (s-t)})\, ds + \sigma_{X}^{2}T \\ &\quad = \frac{1}{\lambda^{2}}Y_{t}^{2}(1-e^{-\lambda T})^{2} + \frac{\sigma_{Y}^{2}}{2\lambda}\cdot \frac{2(\lambda T + e^{-\lambda T}-1)}{\lambda^{2}} - \frac{\sigma_{Y}^{2}}{2\lambda^{3}}(1-e^{-\lambda T})^{2}\\ &\qquad + \frac{2\sigma_{X}\sigma_{Y}\rho}{\lambda} \left (T - \frac{1}{\lambda}(1-e^{-\lambda T}) \right ) + \sigma_{X}^{2}T. \end{align*} At last, we use the asymptotics $\frac{1}{\kappa}(1-e^{-\kappa T}) \sim T - \frac{\kappa}{2}T^{2}$ as $T \rightarrow 0$ to get \begin{align*} & \frac{1}{T} \mathbb{E}_{t}[(X_{t+T}-X_{t})^{2}] \\ &\quad = \frac{1}{T} \left (Y_{t}^{2}T^{2} + \frac{\sigma_{Y}^{2}T^{2}}{2\lambda } - \frac{\sigma_{Y}^{2}T^{2}}{2\lambda } + \frac{2\sigma_{X}\sigma_{Y}\rho}{\lambda} \cdot \frac{\lambda T^{2}}{2} + \sigma_{X}^{2}T \right ) + \mathcal{O}(T^{2}) \\ &\quad = Y_{t}^{2}T + \sigma_{X}^{2} + \sigma_{X}\sigma_{Y}\rho T + \mathcal{O}(T^{2}), \end{align*} so $\frac{1}{T}\mathbb{E}_{t}[(X_{t+T}-X_{t})^{2}] \rightarrow \sigma_{X}^{2}$ as $T \rightarrow 0$. Dividing by $T$ and letting $T \rightarrow \infty$ instead, the bounded terms vanish and $\frac{1}{T}\mathbb{E}_{t}[(X_{t+T}-X_{t})^{2}] \rightarrow \sigma_{X}^{2} + \frac{2\sigma_{X}\sigma_{Y}\rho}{\lambda} + \frac{\sigma_{Y}^{2}}{\lambda^{2}}$.
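Both limits can be verified symbolically from this closed form (sympy assumed available; the cross term enters with a positive sign, since the increment is the sum of the drift integral and the diffusion term):

```python
import sympy as sp

T, lam, sx, sy, rho, Y = sp.symbols('T lambda sigma_X sigma_Y rho Y_t', positive=True)

# Closed-form conditional second moment of X_{t+T} - X_t
E2 = (Y**2/lam**2 * (1 - sp.exp(-lam*T))**2
      + sy**2/lam**3 * (lam*T + sp.exp(-lam*T) - 1)
      - sy**2/(2*lam**3) * (1 - sp.exp(-lam*T))**2
      + 2*rho*sx*sy/lam * (T - (1 - sp.exp(-lam*T))/lam)
      + sx**2*T)

# Short-run limit: E2/T -> sigma_X^2 as T -> 0
assert sp.simplify(sp.limit(E2/T, T, 0) - sx**2) == 0

# Long-run limit: E2/T -> sigma_X^2 + 2*rho*sigma_X*sigma_Y/lam + sigma_Y^2/lam^2
long_run = sx**2 + 2*rho*sx*sy/lam + sy**2/lam**2
assert sp.simplify(sp.limit(E2/T, T, sp.oo) - long_run) == 0
```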