My instructor has mostly self-contained notes, and our textbook is mostly a reference.

She has it written that:

$$S_t = S_0e^{(\mu - \frac{\sigma^2}{2})t + \sigma W_t} \iff dS_t = S_t(\mu dt + \sigma dW_t)$$

I feel that basic differentiation of the exponential implies that on the right-hand side we should have $$dS_t = S_t \left(\left(\mu - \frac{\sigma^2}{2}\right)dt + \sigma dW_t \right).$$

I'd appreciate an explanation of why the $\frac{\sigma^2}{2}$ disappears in the differentiation, since this seems to be just the basic rule for differentiating an exponential.
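A quick numerical check of the claimed equivalence (a minimal sketch, assuming NumPy and arbitrary parameter values, not anything from the notes): evaluate the closed-form expression for $S_t$ along one simulated Brownian path, and run an Euler–Maruyama discretisation of $dS_t = S_t(\mu\, dt + \sigma\, dW_t)$ on the same Brownian increments; the two paths agree up to discretisation error.

```python
import numpy as np

# Hypothetical parameter values chosen only for illustration.
S0, mu, sigma, T, n = 100.0, 0.05, 0.2, 1.0, 100_000
dt = T / n
rng = np.random.default_rng(0)

# One Brownian path sampled on a fine grid: W_{t_i} = sum of the increments.
dW = rng.normal(0.0, np.sqrt(dt), n)
W = np.cumsum(dW)
t = np.linspace(dt, T, n)

# Closed-form expression from the notes: S_t = S_0 exp((mu - sigma^2/2) t + sigma W_t).
S_exact = S0 * np.exp((mu - 0.5 * sigma**2) * t + sigma * W)

# Euler-Maruyama discretisation of dS = S (mu dt + sigma dW),
# driven by the *same* Brownian increments.
S_em = np.empty(n)
S_prev = S0
for i in range(n):
    S_prev *= 1.0 + mu * dt + sigma * dW[i]
    S_em[i] = S_prev

# The two paths coincide up to discretisation error, which shrinks as n grows.
print("max relative difference:", np.max(np.abs(S_em - S_exact) / S_exact))
```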


1 Answer


Your derivation here is flawed because you are differentiating with respect to two variables and you do not take into account that $W_t$ is stochastic, and hence $S_t$ is as well.

So, to get $S_t$ from $dS_t$, you have to apply Ito's Lemma; see this question for details. This is the "classic" way you see it.

If you want to do it the other way round, set $S_t = f(W_t,t) = S_0 \exp \left[ (\mu - \frac{\sigma^2}{2}) t + \sigma W_t \right]$; applying Ito's Lemma then gives you:

$$ df(W_t,t) = \frac{\partial f}{ \partial t } dt + \frac{\partial f}{ \partial W_t } dW_t + \frac{1}{2} \frac{\partial^2 f}{ \partial W_t^2 } d \langle W\rangle_t$$
$$ df(W_t,t) = S_t \left(\mu - \frac{\sigma^2}{2} \right) dt + S_t \sigma dW_t + \frac{1}{2} S_t \sigma^2 dt$$
$$ df(W_t,t) = S_t \left[ \left(\mu - \frac{\sigma^2}{2} + \frac{\sigma^2}{2} \right) dt + \sigma dW_t \right]$$
$$ df(W_t,t) = S_t ( \mu dt + \sigma dW_t ) = dS_t$$
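For completeness, the partial derivatives substituted into the lemma above are (writing $f = f(W_t,t) = S_t$):

$$\frac{\partial f}{\partial t} = \left(\mu - \frac{\sigma^2}{2}\right) f, \qquad \frac{\partial f}{\partial W_t} = \sigma f, \qquad \frac{\partial^2 f}{\partial W_t^2} = \sigma^2 f,$$

together with $d\langle W\rangle_t = dt$, which is what produces the extra $\frac{1}{2}\sigma^2 S_t\, dt$ term.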

So, essentially, Ito's Lemma adds a term for the quadratic variation of a stochastic process, $d \langle W\rangle_t$, which is $0$ for deterministic processes. This is where the $\frac{\sigma^2}{2}$ appears (or disappears, depending on how you see it).
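You can also see this numerically. The sketch below (assuming NumPy, with arbitrary horizon and grid size) sums squared increments over a fine grid: for a Brownian path the sum is close to $T$, consistent with $d\langle W\rangle_t = dt$, while for a smooth deterministic path (here $\sin t$, chosen just as an example) it is of order $dt$ and vanishes as the grid is refined.

```python
import numpy as np

# Hypothetical horizon and grid size for illustration only.
T, n = 1.0, 1_000_000
dt = T / n
rng = np.random.default_rng(1)

# Sum of squared Brownian increments over [0, T] approximates the
# quadratic variation <W>_T, which equals T.
dW = rng.normal(0.0, np.sqrt(dt), n)
qv_brownian = np.sum(dW**2)

# Same sum for a smooth deterministic path: it shrinks to 0 as dt -> 0.
t = np.linspace(0.0, T, n + 1)
df = np.diff(np.sin(t))
qv_smooth = np.sum(df**2)

print("quadratic variation of W over [0, T]:", qv_brownian)  # close to T = 1
print("quadratic variation of sin(t):      ", qv_smooth)     # close to 0
```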
