Suppose we have sampled a Brownian motion $(B_t)_{t\ge0}$ at $0=t_0<t_1<\cdots$. How can we obtain a sample at the midpoints of $[t_{i-1},t_i]$ from those samples? I've read that this is possible by a Brownian bridge, but wasn't able to figure out how exactly.
-
@Xi'an Thank you for your comment, but how would I do that numerically? I guess you want me to consider $\left(B_{t_{i-1}},B_{\frac{t_{i-1}+t_i}2},B_{t_i}\right)$, right? – 0xbadf00d Dec 16 '22 at 15:55
-
All you need for simulating a Normal value are the mean and variance of its distribution. See https://stats.stackexchange.com/questions/30588/deriving-the-conditional-distributions-of-a-multivariate-normal-distribution/30600#30600 for a formula. You don't need to simulate all the midpoints at once, because the value at the midpoint of $[t_{i-1},t_i],$ conditional on $(B_{t_{i-1}}, B_{t_i}),$ is independent of the values at all times $t \lt t_{i-1}$ (and, by the same Markov argument, at all times $t \gt t_i$). Thus, you can generate them one at a time in any order. – whuber Dec 16 '22 at 16:03
-
@whuber Yes, I know that I only need the mean and variance. However, I still have trouble understanding this. We somehow need to sample $B_{\frac{t_{i-1}+t_i}2}$ knowing only $B_{t_{i-1}}$ and $B_{t_i}$, right? Why does it help to condition on those two? And what variable are we conditioning on at all? – 0xbadf00d Dec 16 '22 at 16:08
-
Is your question, "how can we sample from a Brownian bridge?" or is it "how is it that we can use a Brownian bridge to sample a Brownian motion given two other points of that Brownian motion?" – Sextus Empiricus Dec 17 '22 at 11:00
-
@SextusEmpiricus It is the latter. But as I wrote under the other answer, the reason why I'm asking is this paper: arxiv.org/pdf/0909.2438.pdf. I'm trying to understand what's described on p. 11; it begins at the paragraph starting with "To do better we will use non-uniform". – 0xbadf00d Dec 17 '22 at 11:37
-
"how is it that we can use a Brownian bridge to sample a Brownian motion given two other points of that Brownian motion?" ... Because the Brownian bridge is defined as a Brownian motion given/conditional two points of the Brownian motion. – Sextus Empiricus Dec 17 '22 at 13:39
2 Answers
The triplet $(B(t_1),B(t_2),B(t_3))$ is jointly distributed as a Normal vector $$\mathcal N_3(0_3,\Sigma)\quad\text{with}\quad\Sigma=\left[\begin{matrix} t_1 &t_1 &t_1\\ t_1 &t_2 &t_2\\ t_1 &t_2 &t_3 \end{matrix}\right] \quad\text{when}\ \ \ t_1<t_2<t_3.$$ Hence the distribution of $B(t_2)$ conditional on $(B(t_1),B(t_3))$ [which are assumed known, by the very definition of a Brownian bridge] is given by $$B(t_2)\mid(B(t_1),B(t_3))=(x,y)\sim\mathcal N_1\left(\frac{(t_3 − t_2)x + (t_2-t_1)y}{t_3 − t_1} ,\ \frac{(t_2 − t_1)(t_3 − t_2)}{t_3 − t_1}\right).$$
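For concreteness, here is a minimal sketch (Python/NumPy; the function name and the example values are my own choices) of drawing $B(t_2)$ from this conditional distribution given the already-sampled endpoint values:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_bridge_point(t1, x, t3, y, t2):
    """Sample B(t2) given B(t1) = x and B(t3) = y, for t1 < t2 < t3,
    from the conditional Normal distribution stated above."""
    mean = ((t3 - t2) * x + (t2 - t1) * y) / (t3 - t1)
    var = (t2 - t1) * (t3 - t2) / (t3 - t1)
    return mean + np.sqrt(var) * rng.standard_normal()

# e.g. fill in the midpoint of [1.0, 2.0] given B(1.0) = 0.3 and B(2.0) = -0.1
b_mid = sample_bridge_point(1.0, 0.3, 2.0, -0.1, 1.5)
print(b_mid)
```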
-
Could you please tell me how we prove that $(B_{t_1},B_{t_2},B_{t_3})$ is normally distributed? – 0xbadf00d Dec 16 '22 at 19:22
-
Meanwhile, I've figured it out: Let $k\in\mathbb N$, $0\le t_1<\cdots<t_k$ and $$X:=\left(B_{t_1},\ldots,B_{t_k}\right).$$ If $\lambda\in\mathbb R^k$, then $$\langle\lambda,X\rangle=\sum_{i=1}^k\sum_{j=i}^k\lambda_j\left(B_{t_i}-B_{t_{i-1}}\right),$$ where $t_0:=0$. Since the increments $\left(B_{t_1}-B_{t_0},\ldots,B_{t_k}-B_{t_{k-1}}\right)$ are independent Gaussians, every such linear combination is Gaussian, so $X$ is normally distributed. This should be correct, but please correct me if I'm wrong. – 0xbadf00d Dec 16 '22 at 19:47
-
Moreover, if $1\le i<j\le k$, then $$\operatorname{Cov}[X_i,X_j]=\underbrace{\operatorname E\left[B_{t_i}\left(B_{t_j}-B_{t_i}\right)\right]}_{=0}+\operatorname E\left[B_{t_i}^2\right]=t_i,$$ since $B_{t_i}$ and $B_{t_j}-B_{t_i}$ are independent. – 0xbadf00d Dec 16 '22 at 19:55
-
So, in the special case $t_2=\frac{t_1+t_3}2$, we sample from $$\mathcal N\left(\frac{x_1+x_3}2,\frac{t_3-t_1}4\right).$$ But in which sense is this a sample from $B_{t_2}$? I thought the whole idea would be the following: we've sampled $B_{t_1},B_{t_2},\ldots,$ and now we want to refine one of the time intervals, say $[t_{i-1},t_i]$, without sampling the whole thing again. Or is this wrong? (The reason why I'm asking is this paper: https://arxiv.org/pdf/0909.2438.pdf. I'm trying to understand what's described on p. 11; it begins at the paragraph starting with "To do better we will use non-uniform".) – 0xbadf00d Dec 16 '22 at 20:44
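As a sketch of the refinement described in this comment (Python/NumPy; the coarse grid, seed, and array names are my own choices), one can insert midpoint samples into an already-sampled path one interval at a time:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical coarse sample of a Brownian motion at times t (with B_0 = 0)
t = np.array([0.0, 1.0, 2.0, 4.0])
B = np.concatenate([[0.0], np.cumsum(np.sqrt(np.diff(t)) * rng.standard_normal(len(t) - 1))])

# Refine every interval [t_{i-1}, t_i] at its midpoint with the bridge formula:
# midpoint value ~ N((B_{i-1} + B_i) / 2, (t_i - t_{i-1}) / 4)
t_mid = (t[:-1] + t[1:]) / 2
B_mid = (B[:-1] + B[1:]) / 2 + np.sqrt(np.diff(t) / 4) * rng.standard_normal(len(t_mid))

# Interleave old and new points to obtain the refined path
order = np.argsort(np.concatenate([t, t_mid]))
t_fine = np.concatenate([t, t_mid])[order]
B_fine = np.concatenate([B, B_mid])[order]
```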
-
Okay, in my understanding, this would only be a refinement if we could show the following: Let $t>s>r\ge0$ and $W$ be a real-valued random variable with $$\operatorname P\left[W\in\;\cdot\;\mid B_r,B_t\right]=\mathcal N_{\frac{(t-s)(s-r)}{t-r}}\left(\frac{(t-s)B_r+(s-r)B_t}{t-r},\;\cdot\;\right)$$ (this denotes the normal distribution kernel with variance $\frac{(t-s)(s-r)}{t-r}$); then $(B_r,W,B_t)$ is jointly Gaussian and $$(B_r,W,B_t)\sim(B_r,B_s,B_t).$$ Can we actually show this? – 0xbadf00d Dec 16 '22 at 23:03
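One can at least check this claim numerically. A rough Monte Carlo sketch (Python/NumPy; the times $r,s,t$ and the sample size are arbitrary choices) comparing the empirical covariance of $(B_r,W,B_t)$ with the theoretical covariance $\min(t_i,t_j)$ of $(B_r,B_s,B_t)$:

```python
import numpy as np

rng = np.random.default_rng(2)
r, s, t = 0.5, 1.2, 2.0
n = 200_000

# (B_r, B_t) simulated directly from independent increments
Br = np.sqrt(r) * rng.standard_normal(n)
Bt = Br + np.sqrt(t - r) * rng.standard_normal(n)

# W drawn from the bridge kernel given (B_r, B_t)
mean = ((t - s) * Br + (s - r) * Bt) / (t - r)
var = (t - s) * (s - r) / (t - r)
W = mean + np.sqrt(var) * rng.standard_normal(n)

# Empirical covariance of (B_r, W, B_t) vs. the theoretical matrix (min(t_i, t_j))
print(np.cov(np.vstack([Br, W, Bt])))
print(np.array([[r, r, r],
                [r, s, s],
                [r, s, t]]))
```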
Could you please tell me how we prove that $(B_{t_1},B_{t_2},B_{t_3})$ is normally distributed?
The joint normal distribution stems from two aspects of Brownian motion:
1. The distribution of the increment/displacement over a given time interval is normal (described by the diffusion equation): $$f(x,\Delta t) = \frac{1}{\sqrt{4\pi D\,\Delta t}}\, e^{-\frac{(x - x_0)^2}{4 D \Delta t}}$$ Einstein, Albert. "Über die von der molekularkinetischen Theorie der Wärme geforderte Bewegung von in ruhenden Flüssigkeiten suspendierten Teilchen." Annalen der Physik 4 (1905).
2. The displacements in non-overlapping time intervals are independent (Markov property: the future changes depend only on the current state and not on past states/the history of the system).
So you can describe the changes in two subsequent intervals, $B_{t_2}-B_{t_1} = \epsilon_a$ and $B_{t_3}-B_{t_2} = \epsilon_b$, as independent normally distributed variables (with variances depending on the lengths of the time intervals and the diffusion constant). The positions can then be written as $B_{t_2} = B_{t_1} + \epsilon_a$ and $B_{t_3} = B_{t_1} + \epsilon_a + \epsilon_b$, which defines the joint normal distribution of $B_{t_2}$ and $B_{t_3}$ (for a given $B_{t_1}$), from which you can derive the conditional distribution of $B_{t_2}$ given $B_{t_3}$, as spelled out below.
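A sketch of that conditioning step (taking the normalisation $2D=1$, so that $\operatorname{Var}(\epsilon_a)=t_2-t_1$ and $\operatorname{Var}(\epsilon_b)=t_3-t_2$): conditioning on $B_{t_3}$ for a fixed $B_{t_1}$ amounts to conditioning $\epsilon_a$ on the sum $\epsilon_a+\epsilon_b=B_{t_3}-B_{t_1}$, and for independent Gaussians $$\operatorname E[\epsilon_a\mid\epsilon_a+\epsilon_b=c]=c\,\frac{\operatorname{Var}\epsilon_a}{\operatorname{Var}\epsilon_a+\operatorname{Var}\epsilon_b},\qquad \operatorname{Var}[\epsilon_a\mid\epsilon_a+\epsilon_b]=\frac{\operatorname{Var}\epsilon_a\,\operatorname{Var}\epsilon_b}{\operatorname{Var}\epsilon_a+\operatorname{Var}\epsilon_b},$$ which gives $$B_{t_2}\mid(B_{t_1},B_{t_3})\sim\mathcal N\left(B_{t_1}+\frac{t_2-t_1}{t_3-t_1}\,(B_{t_3}-B_{t_1}),\ \frac{(t_2-t_1)(t_3-t_2)}{t_3-t_1}\right),$$ the same distribution as in the other answer.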
Above we simulated 200 Brownian motions as cumulative sums of Gaussian white noise. On the left you see the paths; on the right, the joint distribution of the positions at two different time points. Highlighted are the paths conditional on the position of $B_{t_3}$. The position of $B_{t_2}$ given the position of $B_{t_3}$ can then be derived from the joint distribution of the two.
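A minimal sketch of that simulation (Python/NumPy/Matplotlib; grid size, seed, and the two time indices are my own choices, and the conditional highlighting is omitted):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)
n_paths, n_steps, dt = 200, 100, 0.01
times = np.arange(1, n_steps + 1) * dt

# 200 Brownian motions as cumulative sums of Gaussian white noise
paths = np.cumsum(np.sqrt(dt) * rng.standard_normal((n_paths, n_steps)), axis=1)

# Two time indices playing the roles of t_2 and t_3
i2, i3 = 49, 99

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.plot(times, paths.T, lw=0.5, alpha=0.4)      # left: the paths
ax1.set_xlabel("t"); ax1.set_ylabel("$B_t$")
ax2.scatter(paths[:, i2], paths[:, i3], s=5)     # right: joint distribution at (t_2, t_3)
ax2.set_xlabel("$B_{t_2}$"); ax2.set_ylabel("$B_{t_3}$")
plt.show()
```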
