
Let $(P_t)_{t \geq 0}$ be a Poisson process with rate $\lambda > 0$. I want to show that $\mathbb{E}(P_s|P_s) = P_s$. I compute $$\mathbb{E}(P_s|P_s=k) = \sum_{n \in \text{Img}(P_s)} n \cdot \mathbb{P}(P_s = n|P_s = k) = \sum_{n=0}^{\infty} n \cdot \frac{\mathbb{P}(P_s = n \cap P_s = k)}{\mathbb{P}(P_s = k)}$$ Here I am stuck, since I basically don't know how to compute $$\mathbb{P}(P_s = n|P_s = k) = \frac{\mathbb{P}(P_s = n \cap P_s = k)}{\mathbb{P}(P_s = k)}$$

1 Answer


This is true for any random variable; here's how you check it in the discrete case using conditional probabilities:

$\mathbb{P}(P_s=n \cap P_s=k)$ is easier than it looks. It's the probability that $P_s$ is equal to both $n$ and $k$. That's zero unless $n=k$, because $P_s$ can only have one value at a time.

So if $n\neq k$, then $\mathbb{P}(P_s=n \mid P_s=k)=0$.

If $n=k$, then $$\mathbb{P}(P_s=n \cap P_s=k)=\mathbb{P}(P_s=k \cap P_s=k)=\mathbb{P}(P_s=k),$$ so if $n=k$, $\mathbb{P}(P_s=n \mid P_s=k)=1$.

Now in the sum $$E[P_s|P_s=k]=\sum_n n\times\mathbb{P}(P_s=n \mid P_s=k)$$ the only term that doesn't vanish is the one with $n=k$, giving $k\times\mathbb{P}(P_s=k \mid P_s=k)=k\times 1=k$.
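As a sanity check, the collapse of the sum can be seen numerically: simulate $P_s$ and average it over the samples where $P_s = k$, which trivially gives $k$. This is a hedged sketch; the rate and time (`lam`, `s`) are arbitrary choices, not from the question.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, s = 2.0, 3.0  # hypothetical rate and time, chosen for illustration
samples = rng.poisson(lam * s, size=200_000)  # P_s ~ Poisson(lambda * s)

# E[P_s | P_s = k] estimated as the mean of P_s over samples with P_s = k.
# Every such sample equals k, so each conditional mean is exactly k.
for k in range(5):
    cond = samples[samples == k]
    print(k, cond.mean())
```

The conditional mean is exactly $k$ with no Monte Carlo error, because conditioning on $P_s = k$ leaves only samples equal to $k$.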

More generally, the conditional expectation $E[Y|X]$ is the best (squared-error) prediction of $Y$ when you know $X$. The best prediction of $P_s$ when you know $P_s$ is $P_s$ itself (related to this linked answer).
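The squared-error view can also be checked directly: predicting $P_s$ by itself has zero mean squared error, while any other predictor (for instance the constant mean) does worse. A minimal sketch, again with an arbitrary hypothetical rate:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.poisson(6.0, size=100_000)  # P_s with hypothetical lambda * s = 6

# Mean squared error of two candidate predictors of P_s given P_s:
mse_identity = np.mean((x - x) ** 2)         # predict P_s by itself -> 0
mse_mean = np.mean((x - x.mean()) ** 2)      # predict by the constant mean
print(mse_identity, mse_mean)
```

The identity predictor achieves MSE exactly 0, which no other function of $P_s$ can beat, so $E[P_s|P_s]=P_s$.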

Thomas Lumley