
In today's statistics class, we saw properties of the distribution function, defined by $F(x) = P(X\leq x)$ for a random variable $X$. One of these was:

$F(x)$ is right continuous.

The proof was:

Let $E_n$ be a decreasing sequence of events s.t. $\cap_{n=1}^{\infty} E_n = E$, with $E = \{X\leq x_0\}$. Then $$ F(x_0) = P(X\leq x_0) = P(E) = P(\cap_n E_n) = \lim_n P(E_n) = F(x_0^{+}). $$

Surely I must be missing something since I don't see how she jumped from the third equality to the next one. Could you tell me why those equalities in the proof are true?

laurab
  • I believe the proof assumes you are familiar with the monotonicity and continuity of probability measures. Have you checked previous lectures? – utobi Feb 03 '23 at 20:01
  • I'm not sure which properties you are referring to...Can you expand a bit? – laurab Feb 03 '23 at 20:02
  • The usual demonstration considers the complements of the $E_n$ and breaks them into their successive differences, so that $E = (\Omega\setminus E_1) \cup (E_1\setminus E_2) \cup (E_2\setminus E_3)\cup\cdots,$ and applies the sigma-additivity axiom of probability. The limit of the probabilities, which is just the limit of the values of the survival function $1-F,$ exists by the least upper bound property of real numbers. – whuber Feb 03 '23 at 20:04
  • Thank you @whuber, I think utobi's answer follows kind of what you are suggesting. – laurab Feb 03 '23 at 20:50
  • That's right. Please notice that the proof you quote isn't quite valid. A correct proof, as at the end of utobi's answer, begins with the definition of right continuity and goes on from there. (In fact, that answer implicitly uses a few simple limit theorems already, because a general definition of right continuity does not suppose that the sequence decreases steadily downwards, but only that (a) all numbers in the sequence are equal to or greater than $x_0$ and (b) their limit equals $x_0.$) – whuber Feb 03 '23 at 21:07
  • answered here: https://stats.stackexchange.com/questions/25238/how-can-i-prove-that-the-cumulative-distribution-function-is-right-continuous/25239#25239 – Zen Feb 03 '23 at 21:36
  • I voted to reopen this thread because it concerns a fact asserted without proof in the supposed duplicate. – whuber Feb 04 '23 at 16:07
  • @whuber could we close it again, until it is more clear which equality/step is exactly the culprit. "she jumped from the third equality to the next one" It should be made more clear which equality this question is exactly about. Is it the third equality that is unclear, or the fourth (next one)? Without more exact specification this question is too much like the question 'how to prove the right continuity' (and also how it is currently being answered). – Sextus Empiricus Feb 04 '23 at 16:18
  • @Sextus I would do that except the accepted answer here clarifies all those issues and distinguishes the two threads. – whuber Feb 04 '23 at 17:59
  • I am sorry to disagree with you Sextus, but the proof in the linked thread is not helpful to me, since it doesn't explain the ingredients needed to jump to the required result, whereas utobi's answer does so thoroughly, at least IMO. – laurab Feb 06 '23 at 20:29

1 Answer


First of all, let's write the desired result mathematically: $F$ is right continuous means that $F(x) = F(x^+)$ for all $x$, where $$ F(x^{+}) = \lim_{y\to x^+} F(y). $$

I'm going to assume you are familiar with the axioms of probability. To prove this result you need a couple of preliminary properties of the probability measure. To this end, let $(\Omega, \mathcal F, P)$ be the usual probability triple.

Property 1. If $E_i, E_j\in\mathcal F$ are such that $E_i\subseteq E_j$, then $P(E_j\cap E_i^c) = P(E_j) - P(E_i)$.

Proof. Since $E_j$ includes $E_i$, then $E_j = E_i \cup (E_j\cap E_i^c)$. This union forms a partition of $E_j$, thus $P(E_j) = P(E_i) + P(E_j\cap E_i^c)$ and the result follows.

$\blacksquare$
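Property 1 can be sanity-checked on a small finite probability space. Here is a minimal sketch in Python, assuming a uniform measure on a six-point sample space (the particular sets `E_i` and `E_j` are arbitrary choices for illustration):

```python
from fractions import Fraction

# Hypothetical finite probability space: Omega = {0,...,5} with the
# uniform measure P(A) = |A| / |Omega|.
omega = set(range(6))

def P(A):
    return Fraction(len(A), len(omega))

E_i = {0, 1}           # E_i is a subset of E_j
E_j = {0, 1, 2, 3}

lhs = P(E_j - E_i)     # P(E_j ∩ E_i^c), i.e. the set difference
rhs = P(E_j) - P(E_i)
print(lhs, rhs)        # both equal 1/3
```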

Property 2. If $E_1,E_2,\ldots$ is an expanding sequence of sets in $\mathcal F$, that is $E_n\subseteq E_{n+1}$ for all $n$, and $E = \cup_{n=1}^\infty E_n$, then $P(E) = \lim_{n\to\infty} P(E_n)$.

Proof. We can write $E = E_1 \cup (E_2\cap E_1^c)\cup (E_3\cap E_2^c)\cup\cdots\cup(E_n\cap E_{n-1}^c)\cup\cdots$, which is again a disjoint union (I leave checking this as an exercise to you). By one of the axioms of probability,

\begin{align*} P(E) &= P(E_1) + P(E_2\cap E_1^c) + P(E_3\cap E_2^c) + \cdots\\ &= P(E_1) + [P(E_2)- P(E_1)] + [P(E_3) - P(E_2)]+\cdots (\text{using Property 1})\\ &= \lim_n P(E_n), \end{align*} since the $n$-th partial sum telescopes to $P(E_n)$. $\blacksquare$
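As a quick numerical illustration of Property 2, take a hypothetical variable with $P(X = k) = 2^{-k}$ for $k = 1, 2, \ldots$ and the expanding events $E_n = \{X \le n\}$; then $\cup_n E_n$ has probability $1$, and $P(E_n) = 1 - 2^{-n}$ increases to it:

```python
# Continuity from below (Property 2), illustrated with a hypothetical
# variable X having P(X = k) = 2**-k for k = 1, 2, ...
# The events E_n = {X <= n} expand, and their union has probability 1.

def P_En(n):
    # P(X <= n) = sum of 2**-k for k = 1..n  (equals 1 - 2**-n)
    return sum(2.0**-k for k in range(1, n + 1))

probs = [P_En(n) for n in (1, 5, 20, 50)]
print(probs)  # increasing, approaching P(E) = 1
```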

Property 3. If $E_1,E_2,\ldots$ is a contracting sequence of sets in $\mathcal F$, that is $E_{n+1}\subseteq E_n$, for all $n$, and $E = \cap_n E_n$, then $P(E) = \lim_n P(E_n).$

Proof. If $E = \cap_{n=1}^\infty E_n$, then one of De Morgan's laws implies $E^c = \cup_n E_n^c$. Now $E_{n+1}\subseteq E_n$, hence $E_n^c\subseteq E_{n+1}^c$, so the sets $E_n^c$ form an expanding sequence. By Property 2, $\lim_n P(E_n^c) = P(E^c)$, that is, $1-\lim_n P(E_n) = 1 - P(E)$, and the result follows.

$\blacksquare$
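Property 3 can be illustrated with the complementary picture: for the same hypothetical distribution with $P(X = k) = 2^{-k}$, the tail events $E_n = \{X > n\}$ contract to the empty set, and $P(E_n) = 2^{-n} \to 0 = P(\emptyset)$:

```python
# Continuity from above (Property 3): contracting tail events E_n = {X > n}
# for a hypothetical variable with P(X = k) = 2**-k, k = 1, 2, ...

def P_tail(n):
    # P(X > n) = 2**-n for this distribution (a geometric tail)
    return 2.0**-n

tails = [P_tail(n) for n in (1, 5, 20, 50)]
print(tails)  # decreasing, approaching P(empty set) = 0
```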

Now the proof of the desired result is essentially there.

Proof. Let $x_1,x_2,\ldots$ be a sequence of real numbers strictly decreasing to $x_0$; that is, the $x_n$ approach $x_0$ from above. Furthermore, let $E_n = \{X\leq x_n\}$. The $E_n$ form a contracting sequence whose limit, i.e. the intersection, is $E =\{X\leq x_0\}$. To show that $\cap_n E_n = \{X\leq x_0\}$, reason as follows. If $\omega\in \{X\leq x_n\}$ for all $n$, then since $x_n\to x_0$, $\omega\in \{X\leq x_0\}$. Conversely, if $\omega\in \{X\leq x_0\}$, then since $x_0\leq x_n$ for all $n$, $\omega\in \{X\leq x_n\}$ for all $n$. Thus by Property 3,

$$ F(x_0) = P(E) = \lim_n P(E_n) = \lim_n F(x_n) = F(x_0^{+}). $$ $\blacksquare$
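Finally, here is a small sketch of what the result does and does not say, using a hypothetical Bernoulli$(1/2)$ CDF hard-coded for illustration: approaching the jump at $x_0 = 0$ from the right reproduces $F(x_0)$, while approaching from the left does not (CDFs need not be left continuous):

```python
# Hypothetical Bernoulli(1/2) CDF:
# F(x) = 0 for x < 0, 1/2 for 0 <= x < 1, 1 for x >= 1.

def F(x):
    if x < 0:
        return 0.0
    if x < 1:
        return 0.5
    return 1.0

x0 = 0.0
from_right = [F(x0 + 1.0 / n) for n in (1, 10, 1000)]  # x_n decreasing to x0
from_left = [F(x0 - 1.0 / n) for n in (1, 10, 1000)]   # approach from below
print(from_right[-1], F(x0), from_left[-1])  # 0.5 0.5 0.0
```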

User1865345
utobi