Questions tagged [convergence]

Convergence generally means that a sequence of sample quantities approaches a constant as the sample size tends to infinity. Convergence also describes the property of an iterative algorithm of stabilizing at some target value.

Convergence refers to the study of the behavior of certain sample quantities as the sample size approaches infinity. Two important types of convergence are convergence in probability and almost sure convergence.

Convergence in probability
A sequence of random variables $X_1, X_2, \ldots$ converges in probability to a random variable $X$ if $$\lim_{n \to \infty}P(|X_n-X|\leq\epsilon)=1 $$ for every $\epsilon > 0$. This means that, as $n$ tends to infinity, almost all of the probability mass of $X_n$ becomes concentrated in a small interval around $X$. This type of convergence is used in the weak law of large numbers.
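As an illustration, here is a minimal Monte Carlo sketch in Python with NumPy (the Bernoulli parameter $p = 0.3$, the tolerance $\epsilon = 0.1$, and the sample sizes are arbitrary choices): it estimates $P(|X_n - p| \leq \epsilon)$ for the sample mean $X_n$ of $n$ i.i.d. Bernoulli draws, which should approach 1 as $n$ grows, exactly as the weak law of large numbers predicts.

```python
import numpy as np

rng = np.random.default_rng(0)
p, eps = 0.3, 0.1     # true mean and the epsilon from the definition
trials = 2000         # Monte Carlo replications per sample size

for n in [10, 100, 1000, 10000]:
    # Sample means of n i.i.d. Bernoulli(p) draws, one per replication.
    means = rng.binomial(n, p, size=trials) / n
    # Fraction of replications with |X_n - p| <= eps estimates the probability.
    prob = np.mean(np.abs(means - p) <= eps)
    print(f"n = {n:5d}   P(|X_n - p| <= {eps}) ~ {prob:.3f}")
```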

Almost sure convergence
Similar to the previous statement, a sequence of random variables $X_1, X_2, \ldots$ converges almost surely to a random variable $X$ if $$P(\lim_{n \to \infty}|X_n-X|< \epsilon)=1$$ for every $\epsilon > 0$. Here, compared to the previous case, the limit is achieved with probability one. Almost sure convergence is used in the strong law of large numbers, and it implies convergence in probability (note that convergence in probability does not imply almost sure convergence).
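A standard example separating the two notions: let $X_1, X_2, \ldots$ be independent with $P(X_n = 1) = 1/n$ and $P(X_n = 0) = 1 - 1/n$. Then for every $0 < \epsilon < 1$, $$P(|X_n - 0| > \epsilon) = \frac{1}{n} \to 0,$$ so $X_n \to 0$ in probability; but since $\sum_n 1/n = \infty$ and the $X_n$ are independent, the second Borel–Cantelli lemma gives $P(X_n = 1 \text{ infinitely often}) = 1$, so $X_n$ does not converge to $0$ almost surely.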

1165 questions
7
votes
2 answers

Limit of a convolution and sum of distribution functions

I need to prove an induction step. $X_i$ are independently distributed with distribution functions $F_i$ satisfying $1-F_i(x)=x^{-\alpha}L_{i}(x)$, where $\alpha \geq 0$ and $L_{i}(x)$ is regularly varying (If the limit…
Chris
  • 1,339
6
votes
2 answers

Convergence in $L_1$ counterexample

I am looking for an example of a sequence of r.v.s $X_n$ that converges to $X$ in $L_1$, but such that $X_n^2$ does not converge to $X^2$ in $L_1$. Does anyone have something in mind?
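One classical construction of this kind: on $[0,1]$ with Lebesgue measure, take $X_n = \sqrt{n}\,\mathbf{1}_{[0,1/n]}$ and $X = 0$. Then $$E|X_n - X| = \sqrt{n}\cdot\frac{1}{n} = \frac{1}{\sqrt{n}} \to 0,$$ so $X_n \to X$ in $L_1$, while $$E|X_n^2 - X^2| = n\cdot\frac{1}{n} = 1 \not\to 0,$$ so $X_n^2$ does not converge to $X^2$ in $L_1$.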
5
votes
2 answers

Does convergence in mean imply convergence almost surely if the limit is zero and the sequence is nonnegative?

Say $X_k$ is a non-negative sequence and it is known that it converges in mean to zero. It feels like it should also converge almost surely, due to the fact that the only value a non-negative random variable can take and still average to zero…
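For what it's worth, the classical "typewriter" sequence suggests the answer is negative: on $[0,1]$ with Lebesgue measure, enumerate the dyadic intervals $[j2^{-k}, (j+1)2^{-k}]$, $j = 0, \ldots, 2^k - 1$, $k = 0, 1, \ldots$, and let $X_n$ be the indicator of the $n$-th interval in this enumeration. Then $X_n \geq 0$ and $E[X_n] = 2^{-k} \to 0$, yet for every $s \in [0,1]$ we have $X_n(s) = 1$ infinitely often, so $X_n(s)$ converges for no $s$. Convergence in mean does, however, give almost sure convergence along a subsequence.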
4
votes
1 answer

In the proof of the delta method (about little $o_p$)

In the proof of the delta method related to convergence in distribution, I couldn't understand the statement below. When $\sqrt{n} (X_n - \mu) \rightarrow^D N(0, \sigma^2)$, \begin{equation} f(X_n) = f(\mu) + f'(\mu)(X_n -\mu ) + o_p…
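The usual reading of this step: since $\sqrt{n}(X_n - \mu) \rightarrow^D N(0, \sigma^2)$, we have $X_n - \mu = O_p(n^{-1/2})$, so a Taylor remainder that is $o_p(|X_n - \mu|)$ is also $o_p(n^{-1/2})$. Multiplying the expansion by $\sqrt{n}$ then yields $$\sqrt{n}\,(f(X_n) - f(\mu)) = f'(\mu)\,\sqrt{n}\,(X_n - \mu) + o_p(1) \rightarrow^D N(0, \sigma^2 f'(\mu)^2).$$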
4
votes
1 answer

Limiting argument within limit relation

Suppose we know that there exist sequences $a_n$ and $b_n$ such that $$\lim_{n \to \infty} F^{n}(a_n x + b_n)=H(x)$$ with $F$ a distribution function and $H$ a continuous distribution function. Now let $x_n$ be a sequence with limit $x^{*}$.…
Joogs
  • 809
3
votes
1 answer

Convergence in probability and inequality

Is it true that if $$ 0 \leq x_n \leq y_n \xrightarrow{p} 0,$$ then $$x_n \xrightarrow{p} 0?$$ If this is true, I would be very grateful if you could give me some references where I can find the proof.
Kolibris
  • 615
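For the inequality question above, the claim does hold, by a one-line squeeze argument: for any $\epsilon > 0$, $0 \leq x_n \leq y_n$ implies $\{x_n > \epsilon\} \subseteq \{y_n > \epsilon\}$, hence $$P(x_n > \epsilon) \leq P(y_n > \epsilon) \to 0,$$ so $x_n \xrightarrow{p} 0$.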
3
votes
4 answers

Absolute convergence requirement for Expectation

In calculating the expectation of a discrete random variable $X$, we require not only that $\sum_i x_iP(X=x_i)$ converges, but that it converges absolutely. I understand this requirement as probably stemming from the fact that a rearrangement of…
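A concrete example of why absolute convergence is needed: let $P(X = (-1)^{k+1} 2^k / k) = 2^{-k}$ for $k = 1, 2, \ldots$. Then $$\sum_k x_k P(X = x_k) = \sum_k \frac{(-1)^{k+1}}{k} = \ln 2$$ converges, but only conditionally, since $\sum_k 1/k = \infty$; by the Riemann rearrangement theorem the terms can be reordered to sum to any value, so $E[X]$ cannot be defined consistently without absolute convergence.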
3
votes
1 answer

Continuous mapping theorem for convergence in probability

I have seen the continuous mapping theorem (CMT) used to justify the convergence in probability of the difference of two sequences of random variables when it is known that each sequence converges in probability: If $X_{1}, \dotsb, X_{n} \sim…
half-pass
  • 3,740
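For the question above, the standard argument runs as follows: if $X_n \xrightarrow{p} X$ and $Y_n \xrightarrow{p} Y$, then the pair converges jointly, $(X_n, Y_n) \xrightarrow{p} (X, Y)$, and since $g(x, y) = x - y$ is continuous, the continuous mapping theorem gives $$X_n - Y_n = g(X_n, Y_n) \xrightarrow{p} X - Y.$$ Joint convergence follows from the marginal convergences for convergence in probability, which is what makes this work; the analogous joint step can fail for convergence in distribution to non-degenerate limits.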
2
votes
2 answers

Convergence in probability example in Casella-Berger

In Casella-Berger, Statistical Inference, page 234, Example 5.5.8, they define a sequence of uniform random variables $X_1, X_2, \cdots, X_n, \cdots$ such that $X_i \sim U(0,1)$ and $s \in [0,1]$, and: $X_1(s) = s + I_{[0,1]}(s)$, $X_2(s) = s +…
t-student
  • 113
2
votes
0 answers

How (and under what conditions) can convergence in distribution be determined

I am studying a population of individuals over many generations. Each individual can take on a single trait from a range of several traits. In each generation I calculate the share of the most common trait in the population. The result looks…
Peter R
  • 31
2
votes
1 answer

Slutsky's theorem

If I have a set of $N$ i.i.d. random variables $X_i$ with sample mean $\bar{X}=\frac{1}{N}\sum_i^N X_i$, does Slutsky's theorem http://en.wikipedia.org/wiki/Slutsky%27s_theorem imply that $$ E\left[\frac{X_i X_i}{\bar{X}}\right] =…
mrkprc1
  • 93
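For reference, Slutsky's theorem states that if $X_n \xrightarrow{d} X$ and $Y_n \xrightarrow{p} c$ for a constant $c$, then $X_n + Y_n \xrightarrow{d} X + c$, $X_n Y_n \xrightarrow{d} cX$, and $X_n / Y_n \xrightarrow{d} X/c$ when $c \neq 0$. It concerns convergence in distribution only, so by itself it does not license conclusions about expectations of ratios such as the one above.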
1
vote
1 answer

Convergence of squared sample average

If $y_{1}, y_{2}, \ldots, y_{N}$ form a sample of independent standard-normally distributed random variables and $\bar{y}$ is the sample average, is it correct to say that $$\bar{y}^2 \overset{p}{\rightarrow} E[\bar{y}^2] =0?$$
user407052
  • 11
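For the question above: here $\bar{y} \sim N(0, 1/N)$, so $E[\bar{y}^2] = 1/N$, which is nonzero for finite $N$ but tends to $0$. The convergence claim itself is correct: by the weak law of large numbers $\bar{y} \xrightarrow{p} 0$, and the continuous mapping theorem applied to $t \mapsto t^2$ gives $$\bar{y}^2 \xrightarrow{p} 0.$$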
1
vote
1 answer

Link between convergence in distribution and almost sure convergence

Let $X_n$, $Y_n$, and $X$ be random vectors. If $X_n \xrightarrow[]{d} X$ and $X_n - Y_n \xrightarrow[]{a.s.} 0$, can we prove that $Y_n \xrightarrow[]{d} X$?
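A standard route: almost sure convergence implies convergence in probability, so $X_n - Y_n \xrightarrow[]{p} 0$. Writing $Y_n = X_n - (X_n - Y_n)$ and applying Slutsky's theorem then gives $$Y_n \xrightarrow[]{d} X.$$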
1
vote
0 answers

Convergence in exponentially weighted total variation implies convergence in total variation?

Am I correct to interpret the following as (also) saying that if a sequence of log-concave densities converges in distribution, then it also converges in total variation? (Meaning that the weighted total variation from (c) is equal to total…
12345
  • 213
1
vote
0 answers

Order of convergence of a product of two convergent sequences

Let $a_n$ be a sequence that converges to $A$ with order $n^\alpha$, that is, $a_n = A + \mathcal{O}(n^\alpha)$, and let $b_n$ be another sequence that converges to $B$ with order $n^\beta$; i.e. $b_n = B + \mathcal{O}(n^\beta)$. What is the order of…
Morcus
  • 121
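For the question above, assuming $\alpha, \beta < 0$ (so that both sequences actually converge), expanding the product gives $$a_n b_n = AB + B\,\mathcal{O}(n^\alpha) + A\,\mathcal{O}(n^\beta) + \mathcal{O}(n^{\alpha+\beta}) = AB + \mathcal{O}(n^{\max(\alpha,\beta)}),$$ since the cross term $n^{\alpha+\beta}$ is dominated by the slower of the two rates.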