This is ericpony's blog

Tuesday, October 21, 2014

A note on the strong law of large numbers

I.i.d. r.v.s with zero mean

Consider a sequence of i.i.d. random variables $X_0, X_1, \dots$ with zero mean. When the variance is finite, the sequence $S_0, S_1, \dots$ of partial sums, $S_n = X_0 + \cdots + X_n$, converges to zero under a smaller normalisation than the one guaranteed by the Strong Law of Large Numbers (SLLN). In fact, we have the following theorem:

Theorem 2.5.1. If $Var(X_n)<\infty$ and $E[X_n]=0$, then $\lim_{n\rightarrow\infty} (S_n/n^p)=0$ a.s. for all $p>\frac{1}{2}$.

This theorem can be proven using convergence criteria for series together with the Borel-Cantelli lemma.
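As a quick numerical sanity check (not part of the proof), the following minimal NumPy sketch simulates i.i.d. $\pm1$ steps and prints $S_n/n^p$ for $p=0.6$; the step distribution, exponent, sample size and seed are arbitrary choices of mine.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 10**6
p = 0.6  # any exponent p > 1/2 should do (Theorem 2.5.1)

# i.i.d. +-1 steps: zero mean, finite variance
x = rng.choice([-1.0, 1.0], size=n)
s = np.cumsum(x)

for k in (10**2, 10**3, 10**4, 10**5, 10**6):
    print(f"n = {k:>8d}   S_n / n^p = {s[k - 1] / k**p: .5f}")
```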

Indep. r.v.s with zero mean and finite sum of variances

It is possible to use a denominator that grows more slowly than $n^p$. In fact, under a stronger assumption on the variances, no denominator is needed at all: the partial sums themselves converge.

Theorem 2.5.3. If $\sum_{n\ge1} Var(X_n)<\infty$ and $E[X_n]=0$, then $S_n$ converges a.s.

For sequences with $E[X_n]\neq0$, we can consider $Y_n=X_n-E[X_n]$ instead of $X_n$. Note that $Var(Y_n)=Var(X_n)$ and $E[Y_n]=0$. It then follows from the theorem that $\sum Y_n=\sum (X_n-E[X_n])$ converges a.s.
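For a concrete illustration (my own toy example, not from the book), take $X_n=\varepsilon_n/n$ with i.i.d. signs $\varepsilon_n=\pm1$. Then $E[X_n]=0$ and $\sum_{n\ge1}Var(X_n)=\sum_{n\ge1}n^{-2}<\infty$, so Theorem 2.5.3 says the random harmonic series converges a.s. A minimal NumPy sketch of one sample path:

```python
import numpy as np

rng = np.random.default_rng(1)

n = 10**6
signs = rng.choice([-1.0, 1.0], size=n)
x = signs / np.arange(1, n + 1)       # X_n = +-1/n, Var(X_n) = 1/n^2
s = np.cumsum(x)

# the partial sums should settle down to a (random) limit
for k in (10**3, 10**4, 10**5, 10**6):
    print(f"S_{k} = {s[k - 1]: .6f}")
```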

The ultimate characterisation of independent r.v.s with convergent partial sums is given by Kolmogorov.

Theorem 2.5.4. (Kolmogorov's Three-Series Theorem) Given independent $\{X_n\}$ and $A>0$, define $Y_n=X_n\cdot 1\{|X_n|\le A\}$. Then $S_n$ converges a.s. iff all of the following hold:
(i) $\sum \Pr\{|X_n|>A\}<\infty$, (ii) $\sum E[Y_n]$ converges, and (iii) $\sum Var(Y_n)<\infty$.

To prove sufficiency, note that the 3rd condition and Theorem 2.5.3 imply that $\sum_{n\ge1}(Y_n-E[Y_n])$ converges a.s.; combined with the 2nd condition, this shows that $\sum_{n\ge1}Y_n$ converges a.s. Finally, $\Pr\{X_n=Y_n$ for all large $n\}=1$ by the 1st condition and the Borel-Cantelli lemma. Hence $\sum_{n\ge1}X_n$ converges a.s. The converse direction requires the Lindeberg-Feller theorem.
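To see the three conditions in action on an unbounded example (again my own choice, not Durrett's), let $X_n=Z_n/n$ with $Z_n$ i.i.d. standard normal and take $A=1$. Then $\sum\Pr\{|X_n|>1\}=\sum\Pr\{|Z_n|>n\}<\infty$, $E[Y_n]=0$ by symmetry of the truncated Gaussian, and $Var(Y_n)\le E[X_n^2]=n^{-2}$, so all three series converge and $S_n$ converges a.s. A rough numerical sketch:

```python
import math
import numpy as np

rng = np.random.default_rng(2)
N = 10**5

# condition (i): P(|X_n| > 1) = P(|Z_n| > n) = erfc(n / sqrt(2))
tail_sum = sum(math.erfc(k / math.sqrt(2)) for k in range(1, N + 1))

# condition (ii): E[Y_n] = 0 by symmetry, so the second series is trivially 0
# condition (iii): Var(Y_n) <= E[X_n^2] = 1/n^2
var_bound = sum(1.0 / k**2 for k in range(1, N + 1))

print(f"sum P(|X_n| > 1)      ~ {tail_sum:.3e}")   # finite, so the first series converges
print(f"bound on sum Var(Y_n) ~ {var_bound:.3f}")  # below pi^2/6

# one simulated path of S_n, which should settle to a limit
x = rng.standard_normal(N) / np.arange(1, N + 1)
s = np.cumsum(x)
for k in (10**2, 10**3, 10**4, 10**5):
    print(f"S_{k} = {s[k - 1]: .6f}")
```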

Convergence of random series is linked to the SLLN by the following theorem.

Theorem 2.5.5. (Kronecker's Lemma) If $a_n\nearrow\infty$ and $\sum_{n\ge1}b_n/a_n$ converges, then $\frac{1}{a_N}\sum_{n=1}^{N} b_n \rightarrow 0$ as $N\rightarrow\infty$.
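To see how the lemma turns series convergence into the SLLN (a sketch of the standard link, with a toy choice of $b_n$): take $a_n=n$ and $b_n=\varepsilon_n$ i.i.d. $\pm1$. Theorem 2.5.3 gives that $\sum_{n\ge1}\varepsilon_n/n$ converges a.s., and Kronecker's Lemma then yields $\frac1N\sum_{n=1}^N\varepsilon_n\rightarrow0$ a.s., i.e. the SLLN for coin flips. A minimal NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 10**6

b = rng.choice([-1.0, 1.0], size=N)   # b_n = eps_n, i.i.d. +-1
a = np.arange(1, N + 1, dtype=float)  # a_n = n, increasing to infinity

series = np.cumsum(b / a)             # sum eps_n / n : converges a.s. (Theorem 2.5.3)
average = np.cumsum(b) / a            # Kronecker: (1/a_N) sum_{n<=N} b_n -> 0 a.s.

for k in (10**2, 10**4, 10**6):
    print(f"N = {k:>8d}   sum b_n/a_n = {series[k - 1]: .5f}   (1/a_N)*sum b_n = {average[k - 1]: .2e}")
```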

I.i.d. r.v.s with finite mean

The SLLN now follows from Kolmogorov's Three-Series Theorem and Kronecker's Lemma via a truncation argument.

Theorem 2.5.6. (SLLN) If $\{X_n\}$ are i.i.d. with $E|X_n|<\infty$ and $E[X_n]=\mu$, then $\lim S_n/n=\mu$ a.s.
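A minimal simulation sketch of the statement (the Exponential(1) distribution, sample size and seed are arbitrary choices): the running averages should approach $\mu=1$.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 10**6
mu = 1.0

x = rng.exponential(scale=mu, size=n)          # i.i.d. with E[X_n] = mu = 1
running_mean = np.cumsum(x) / np.arange(1, n + 1)

for k in (10**2, 10**4, 10**6):
    print(f"n = {k:>8d}   S_n/n = {running_mean[k - 1]:.5f}")   # -> mu a.s.
```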

Faster rate of convergence

We can prove a faster rate of convergence under stronger assumptions.

Theorem 2.5.7. Suppose that $\{X_n\}$ are i.i.d. with $E[X_n]=0$ and $\sigma^2=E[X_n^2]<\infty$. Then for all $\epsilon>0$, $$\lim_{n\rightarrow\infty}\frac{S_n}{\sqrt n (\log n)^{1/2+\epsilon}}=0\quad a.s.$$

The sharpest estimate is obtained from Kolmogorov's test (Theorem 8.11.3), i.e. the law of the iterated logarithm: $$\limsup_{n\rightarrow\infty}\frac{S_n}{\sqrt n (\log \log n)^{1/2}}=\sigma\sqrt 2\quad a.s.$$

Theorem 2.5.8. Suppose that $\{X_n\}$ are i.i.d., $E[X_n]=0$ and $E[|X_n|^p]<\infty$ for some $1<p<2$. Then $\lim S_n/n^{1/p}=0$ a.s.
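The following rough NumPy sketch compares the two normalisations from Theorem 2.5.7 on one sample path of i.i.d. $\pm1$ steps (so $\sigma=1$); it is only suggestive, since a.s. limits and $\limsup$s cannot really be read off a single finite path. The exponent $1/2+\epsilon=0.6$, sample size and seed are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 10**6
x = rng.choice([-1.0, 1.0], size=n)    # sigma = 1
s = np.cumsum(x)

k = 10 ** np.arange(2, 7)              # checkpoints 10^2 .. 10^6
logn = np.log(k)
rate1 = s[k - 1] / (np.sqrt(k) * logn ** 0.6)      # Theorem 2.5.7 scaling: -> 0
rate2 = s[k - 1] / np.sqrt(2 * k * np.log(logn))   # LIL scaling: limsup = sigma = 1

for i, m in enumerate(k):
    print(f"n = {int(m):>8d}   S_n/(sqrt(n)(log n)^0.6) = {rate1[i]: .4f}"
          f"   S_n/sqrt(2n loglog n) = {rate2[i]: .4f}")
```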

Examples

Example 2.5.3. Suppose $\{X_n\}$ are independent and $\Pr\{X_n=\pm n^{-p}\}=1/2$. Then $S_n$ converges a.s. iff $p>1/2$. (Hint: use Theorem 2.5.4 and let $A=1$.)
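A sketch of the solution along the lines of the hint, assuming $p>0$: with $A=1$ we have $|X_n|=n^{-p}\le1$, so $Y_n=X_n$ and $$\sum_{n\ge1}\Pr\{|X_n|>1\}=0,\qquad \sum_{n\ge1}E[Y_n]=0,\qquad \sum_{n\ge1}Var(Y_n)=\sum_{n\ge1}n^{-2p},$$ where the last series converges iff $2p>1$. Hence, by Theorem 2.5.4, $S_n$ converges a.s. iff $p>1/2$. (For $p\le0$ the terms $X_n$ do not even tend to $0$, so $S_n$ cannot converge.)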

References

Rick Durrett. Probability: Theory and Examples, Edition 4.1.
