
Sequences and Limit Theorems



Limit Theorems

Laws of Large Numbers

Suppose $X_1, X_2, \ldots$ is a sequence of r.v.s. We are often interested in the sums $\sum_{i=1}^n X_i$ as $n$ becomes large. What can we say about such sums?

Suppose all the $X_i$ have the same mean $\mu$, $E[X_i] = \mu$, and are uncorrelated. We would expect the average $\frac{1}{n}\sum_{i=1}^n X_i$ to ``approach'' $\mu$ in some way as $n \rightarrow \infty$. If $\var(X_i) < \infty$, consider

\begin{displaymath}\frac{1}{n}\sum_{i=1}^n X_i - \mu.
\end{displaymath}

Let us look at m.s. convergence:

\begin{displaymath}\begin{aligned}
E\left[\left(\frac{1}{n}\sum_{i=1}^n X_i - \mu\right)^2\right]
&= \frac{1}{n^2}\sum_{i=1}^n \sum_{j=1}^n \cov(X_i,X_j) \\
&= \frac{1}{n^2} \sum_{i=1}^n \var(X_i),
\end{aligned}\end{displaymath}

since the cross terms $\cov(X_i,X_j)$, $i \neq j$, vanish for uncorrelated r.v.s.

Summarizing: if $E[X_i]= \mu$ and the $\{X_i\}$ are mutually uncorrelated with variances satisfying $\frac{1}{n^2}\sum_{i=1}^n \var(X_i)
\rightarrow 0$ (as happens, e.g., when the variances are uniformly bounded), then

\begin{displaymath}\frac{1}{n}\sum_{i=1}^n X_i \rightarrow \mu \text{ (m.s.)}
\Rightarrow \frac{1}{n}\sum_{i=1}^n X_i \rightarrow \mu
\text{ (i.p.)}.
\end{displaymath}

This is an example of a weak law of large numbers.
\begin{definition}
Suppose $\{X_i\}_{i=1}^\infty$ is a sequence of r.v.s and $\{a_i\}$ and
$\{b_n\}$ are sequences of constants with $b_n \rightarrow \infty$. A
{\bf weak law} of large numbers holds if
\begin{displaymath}\frac{1}{b_n}\sum_{i=1}^n (X_i - a_i) \rightarrow 0 \text{ (i.p.)}.
\end{displaymath}\end{definition}
In the example we just gave, $b_n = n$ and $a_i=\mu$ .
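As a concrete check of this weak law with $b_n = n$ and $a_i = \mu$, here is a minimal simulation sketch (Python with NumPy; the Uniform(0,1) distribution, sample size, and seed are illustrative assumptions, not part of the notes):

\begin{verbatim}
# Minimal sketch: running averages of uncorrelated (here i.i.d.)
# Uniform(0,1) samples concentrate around mu = 0.5 as n grows.
import numpy as np

rng = np.random.default_rng(0)
mu = 0.5
X = rng.uniform(0.0, 1.0, size=1_000_000)

# running_mean[n-1] = (1/n) * sum_{i=1}^n X_i
running_mean = np.cumsum(X) / np.arange(1, X.size + 1)

for n in (10, 1_000, 100_000, 1_000_000):
    err = abs(running_mean[n - 1] - mu)
    print(f"n = {n:>9,}: sample mean = {running_mean[n - 1]:.5f}, "
          f"|error| = {err:.5f}")
\end{verbatim}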


\begin{definition}
A {\bf strong law} of large numbers is the same as the preceding
definition, except that convergence is almost sure (a.s.).
\end{definition}

Kolmogorov's Strong Law


\begin{definition}
An infinite sequence of r.v.s is independent if every finite
subcollection of the r.v.s is independent.
\end{definition}

Theorem 1 (Kolmogorov's Strong Law)   Suppose $\{X_i\}_{i=1}^\infty$ is a sequence of independent r.v.s with finite means $\mu_i = E[X_i]$, and let $\{b_n\}$ be a nondecreasing sequence of positive constants with $b_n \rightarrow \infty$. If

\begin{displaymath}\sum_{i=1}^\infty \frac{\var(X_i)}{b_i^2} < \infty
\end{displaymath}

then

\begin{displaymath}\frac{1}{b_n} \sum_{i=1}^n X_i - a_n \rightarrow 0 \text{(a.s.)},
\end{displaymath}

where

\begin{displaymath}a_n = \frac{\sum_{i=1}^n \mu_i}{b_n}
\end{displaymath}


\begin{example}
If $b_n = n$ and $\mu_n = \mu$, then Kolmogorov's law implies: if
\begin{displaymath}\sum_{i=1}^\infty \frac{\var(X_i)}{i^2} < \infty,
\end{displaymath}
then
\begin{displaymath}\frac{1}{n}
\sum_{i=1}^n X_i \rightarrow \mu \text{ (a.s.)}.
\end{displaymath}\end{example}
Note that in the case that all the variances are uniformly bounded, say

\begin{displaymath}\var(X_i) < \sigma^2 < \infty \text{ for all $i$}
\end{displaymath}

then

\begin{displaymath}\sum_{i=1}^\infty \frac{\var(X_i)}{i^2} \leq \sigma^2
\sum_{i=1}^\infty \frac{1}{i^2} < \infty
\end{displaymath}

So the theorem can apply even when the variances grow, as long as they grow sublinearly: if $\var(X_i) = O(i^p)$ for some $p < 1$, then $\sum_{i=1}^\infty \var(X_i)/i^2$ still converges.
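To see this boundary numerically, here is a small sketch (illustrative assumption: $\var(X_i) = i^p$) comparing a sublinear power $p = 0.9$, for which $\sum_i \var(X_i)/i^2$ converges, against $p = 1.1$, for which it diverges:

\begin{verbatim}
# Minimal sketch: partial sums of var(X_i)/i^2 for var(X_i) = i**p.
# p = 0.9 (sublinear growth) converges; p = 1.1 keeps growing.
import numpy as np

i = np.arange(1, 1_000_001, dtype=float)
for p in (0.9, 1.1):
    partial = np.cumsum(i**p / i**2)
    print(f"p = {p}: {partial[99]:.2f} (n=1e2), "
          f"{partial[9_999]:.2f} (n=1e4), {partial[-1]:.2f} (n=1e6)")
\end{verbatim}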

We can get an even stronger conclusion:

Theorem 2 (Khinchine's Strong Law of Large Numbers)   Suppose $\{ X_i\}_{i=1}^\infty$ is an i.i.d. sequence (a sequence of independent, identically distributed r.v.s) with finite mean

\begin{displaymath}\vert E[X_i]\vert = \vert\mu\vert < \infty.
\end{displaymath}

Then the sample mean converges almost surely to the ensemble mean:

\begin{displaymath}\frac{1}{n} \sum_{i=1}^n X_i \rightarrow \mu \text{ (a.s.)}
\end{displaymath}
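Khinchine's law asks only for a finite mean, so it covers heavy-tailed cases where the variance-based arguments above fail. A minimal sketch (the Lomax/Pareto-II distribution with tail index $a = 1.5$, which has mean $1/(a-1) = 2$ but infinite variance, is an illustrative choice):

\begin{verbatim}
# Minimal sketch: sample means of i.i.d. Lomax(a = 1.5) draws.
# E[X] = 1/(a - 1) = 2 is finite, but var(X) is infinite, so only
# the finite-mean hypothesis of Khinchine's law is available.
import numpy as np

rng = np.random.default_rng(1)
a, mu = 1.5, 2.0
X = rng.pareto(a, size=2_000_000)   # NumPy's pareto() samples Lomax(a)
running_mean = np.cumsum(X) / np.arange(1, X.size + 1)
for n in (100, 10_000, 2_000_000):
    print(f"n = {n:>9,}: sample mean = {running_mean[n - 1]:.4f} "
          f"(mu = {mu})")
\end{verbatim}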

Proving these types of theorems

The proofs follow from more general limit theorems.
\begin{definition}
Let $\{A_n\}_{n=1}^\infty$ be a sequence of events. The {\bf limit
superior} of the sequence is
\begin{displaymath}\limsup_n A_n = \cap_{n=1}^\infty \cup_{k=n}^\infty A_k,
\end{displaymath}
the set of all points that are in $\{A_n\}$ {\bf infinitely often}.
\end{definition}
So $\omega \in \limsup_n A_n \Leftrightarrow \omega$ is in infinitely many of the sets $A_n$ . (It keeps coming back.)

Another notation is $\limsup_n A_n = \{A_n \text{ i.o.}\}$, where ``i.o.'' stands for infinitely often.

We observe that if $A_n \uparrow A$ or $A_n \downarrow A$ then $A_n
\text{ (i.o.)} = A$ .

Lemma 1 (The Borel–Cantelli lemma)   [This is frequently a good problem for math qualifiers.]
  1. If $\sum_{n=1}^\infty P(A_n) < \infty$ then $P(A_n \text{
(i.o.)}) = 0$. In particular, with probability 1 only finitely many of the $A_n$ occur.
  2. (Conversely) If $\{A_n\}_{n=1}^\infty$ are independent events and $\sum_{n=1}^\infty P(A_n) = \infty$ then $P(A_n \text{ (i.o.)}) = 1$.


\begin{proof}
\begin{enumerate}
\item Since $A_n \text{ (i.o.)} = \cap_{n=1}^\infty \cup_{k=n}^\infty A_k
\subseteq \cup_{k=n}^\infty A_k$ for every $n$,
\begin{displaymath}P(A_n \text{ (i.o.)}) \leq P\left(\cup_{k=n}^\infty A_k\right)
\leq \sum_{k=n}^\infty P(A_k) \rightarrow 0 \text{ as } n \rightarrow \infty,
\end{displaymath}
because the tail of a convergent series goes to zero. Hence
$P(A_n \text{ (i.o.)}) = 0$.
\item By independence and $1 - x \leq e^{-x}$, for $m > n$,
\begin{displaymath}P\left(\cap_{k=n}^m A_k^c\right) = \prod_{k=n}^m (1 - P(A_k))
\leq \exp\left(-\sum_{k=n}^m P(A_k)\right) \rightarrow 0 \text{ as } m \rightarrow \infty,
\end{displaymath}
since the series diverges. Thus $P(\cup_{k=n}^\infty A_k) = 1$ for every $n$,
and intersecting over $n$ gives $P(A_n \text{ (i.o.)}) = 1$.
\end{enumerate}\end{proof}
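Both halves of the lemma are easy to watch in simulation. A minimal sketch (independent events $A_n = \{U_n \leq p_n\}$ with $U_n \sim$ Uniform(0,1); the choices $p_n = 1/n^2$ and $p_n = 1/n$ are illustrative):

\begin{verbatim}
# Minimal sketch: independent events A_n = {U_n <= p_n}.
# sum 1/n^2 < infinity  => only finitely many A_n occur (part 1);
# sum 1/n   = infinity  => A_n occurs infinitely often   (part 2).
import numpy as np

rng = np.random.default_rng(2)
N = 1_000_000
n = np.arange(1, N + 1)
U = rng.uniform(size=N)

for label, p in (("1/n^2", 1.0 / n**2), ("1/n", 1.0 / n)):
    hits = np.nonzero(U <= p)[0] + 1   # indices n at which A_n occurs
    last = hits[-1] if hits.size else None
    print(f"p_n = {label}: {hits.size} occurrences in [1, {N:,}], "
          f"last at n = {last}")
\end{verbatim}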

Kolmogorov's Inequality

Suppose $X_1, X_2, \ldots$ are independent with zero means and finite variances. Define $S_n$ to be the running sum

\begin{displaymath}S_n = \sum_{k=1}^n X_k
\end{displaymath}

Then for each $\alpha > 0$ ,

\begin{displaymath}P(\max_{1 \leq k \leq n} \vert S_k\vert \geq \alpha) \leq \frac{1}{\alpha^2}
\var(S_n).
\end{displaymath}

This is a lot like the Chebyshev inequality, but instead of bounding only the last partial sum $\vert S_n\vert$, it bounds the maximum over all of the partial sums, while paying only the variance of the last one.
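A Monte Carlo check of the inequality (a minimal sketch; standard normal increments, $n = 50$, and $\alpha = 10$ are illustrative choices, so that $\var(S_n) = n$ and the bound is $n/\alpha^2 = 0.5$):

\begin{verbatim}
# Minimal sketch: estimate P(max_k |S_k| >= alpha) for
# S_k = X_1 + ... + X_k with X_i ~ N(0,1), and compare it to the
# Kolmogorov bound var(S_n)/alpha^2.
import numpy as np

rng = np.random.default_rng(3)
n, trials, alpha = 50, 100_000, 10.0

X = rng.standard_normal((trials, n))
S = np.cumsum(X, axis=1)             # partial sums S_1, ..., S_n per trial
max_abs = np.abs(S).max(axis=1)      # max_{1 <= k <= n} |S_k| per trial

empirical = np.mean(max_abs >= alpha)
bound = n / alpha**2                 # var(S_n)/alpha^2 = 0.5
print(f"empirical P = {empirical:.4f}  <=  bound {bound:.4f}")
\end{verbatim}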

Copyright 2008, by the Contributing Authors. Sequences and Limit Theorems, USU OpenCourseWare: http://ocw.usu.edu/Electrical_and_Computer_Engineering/Stochastic_Processes/lec5_3.html. This work is licensed under a Creative Commons License.