\(\newcommand \e \epsilon \newcommand \ov \overline\)
Law of Large Numbers
Weak Law of Large Numbers
Let \(X_1,X_2,\cdots\) be i.i.d. with
\[xP(|X_i| > x) \to 0 \quad \text{as } x \to \infty. \]Let \(S_n = X_1 + \cdots + X_n\) and let \(\mu_n = E(X_11_{|X_1| \le n})\). Then \(S_n / n - \mu_n \to 0\) in probability.
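To see why the tail condition matters: for the standard Cauchy distribution \(xP(|X_i| > x) \to 2/\pi \ne 0\), and sample means never settle down. A quick simulation sketch (NumPy; the seed and sample sizes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Standard Cauchy: x * P(|X| > x) -> 2/pi, so the tail condition fails,
# and the running means keep jumping instead of settling down.
x = rng.standard_cauchy(10**6)
running_mean = np.cumsum(x) / np.arange(1, x.size + 1)
for n in [10**3, 10**4, 10**5, 10**6]:
    print(n, running_mean[n - 1])
```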
Strong Law of Large Numbers
Let \(X_1,X_2,\cdots\) be pairwise independent, identically distributed random variables with \(E|X_i| < \infty\). Let \(EX_i = \mu\) and \(S_n = X_1 + \cdots + X_n\). Then \(S_n /n \to \mu\) a.s. as \(n \to \infty\).
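A minimal simulation of the strong law (the Exponential(1) distribution and the sample sizes are illustrative choices, not part of the theorem):

```python
import numpy as np

rng = np.random.default_rng(1)

# Exponential(1) has E|X_i| < infinity and mu = 1, so S_n / n -> 1 a.s.
x = rng.exponential(scale=1.0, size=10**6)
s = np.cumsum(x)
for n in [10**2, 10**4, 10**6]:
    print(n, s[n - 1] / n)  # drifts toward mu = 1
```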
You can see the differences between the Weak and Strong Laws:
Law | Independence | Limit | Moment condition | Mode |
---|---|---|---|---|
Weak | i.i.d. | to \(\mu_n\) | none (only \(xP(\lvert X_i \rvert > x) \to 0\)) | in probability |
Strong | pairwise i.i.d. | to \(\mu\) | \(E\lvert X_i \rvert < \infty\) | a.s. |
Proof of the Weak Law of Large Numbers
Definition: Uncorrelated
\(E(X_iX_j) = EX_iEX_j\) whenever \(i \ne j\).
Lemma: Let \(X_1 ,\cdots\) have \(E(X_i^2) < \infty\) and be uncorrelated. Then
\[var(X_1+\cdots + X_n) = var(X_1) + \cdots + var(X_n). \]Proof:
\[\begin{aligned} &var(X_1 + \cdots + X_n)\\ =&E\Big(\big(\sum_i (X_i - \mu_i)\big)^2\Big)\\ =&\sum_i E(X_i - \mu_i)^2 + \sum_{i\ne j}E\big((X_i-\mu_i)(X_j-\mu_j)\big)\\ =&\sum_i E(X_i - \mu_i)^2 + \sum_{i\ne j}\big(E(X_iX_j)-\mu_j EX_i-\mu_iEX_j + \mu_i \mu_j\big)\\ =&\sum_i E(X_i - \mu_i)^2, \end{aligned} \]since \(E(X_iX_j) = \mu_i\mu_j\) for \(i \ne j\) by uncorrelatedness, so every cross term vanishes.
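A quick numerical sanity check of the lemma (independent, hence uncorrelated, samples; the three distributions are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)

# Three independent (hence uncorrelated) samples with different variances.
x1 = rng.normal(0.0, 1.0, size=10**6)
x2 = rng.uniform(-1.0, 1.0, size=10**6)
x3 = rng.exponential(2.0, size=10**6)

lhs = np.var(x1 + x2 + x3)                  # var(X1 + X2 + X3)
rhs = np.var(x1) + np.var(x2) + np.var(x3)  # var(X1) + var(X2) + var(X3)
print(lhs, rhs)  # agree up to Monte Carlo error
```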
Weak Law for triangular arrays
For each \(n\) let \(X_{n,k}\), \(1 \le k \le n\), be independent. Let \(b_n > 0\) with \(b_n \to \infty\), and let \(\overline X_{n,k} = X_{n,k}1_{|X_{n,k}| \le b_n}\). Suppose that, as \(n \to \infty\),
- \(\sum_{k=1}^n P(|X_{n,k}| > b_n) \to 0\)
- \(b_n^{-2}\sum_{k =1}^n E\overline X^2_{n,k} \to 0\)
Let \(S_n = X_{n,1} + \cdots + X_{n,n}\) and put \(a_n = \sum_{k=1}^n E\overline X_{n,k}\); then
\[(S_n - a_n) / b_n \to 0 \quad \text{in probability} \]
Proof:
\(P\big(\big|\frac{S_n - a_n}{b_n}\big| > \epsilon\big) \le P(S_n \ne \overline S_n) + P\big(\big|\frac{\overline S_n - a_n}{b_n}\big| > \epsilon\big)\), where \(\overline S_n = \overline X_{n,1} + \cdots + \overline X_{n,n}\).
\(P(S_n \ne \overline S_n) \le \sum_{k=1}^n P(|X_{n,k}| > b_n) \to 0\).
\[\begin{aligned} &P\Big(\Big|\frac{\overline S_n - a_n}{b_n}\Big| > \epsilon\Big)\\ =&P(|\ov S_n - a_n| > \e b_n)\\ \le &(\e b_n)^{-2}E(\ov S_n - a_n)^2\\ = &(\e b_n)^{-2}\sum_k var(\ov X_{n,k})\\ \le &(\e b_n)^{-2}\sum_k E(\ov X^2_{n,k})\\ \to& 0, \end{aligned} \]using Chebyshev's inequality, since \(a_n = E\ov S_n\) and the \(\ov X_{n,k}\) are independent.
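A simulation sketch of the theorem in the i.i.d. case \(b_n = n\), \(X_{n,k} = X_k\), using a Pareto law with tail \(P(X > x) = x^{-1.5}\): the variance is infinite, yet both conditions hold and \(S_n/n\) still settles near the mean (all parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# Pareto tail P(X > x) = x^(-1.5) on [1, inf): infinite variance but
# finite mean mu = 1.5 / 0.5 = 3, and n * P(X > n) = n^(-0.5) -> 0,
# so with b_n = n the theorem still yields S_n / n -> mu in probability.
a = 1.5
x = rng.uniform(size=10**6) ** (-1.0 / a)  # inverse-CDF sampling
s = np.cumsum(x)
for n in [10**3, 10**5, 10**6]:
    print(n, s[n - 1] / n)  # drifts toward mu = 3 despite infinite variance
```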
Proof of the Weak Law
Apply the triangular array theorem with \(b_n = n\) and \(X_{n,k} = X_k\):
- \(\sum_{k=1}^nP(|X_k| > n)=nP(|X_1| > n) \to 0\) by hypothesis.
- \(n^{-2}\sum_{k=1}^n E\ov {X_k}^2 = n^{-1}E\ov X_1^2\).
The second condition takes more work; we need a lemma:
Lemma: If \(Y \ge 0\) and \(p > 0\), then \(E(Y^{p})=\int_0^{\infty}py^{p-1}P(Y > y) d y\).
\[\begin{aligned} E(Y^p) &= \int_\Omega Y^p \,dP \\ &=\int_\Omega \int_0^Y py^{p-1}\,dy\, dP\\ &=\int_0^\infty py^{p-1}\int_\Omega 1_{\{Y \ge y\}}\,dP\, dy \quad \text{(Fubini)}\\ &=\int_0^\infty py^{p-1}P(Y \ge y)\, dy. \end{aligned} \]Note: replacing \(P(Y \ge y)\) by \(P(Y > y)\) changes nothing, since they differ only on \(\{Y = y\}\) and \(\int_0^\infty py^{p-1}P(Y = y)\,dy = 0\): for a continuous distribution \(P(Y = y) = 0\) for every \(y\), and for a discrete one the atoms form a Lebesgue-null set.
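A quick Monte Carlo check of the lemma, taking \(p = 2\) and \(Y \sim\) Exponential(1) (illustrative choices; both numbers should come out near \(\Gamma(3) = 2\)):

```python
import numpy as np

rng = np.random.default_rng(4)

# Check E[Y^p] = int_0^inf p y^(p-1) P(Y > y) dy for Y ~ Exp(1), p = 2.
p = 2.0
y = rng.exponential(size=10**6)
direct = np.mean(y**p)               # Monte Carlo estimate of E[Y^p]

grid = np.linspace(0.0, 20.0, 4001)  # truncate the integral at y = 20
tail = np.exp(-grid)                 # P(Y > y) = e^(-y) for Exp(1)
integral = np.sum(p * grid ** (p - 1) * tail) * (grid[1] - grid[0])
print(direct, integral)              # both close to Gamma(3) = 2
```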
Hence, applying the lemma with \(p = 2\) to \(Y = |\ov X_1|\), \(n^{-1}E\ov X_1^2 = 2n^{-1}\int _0^n yP(|\ov X_1| > y)\, dy\le 2n^{-1}\int_0^n yP(|X_1| > y)\, dy\).
Since \(yP(|X_1| > y) \to 0\), for any \(\e > 0\) there exists \(N\) such that \(yP(|X_1| > y) <\e\) whenever \(y > N\). Then \(2n^{-1}\int_0^n yP(|X_1| > y)\, dy= 2n^{-1}\Big(\int_0^NyP(|X_1| > y)\,dy + \int_N^nyP(|X_1| > y)\,dy\Big) \le 2n^{-1}C_N + 2\e\), where \(C_N = \int_0^N yP(|X_1| > y)\,dy < \infty\); letting \(n \to \infty\) and then \(\e \downarrow 0\) gives the limit \(0\).
So the hypotheses of the triangular array theorem hold with \(a_n = nE[X_11_{|X_1|\le n}] = n\mu_n\), and \((S_n - n\mu_n)/n = S_n/n - \mu_n \to 0\) in probability.
Remark: by the lemma, \(xP(|X_1| > x) \to 0\) implies \(E|X_1|^{1-\e} < \infty\) for every \(\e \in (0,1)\) (compare the tail \(P(|X_1| > x)\) with \(x^{-1}\)).
Proof of the Strong Law of Large Numbers
This one is long and hard...
Steps:
- Borel-Cantelli Lemma
- Kronecker's Lemma
- Kolmogorov's maximal inequality
- Khintchine-Kolmogorov convergence theorem
Borel-Cantelli Lemma
Notation:
\(\limsup A_n = \bigcap_{m=1}^{\infty}\bigcup_{n = m}^{\infty}A_n = \{\omega \text{ that are in infinitely many }A_n\}\)
\(\liminf A_n = \bigcup_{m=1}^{\infty} \bigcap _{n=m}^\infty A_n = \{\omega \text{ that are in all but finitely many }A_n\}\)
Borel-Cantelli lemma
If \(\sum_{n=1}^\infty P(A_n) < \infty\), then \(P(A_n\ \text{i.o.}) = P(\limsup A_n) = 0\).
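A simulation sketch: independent \(A_n\) with \(P(A_n) = n^{-2}\), so \(\sum P(A_n) = \pi^2/6 < \infty\); along a sample path only a few early \(A_n\) occur and then no more:

```python
import numpy as np

rng = np.random.default_rng(5)

# Independent A_n with P(A_n) = n^(-2); sum P(A_n) = pi^2/6 < infinity.
n = np.arange(1, 10**6 + 1)
occurs = rng.uniform(size=n.size) < 1.0 / n**2
hits = n[occurs]
print(len(hits), hits[:10])  # a handful of small indices, then nothing
```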
Kronecker's Lemma
If \(a_n \uparrow \infty\) and \(\sum_{n=1}^{\infty} x_n /a_n\) converges, then \(a^{-1}_n \sum_{m=1}^n x_m \to 0\).
Proof: set \(b_0 = 0\) and \(b_n = \sum_{m=1}^n x_m/a_m\), so \(b_n \to b_\infty\). Summation by parts gives
\[a_n^{-1}\sum_{m=1}^n x_m = a_n^{-1}\sum_{m=1}^n a_m(b_m - b_{m-1}) = b_n - a_n^{-1}\sum_{m=1}^{n-1}(a_{m+1}-a_m)b_m. \]The subtracted term is a weighted average of \(b_1,\cdots,b_{n-1}\) with nonnegative weights summing to \((a_n - a_1)/a_n \to 1\); since \(b_m \to b_\infty\), this weighted average also tends to \(b_\infty\), so the difference tends to \(0\).
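A small numeric check of the lemma with \(a_n = n\) and \(x_n = (-1)^n n/\log(n+1)\) (an arbitrary choice for which \(\sum x_n/a_n\) converges by the alternating series test, even though \(x_n\) itself grows):

```python
import math

# a_n = n, x_n = (-1)^n * n / log(n + 1): sum x_n / a_n converges by the
# alternating series test, so the lemma predicts (x_1 + ... + x_n)/n -> 0.
total = 0.0
for n in range(1, 10**6 + 1):
    total += (-1) ** n * n / math.log(n + 1)
    if n in (10**2, 10**4, 10**6):
        print(n, total / n)  # decays toward 0 (roughly like 1 / log n)
```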
Kolmogorov's Maximal inequality
Suppose \(X_1,\cdots, X_n\) are independent with \(EX_i = 0\) and \(var(X_i) < \infty\), and let \(S_n = X_1 + \cdots + X_n\). Then
\(P(\max_{1 \le k \le n}|S_k| \ge x) \le x^{-2}var(S_n)\).
Remark: Chebyshev's inequality gives \(P(|S_n|\ge x)\le x^{-2}var(S_n)\).
Proof: let \(A_k = \{|S_k| \ge x \text{ but } |S_j| < x \text{ for } j < k\}\), i.e. \(k\) is the first index at which \(|S_\cdot|\) reaches \(x\). The \(A_k\) are disjoint, their union is \(\{\max_{1 \le k \le n}|S_k| \ge x\}\), and \((S_n-S_k)^2 \ge 0\), so
\[ES^2_n \ge \sum_{k=1}^n\int_{A_k}S_n^2 \,dP = \sum_{k=1}^n \int_{A_k} \big(S_k^2 + 2S_k(S_n-S_k) +(S_n-S_k)^2\big)\, dP\\ \ge\sum_{k=1}^n \int_{A_k} S_k^2 \,dP+ 2\sum_{k=1}^n \int_{A_k} S_k(S_n-S_k)\, dP. \]Notice that \(S_k1_{A_k} \in \sigma(X_1,\cdots,X_k)\) and \(S_n-S_k \in\sigma(X_{k+1},\cdots, X_n)\) are independent, and \(E(S_n-S_k) = 0\), so \(\int_{A_k} S_k(S_n-S_k)\, dP = E(S_k1_{A_k})E(S_n-S_k) = 0\) and the cross sum vanishes.
Hence \(ES_n^2 \ge \sum_{k=1}^n \int_{A_k}S_k^2 \,dP\ge\sum_{k=1}^nx^2P(A_k)=x^2P(\max_{1\le k\le n}|S_k|\ge x)\), since \(|S_k| \ge x\) on \(A_k\); dividing by \(x^2\) and using \(var(S_n) = ES_n^2\) finishes the proof.
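A Monte Carlo check of the inequality (normal increments, \(n = 100\), \(x = 15\); all parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)

# Compare P(max_k |S_k| >= x) with x^(-2) * var(S_n) for a normal walk.
n, x, trials = 100, 15.0, 10**5
steps = rng.normal(size=(trials, n))  # EX_i = 0, var(X_i) = 1
walks = np.cumsum(steps, axis=1)
lhs = np.mean(np.max(np.abs(walks), axis=1) >= x)
rhs = n / x**2                        # var(S_n) = n = 100
print(lhs, "<=", rhs)                 # empirical probability below the bound
```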
Khintchine-Kolmogorov convergence theorem
Let \(X_1,\cdots\) be independent with \(EX_n = 0\). If \(\sum_{n=1}^\infty var(X_n) < \infty\), then almost surely the partial sums \(\sum_{i=1}^n X_i(\omega)\) converge to a finite limit.
Proof:
Let \(S_N = \sum_{n=1}^N X_n\). Fix \(M\) and apply Kolmogorov's maximal inequality to \(X_{M+1},\cdots,X_N\): as \(N \to \infty\),
\[\begin{aligned} &P(\max_{M \le m \le N}|S_m-S_M| \ge \e)\\ \le& \e^{-2}var(S_N-S_M)\\ =&\e^{-2}\sum_{n = M+1}^Nvar(X_n)\\ \to &\e^{-2}\sum_{n=M+1}^{\infty}var(X_n). \end{aligned} \]Hence \(P(\sup_{m \ge M}|S_m-S_M| \ge \e) \le \e^{-2}\sum_{n=M+1}^{\infty}var(X_n) \to 0\) as \(M \to \infty\).
Let \(W_M = \sup_{n,m \ge M}|S_m-S_n|\)
\(P(W_M \ge 2\e) \le P(\sup_{m \ge M} |S_m-S_M|\ge \e)\to 0\).
Since \(W_M\) is nonincreasing in \(M\), it has a limit; let \(A_M = \{W_M > 2\e\}\), so \(A_{M+1} \subseteq A_M\) and \(P(\limsup A_M) = \lim_M P(A_M) = 0\). As \(\e > 0\) was arbitrary, \(W_M \downarrow 0\) a.s.
So almost surely \(S_m(\omega)\) is a Cauchy sequence, and therefore converges to a limit.
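A simulation sketch: independent \(X_n \sim N(0, n^{-2})\), so \(\sum var(X_n) < \infty\) and the partial sums stabilize along each path (the distribution and sizes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(7)

# X_n ~ N(0, 1/n^2): sum var(X_n) = pi^2/6 < infinity, so sum X_n
# converges a.s.; later partial sums change very little along a path.
n = np.arange(1, 10**6 + 1)
x = rng.normal(scale=1.0 / n)
s = np.cumsum(x)
print(s[10**2 - 1], s[10**4 - 1], s[10**6 - 1])
```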
Proof of the SLLN
Suppose without loss of generality that \(EX_i = 0\); otherwise replace \(X_i\) by \(X_i - \mu\).
Decompose \(X_i\) as \((X_iI_{|X_i|\le i} - E[X_iI_{|X_i|\le i}])+(X_iI_{|X_i|> i} - E[X_iI_{|X_i|> i}])\); the two subtracted expectations sum to \(EX_i = 0\), so the decomposition is exact.
Let \(Y_i = X_iI_{|X_i|\le i} - E[X_iI_{|X_i|\le i}]\) and \(Z_i = X_iI_{|X_i|> i} - E[X_iI_{|X_i|> i}]\).
\(\frac1n \sum_{i=1}^n Y_i \to 0\) a.s.
\[\begin{aligned} \sum_{n=1}^{\infty}\frac{var(Y_n)}{n^2} \le & \sum_{n=1}^{\infty}\frac1{n^2} E[X_1^2I_{|X_1|\le n}]\\ =&E\Big[X_1^2\sum_{n=1}^\infty\frac1{n^2}I_{|X_1|\le n}\Big]\\ \le &E\Big[X_1^2\sum_{n\ge |X_1|\vee 1}\frac2{n(n+1)}\Big]\\ \le &2E\Big[X_1^2\Big(\frac1{|X_1|}\wedge 1\Big)\Big]\\ \le & 2E[|X_1| + 1]\\ <&\infty, \end{aligned} \]so \(\sum_{i=1}^{\infty}\frac{Y_i}{i}\) converges a.s. by the Khintchine-Kolmogorov convergence theorem.
By Kronecker's lemma (with \(a_n = n\)), \(\frac1n\sum_{i=1}^n Y_i\to 0\) a.s.
Next we show \(\frac 1n\sum_{i=1}^n Z_i \to 0\) a.s.
For the centering terms, \(E[X_i I_{|X_i| >i}] = E[X_1 I_{|X_1| > i}] \to 0\) by dominated convergence (as \(E|X_1| < \infty\)), so the Cesàro averages \(\frac1n \sum_{i=1}^n E[X_i I_{|X_i| > i}] \to 0\).
It remains to show that \(\frac1n \sum_{i=1}^n X_i I_{|X_i| > i} \to 0\) a.s.
\(\sum_{i=1}^{\infty}P(|X_i| > i)= \sum_{i=1}^{\infty}P(|X_1| > i)\le \int_0^{\infty}P(|X_1| > t)\,d t = E|X_1|<\infty\).
By the Borel-Cantelli Lemma, \(P(|X_i| > i\ \ \text{i.o.}) = 0\); so for a.e. \(\omega\), \(N(\omega) = \max\{i : |X_i(\omega)| > i\}\) is finite.
\(\Big|\frac1n \sum_{i=1}^nX_iI_{|X_i|>i}\Big|(\omega)\le \frac1n \sum_{i=1}^{N(\omega)}|X_i(\omega)|\to 0\).
This finishes the proof.
The rate of convergence
Let \(X_1,X_2,\cdots\) be i.i.d. random variables with \(EX_i = 0\) and \(EX_i^2 = \sigma^2 < \infty\). Let \(S_n = X_1 + \cdots + X_n\). If \(\e > 0\) then
\[S_n / \big(n^{1/2}(\log n)^{1/2 + \e}\big) \to 0 \quad \text{a.s.} \]In other words, \(\frac{\sqrt n}{(\log n)^{\frac 12+ \e}}(S_n/n-\mu)\to 0\) a.s. (here \(\mu = 0\); for a general mean, center \(X_i\) at \(\mu\)).
Remark: the best possible rate is the law of the iterated logarithm, \(\limsup_n S_n/\big(n^{1/2}(\log \log n)^{1/2}\big) =\sigma \sqrt 2\) a.s.
Proof sketch: let \(a_n = n^{1/2}(\log n)^{1/2+\e}\). Then \(\sum_{n\ge 2} var(X_n/a_n) = \sigma^2\sum_{n\ge 2}\frac{1}{n(\log n)^{1+2\e}} < \infty\), so by the Khintchine-Kolmogorov convergence theorem \(\sum_n X_n/a_n\) converges a.s., and Kronecker's lemma gives \(a_n^{-1}S_n \to 0\) a.s.
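A simulation sketch of the rate (standard normal steps, \(\e = 0.1\); the sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(8)

# S_n / (n^(1/2) * (log n)^(1/2 + eps)) should tend to 0 along a path.
eps = 0.1
x = rng.normal(size=10**7)
s = np.cumsum(x)
for n in [10**3, 10**5, 10**7]:
    print(n, s[n - 1] / (n**0.5 * np.log(n) ** (0.5 + eps)))
```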