Definition (Convergence in probability)
A sequence of r.v.'s $Y_1, Y_2, \ldots$ converges in probability to a r.v. $Y$ (written $Y_n \overset{p}{\to} Y$) if for any $\epsilon > 0$,
$$ P(|Y_n-Y|>\epsilon) \rightarrow 0 \Leftrightarrow P(|Y_n-Y| \leq \epsilon)\rightarrow1 \text{ as } n \rightarrow \infty $$
Theorem (WLLN)
If $X_1, X_2, \ldots$ are iid with mean $\mu = E(X_i)$ and finite variance $\sigma^2$, then
$$ \bar X_n = \frac{1}{n} \sum^n_{i=1} X_i \overset{p}{\to} \mu $$
Proof: apply Chebyshev's inequality to $\bar X_n$, which has mean $\mu$ and variance $\sigma^2/n$.
Chebyshev's inequality
For any r.v. $X$ with mean $\mu$ and standard deviation $\sigma$, and any $k > 0$,
$$ P(|X-\mu| \ge k\sigma) \le \frac{1}{k^2} $$
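A minimal Monte Carlo sketch of the WLLN and the Chebyshev bound. It assumes $X_i \sim \text{Uniform}(0,1)$ (an arbitrary illustrative choice, so $\mu = 0.5$ and $\sigma^2 = 1/12$); the function name and constants are mine, not from the notes:

```python
import random

def tail_prob(n, eps=0.05, reps=2000, seed=42):
    """Estimate P(|X_bar_n - mu| > eps) by simulation, X_i ~ Uniform(0,1)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        xbar = sum(rng.random() for _ in range(n)) / n
        if abs(xbar - 0.5) > eps:
            hits += 1
    return hits / reps

# WLLN: the tail probability shrinks toward 0 as n grows.
# Chebyshev bound for the mean: Var(X_bar_n) = (1/12)/n, so
#   P(|X_bar_n - 0.5| > eps) <= (1/12) / (n * eps**2),
# which is exactly the quantity the proof above controls.
```

Increasing `n` from 10 to 1000 should drive the estimated tail probability from well above zero to essentially zero, and every estimate should stay under the Chebyshev bound (up to simulation noise).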
Definition (Convergence in distribution)
A sequence of r.v.'s $Y_1, Y_2, \ldots$ with cdfs $F_n(y)$ converges in distribution to a r.v. $Y$ with cdf $F(y)$ (written $Y_n \overset{D}{\to} Y$) if $\lim_{n\rightarrow\infty} F_n(y)=F(y)$ at every $y$ at which $F$ is continuous.
Theorem (CLT)
If $X_1, X_2, \ldots$ are iid with mean $\mu=E(X_i)$ and standard deviation $\sigma=\sqrt{Var(X_i)} < \infty$, then
$$ Z_n=\frac{\bar X_n-\mu}{\sigma/\sqrt n} \overset{D}{\to} N(0,1), $$
meaning that the cdf of $Z_n$ converges to the cdf of $N(0,1)$.
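A simulation sketch of the CLT, again assuming $X_i \sim \text{Uniform}(0,1)$ (illustrative; the helper names are mine). The empirical cdf of the standardized means should track $\Phi$, the $N(0,1)$ cdf:

```python
import bisect
import math
import random

MU, SIGMA = 0.5, math.sqrt(1 / 12)  # mean and sd of Uniform(0,1)

def standardized_mean(n, rng):
    """One draw of Z_n = (X_bar_n - mu) / (sigma / sqrt(n))."""
    xbar = sum(rng.random() for _ in range(n)) / n
    return (xbar - MU) / (SIGMA / math.sqrt(n))

rng = random.Random(0)
zs = sorted(standardized_mean(30, rng) for _ in range(5000))

def empirical_cdf(z):
    """Fraction of simulated Z_n values <= z; should approximate Phi(z)."""
    return bisect.bisect_right(zs, z) / len(zs)
```

With $n = 30$, `empirical_cdf(0.0)` should be near $\Phi(0)=0.5$ and `empirical_cdf(1.96)` near $\Phi(1.96)\approx 0.975$.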
Theorem (Continuous mapping)
If $Y_n \overset{D}{\to} Y$, then $h(Y_n) \overset{D}{\to}h(Y)$ for any continuous function $h$.
Slutsky's Theorem
If $U_n \overset{D}{\to} U$ and $W_n \overset{p}{\to} 1$, then $U_n/W_n \overset{D}{\to} U$.
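A standard application of Slutsky's theorem, sketched in simulation (Uniform(0,1) data is my illustrative choice): the studentized mean $T_n = (\bar X_n - \mu)/(S_n/\sqrt n)$ replaces $\sigma$ with the sample sd $S_n$. Since $S_n/\sigma \overset{p}{\to} 1$ and $Z_n \overset{D}{\to} N(0,1)$, writing $T_n = Z_n / (S_n/\sigma)$ and applying the theorem gives $T_n \overset{D}{\to} N(0,1)$:

```python
import math
import random

def t_statistic(n, rng):
    """Studentized mean for a Uniform(0,1) sample (mu = 0.5)."""
    xs = [rng.random() for _ in range(n)]
    xbar = sum(xs) / n
    s = math.sqrt(sum((x - xbar) ** 2 for x in xs) / (n - 1))
    return (xbar - 0.5) / (s / math.sqrt(n))

rng = random.Random(1)
ts = [t_statistic(50, rng) for _ in range(4000)]
# If T_n is near-N(0,1), about 95% of draws land in [-1.96, 1.96].
frac_in_interval = sum(abs(t) <= 1.96 for t in ts) / len(ts)
```

This is why normal-theory confidence intervals remain valid when $\sigma$ is estimated.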
★ Theorem (Delta method)
Suppose that $\sqrt n (Y_n-\theta) \overset{D}{\to} N(0, \sigma^2)$
If $g$ is differentiable at $\theta$ with $g'(\theta) \neq 0$, then
$$ \sqrt n (g(Y_n)-g(\theta)) \overset{D}{\to} N(0, \sigma^2[g'(\theta)]^2) $$
Proof: first-order Taylor expansion of $g$ about $\theta$.
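The Taylor-expansion step can be sketched in one line (an outline, not a full proof): expanding $g$ about $\theta$,
$$ g(Y_n) = g(\theta) + g'(\theta)(Y_n-\theta) + o_p(|Y_n-\theta|), $$
so
$$ \sqrt n\,(g(Y_n)-g(\theta)) = g'(\theta)\,\sqrt n\,(Y_n-\theta) + o_p(1) \overset{D}{\to} g'(\theta)\,N(0,\sigma^2) = N(0, \sigma^2[g'(\theta)]^2), $$
where the remainder is handled by Slutsky's theorem.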
Theorem (Limiting MGF)
Suppose $Y_n$ has mgf $M(t;n)$. If $\lim_{n \rightarrow \infty} M(t;n) = M(t)$ for all $t$ in an open interval containing $0$, where $M(t)$ is the mgf of a distribution with cdf $F(y)$, then $Y_n$ has limiting distribution with cdf $F(y)$.
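A standard illustration (my example, not from the notes): the Poisson limit of the binomial. If $Y_n \sim \text{Binomial}(n, \lambda/n)$, then
$$ M(t;n) = \left(1 - \tfrac{\lambda}{n} + \tfrac{\lambda}{n}e^t\right)^n = \left(1 + \frac{\lambda(e^t-1)}{n}\right)^n \rightarrow e^{\lambda(e^t-1)}, $$
which is the mgf of $\text{Poisson}(\lambda)$, so $Y_n \overset{D}{\to} \text{Poisson}(\lambda)$.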