MVUE

Definition

$\hat \theta$ is a minimum variance unbiased estimator (MVUE) of $\theta$ if $E_\theta(\hat\theta)=\theta$ for all $\theta$ and, for every other unbiased estimator $\tilde\theta$ of $\theta$, $V_\theta(\hat\theta)\leq V_\theta(\tilde\theta)$ for all $\theta$

If an MVUE exists, then it is unique

Cramér-Rao Method

Definition of Fisher information

$$ I(\theta)=E_\theta\left( \left(\frac{\partial}{\partial \theta} \log f(X|\theta)\right)^2 \right) $$
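
For example, for a single Bernoulli($p$) observation the definition gives

$$ \frac{\partial}{\partial p}\log f(x|p)=\frac{\partial}{\partial p}\bigl(x\log p+(1-x)\log(1-p)\bigr)=\frac{x-p}{p(1-p)}, \qquad I(p)=\frac{E_p\bigl((X-p)^2\bigr)}{p^2(1-p)^2}=\frac{1}{p(1-p)} $$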

Theorem of Information inequality

Under some regularity conditions (e.g., the exponential family) and $0<I(\theta)<\infty$, for an iid sample $X_1,...,X_n$ from $f(x|\theta)$ and a statistic $T(\mathbf X)$ with $E(T(\mathbf X))=g(\theta)$ and $V(T(\mathbf X))<\infty$,

$$ V(T(\mathbf X)) \geq \frac{(g'(\theta))^2}{nI(\theta)} $$

In particular, for any unbiased estimator (UE) $\tilde \theta$ of $\theta$ (i.e. $g(\theta)=\theta$), $V(\tilde \theta) \geq \frac{1}{nI(\theta)}$, the Cramér-Rao lower bound
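
A minimal Monte Carlo sketch of the bound (assuming NumPy; the normal-mean model, parameter values, and seed are illustrative choices, not from the notes): for $N(\mu,\sigma^2)$ with $\sigma$ known, $I(\mu)=1/\sigma^2$, so the bound $\sigma^2/n$ should match the simulated variance of $\bar X$.

```python
import numpy as np

# Monte Carlo check of the information inequality for the mean of N(mu, sigma^2)
# with sigma known: I(mu) = 1/sigma^2 per observation, so the bound for an
# unbiased estimator of mu is sigma^2/n, which the sample mean attains.
# (Model, parameter values, sample size, and seed are illustrative choices.)
rng = np.random.default_rng(0)
mu, sigma, n, reps = 2.0, 1.5, 50, 200_000

samples = rng.normal(mu, sigma, size=(reps, n))
xbar = samples.mean(axis=1)          # unbiased estimator of mu, one per replication

fisher_per_obs = 1.0 / sigma**2      # I(mu) for a single observation
crlb = 1.0 / (n * fisher_per_obs)    # Cramer-Rao lower bound = sigma^2 / n

print("simulated Var(xbar):", xbar.var())
print("CRLB 1/(n I(mu)):   ", crlb)
```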

ex. $X_1,...,X_n$: iid Poisson($\lambda$). $\bar X$ is the MVUE of $\lambda$ (verified below)
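
This follows from the information inequality: for one Poisson($\lambda$) observation,

$$ \frac{\partial}{\partial \lambda}\log f(x|\lambda)=\frac{\partial}{\partial \lambda}\bigl(x\log\lambda-\lambda-\log x!\bigr)=\frac{x}{\lambda}-1, \qquad I(\lambda)=\frac{V(X)}{\lambda^2}=\frac{1}{\lambda}, $$

so the bound for an unbiased estimator of $\lambda$ is $\frac{1}{nI(\lambda)}=\frac{\lambda}{n}$. Since $E(\bar X)=\lambda$ and $V(\bar X)=\frac{\lambda}{n}$, $\bar X$ attains the bound and is the MVUE.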

Rao-Blackwell Theorem

Theorem

$\hat\theta$: a UE of $\theta$ with $V(\hat\theta)< \infty$

$U$: a sufficient statistic (SS) for $\theta$; define $\hat\theta^*=E(\hat\theta|U)$. Then, for all $\theta$,

$$ E(\hat\theta^*)=\theta \text{ and } V(\hat\theta^*)\leq V(\hat\theta) $$
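
A classical illustration (the same argument applies when estimating a function $g(\lambda)$ rather than $\lambda$ itself): for $X_1,...,X_n$ iid Poisson($\lambda$) and target $g(\lambda)=P(X_1=0)=e^{-\lambda}$, take the crude UE $\hat\theta=\mathbf 1\{X_1=0\}$ and the SS $U=\sum_{i=1}^n X_i$. Since $X_1\mid U=u\sim \text{Binomial}(u,1/n)$,

$$ \hat\theta^*=E\bigl(\mathbf 1\{X_1=0\}\mid U\bigr)=P(X_1=0\mid U)=\left(1-\frac{1}{n}\right)^{U}, $$

which is still unbiased for $e^{-\lambda}$ and has variance no larger than that of $\hat\theta$.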