Unstable Estimator

Consider an estimator $\widetilde{f}(x)$ of some ideal prediction rule $f^*(x)$.

Intuitively, the data-driven prediction rule $\widetilde{f}(x)$ is unstable if small changes in the data can cause large changes in the predicted value(s).

A Toy Example #

Consider the estimator $$\widetilde{f}(x;\bar{X})=\mathbf{1}[\bar{X}\leq x]$$ where $\bar{X}$ is the sample mean of normal observations, $$\bar{X}=\frac{1}{n}\sum_{i=1}^{n}X_i,\quad X_i\sim N(\mu,1),$$ with an unknown mean $\mu\in\mathbb{R}$.
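As a rough sketch of this setup (in Python, with arbitrary illustrative choices of $n$ and $\mu$ that are not part of the text), the estimator and the sample mean it depends on could look like:

```python
import numpy as np

rng = np.random.default_rng(0)

def f_tilde(x, x_bar):
    """Indicator estimator: predicts 1 iff the sample mean x_bar is at most x."""
    return float(x_bar <= x)

# Draw n observations X_i ~ N(mu, 1) and form the sample mean.
# n and mu are arbitrary illustrative choices, not from the text.
n, mu = 100, 0.5
x_bar = rng.normal(loc=mu, scale=1.0, size=n).mean()
print(f_tilde(0.5, x_bar))
```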

Suppose we observe $\bar{X}=0.5$, so that $\widetilde{f}(0.5;0.5)=1$. If $\bar{X}$ changes by a small amount, say, $$\bar{X}=0.5 \;\Rightarrow\; \bar{X}=0.5+\delta n^{-1/2}\quad\text{for some}~\delta>0,$$ then the predicted value at $x=0.5$ jumps, no matter how small $\delta$ is: $$\widetilde{f}(0.5;0.5)=1 \;\Rightarrow\; \widetilde{f}(0.5;0.5+\delta n^{-1/2})=0.$$
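A minimal numerical check of this jump, assuming the observed sample mean $\bar{X}=0.5$ from the text and hypothetical values of $n$ and $\delta$ (any $\delta>0$ produces the same flip):

```python
import numpy as np

def f_tilde(x, x_bar):
    """Indicator estimator: predicts 1 iff the sample mean x_bar is at most x."""
    return float(x_bar <= x)

n = 100          # hypothetical sample size
x_bar = 0.5      # observed sample mean, as in the text
delta = 0.01     # any delta > 0 gives the same jump

print(f_tilde(0.5, x_bar))                       # 1.0
print(f_tilde(0.5, x_bar + delta / np.sqrt(n)))  # 0.0
```

However small the perturbation $\delta n^{-1/2}$ is, the prediction at $x=0.5$ switches from 1 to 0, which is exactly the discontinuous behavior that makes the estimator unstable.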


Next Section: Bagging