
Leave-one-out Error Estimation [4 P]

Consider the set $ Y = \{y_1, \ldots, y_n\} \subset \mathbb{R}$ , where $ y_i$ is the sample error of the $ i$ -th fold of leave-one-out cross-validation. An estimator for the mean of this distribution, and the variance of this estimator, are given by:

$\displaystyle \hat{\mu} = \frac{1}{n} \sum_{i=1}^n y_i \hspace{.5in} Var[\hat{\mu}] = \frac{1}{n(n-1)} \sum_{i=1}^n(y_i - \hat{\mu})^2 $
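As a quick illustration (not part of the exercise), both quantities can be computed directly from a sample; the values in `y` below are made up for demonstration only:

```python
# Sketch: compute mu_hat and Var[mu_hat] for a small sample Y.
# The numbers in y are arbitrary illustration data, not from the exercise.
y = [0.12, 0.35, 0.08, 0.27, 0.19]
n = len(y)

mu_hat = sum(y) / n                                        # (1/n) * sum_i y_i
var_mu_hat = sum((yi - mu_hat) ** 2 for yi in y) / (n * (n - 1))

print(mu_hat, var_mu_hat)
```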

Now let

$\displaystyle \mu_{(i)} = \frac{1}{n-1} \sum_{j \neq i} y_j $

be the mean when $ y_i$ is left out.
a)
Prove that the mean of the means

$\displaystyle \mu_{(.)} = \frac{1}{n} \sum_{i=1}^n \mu_{(i)} $

is equal to $ \hat{\mu}$ .
b)
The variance $ Var[\mu_{(.)}]$ of this so-called jackknife estimator $ \mu_{(.)}$ is given by

$\displaystyle Var[\mu_{(.)}] = \frac{n-1}{n} \sum_{i=1}^{n} (\mu_{(i)} - \mu_{(.)})^2 $

Prove that $ Var[\mu_{(.)}] = Var[\hat{\mu}]$ .
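Before proving the two identities, it can be reassuring to verify them numerically. The following sketch (function and variable names are my own, and the data is randomly generated) checks that $ \mu_{(.)} = \hat{\mu}$ and $ Var[\mu_{(.)}] = Var[\hat{\mu}]$ on a random sample:

```python
import random

def jackknife_check(y):
    """Compute mu_hat, Var[mu_hat], the jackknife mean mu_(.),
    and the jackknife variance for a sample y."""
    n = len(y)
    mu_hat = sum(y) / n
    var_mu_hat = sum((yi - mu_hat) ** 2 for yi in y) / (n * (n - 1))

    # Leave-one-out means mu_(i) = (1/(n-1)) * sum_{j != i} y_j
    mu_i = [(sum(y) - yi) / (n - 1) for yi in y]
    mu_dot = sum(mu_i) / n                       # mean of the means
    var_jack = (n - 1) / n * sum((m - mu_dot) ** 2 for m in mu_i)
    return mu_hat, mu_dot, var_mu_hat, var_jack

random.seed(0)
y = [random.gauss(0, 1) for _ in range(20)]
mu_hat, mu_dot, v_hat, v_jack = jackknife_check(y)

assert abs(mu_hat - mu_dot) < 1e-12   # part a): mu_(.) == mu_hat
assert abs(v_hat - v_jack) < 1e-12    # part b): Var[mu_(.)] == Var[mu_hat]
```

The checks pass for any sample, which is exactly what parts a) and b) ask you to show algebraically; a useful intermediate step in both proofs is $ \mu_{(i)} = \frac{n\hat{\mu} - y_i}{n-1}$ .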



Pfeiffer Michael 2006-01-18