
PCA vs. Fisher's Linear Discriminant [4 P]

Implement Fisher's linear discriminant algorithm in MATLAB and apply it to the Olivetti face dataset, available for download on the course homepage. The dataset contains 7200 images of 4 persons (4 classes). Each image consists of $ 25 \times 25$ gray scale values (use imagesc and colormap gray for visualization). Apply the two dimension reduction methods PCA (see processpca) and Fisher's linear discriminant (FLD) to project the data to low-dimensional subspaces. Compare the separation of the image classes in these subspaces for PCA and FLD by means of the area under the ROC curve (AUC).
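A minimal sketch of a possible starting point for the two-class case is given below. The variable names X1 and X2 (the images of persons 1 and 2 as 625-dimensional column vectors) are placeholders, not names from the dataset files; PCA is written out here as an explicit eigendecomposition to make the projection visible, although processpca can be used instead.

    % Two-class FLD sketch; X1, X2 are assumed 625 x N matrices
    % holding the images of persons 1 and 2 as columns.
    m1 = mean(X1, 2);  m2 = mean(X2, 2);        % class means
    D1 = bsxfun(@minus, X1, m1);
    D2 = bsxfun(@minus, X2, m2);
    Sw = D1 * D1' + D2 * D2';                   % within-class scatter
    w  = pinv(Sw) * (m1 - m2);                  % Fisher direction
    w  = w / norm(w);
    p1 = w' * X1;  p2 = w' * X2;                % 1-D FLD projections

    % PCA on the pooled data via the scatter matrix eigendecomposition
    X  = [X1 X2];
    Xc = bsxfun(@minus, X, mean(X, 2));
    [V, E] = eig(Xc * Xc');
    [evals, idx] = sort(diag(E), 'descend');
    pc1 = V(:, idx(1));                         % 1st principal component
    q1 = pc1' * X1;  q2 = pc1' * X2;            % 1-D PCA projections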

  1. Apply PCA and FLD to all images of persons 1 and 2 and project the data to the 1-dimensional space spanned by the 1st principal component (PC) or Fisher's linear discriminant, respectively (cf. the sketch above).

  2. Calculate the AUC for a linear classifier in this 1-dimensional space (see the AUC sketch after this list).

  3. Repeat the experiment described in 1 and 2 for three classes, i.e. images of persons 1, 2, and 3, where the images are projected to a 2-dimensional subspace spanned by the first two PCs or the two FLDs.

  4. Calculate the AUC for a linear classifier in this 2-dimensional subspace, with a weight vector parallel to the line passing through the two class centers in the subspace (see the multi-class sketch after this list).

  5. Repeat the experiment described in 3 and 4 for four classes, i.e. images of persons 1, 2, 3, and 4, where the images are projected to a 3-dimensional subspace spanned by the first three PCs or the three FLDs.

  6. Visualize the eigenfaces (principal components) and the FLDs and interpret them (see the visualization sketch after this list).

  7. Discuss the dependence of the AUC on the dimensionality of the low-dimensional subspaces, i.e. on the number of PCs and FLDs used.
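For item 2, the AUC can be computed with a threshold sweep along the projected axis. The sketch below assumes p1 and p2 are the projected scores of the two classes from the sketch above; the last line accounts for the arbitrary sign of the projection direction.

    % AUC sketch via a threshold sweep; p1, p2 are row vectors of
    % projected scores of the positive and negative class.
    scores = [p1 p2];
    labels = [ones(1, numel(p1)) zeros(1, numel(p2))];
    [srt, idx] = sort(scores, 'descend');
    l = labels(idx);
    tpr = cumsum(l) / sum(l);           % true positive rate per cut
    fpr = cumsum(1 - l) / sum(1 - l);   % false positive rate per cut
    auc = trapz([0 fpr], [0 tpr]);      % trapezoidal area under the ROC
    auc = max(auc, 1 - auc);            % sign of w is arbitrary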
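For items 3 to 5, the Fisher directions generalize to the leading eigenvectors of $ S_W^{-1} S_B$. A sketch under the assumption that the images of class k are stored as columns of a cell entry Xc{k} (a placeholder name):

    % Multi-class FLD sketch: leading eigenvectors of pinv(Sw)*Sb;
    % Xc{k} is an assumed cell array of 625 x N_k class matrices.
    mAll = mean([Xc{:}], 2);
    Sw = zeros(625);  Sb = zeros(625);
    for k = 1:numel(Xc)
        mk = mean(Xc{k}, 2);
        Dk = bsxfun(@minus, Xc{k}, mk);
        Sw = Sw + Dk * Dk';                       % within-class scatter
        Sb = Sb + size(Xc{k}, 2) * (mk - mAll) * (mk - mAll)';
    end
    [V, E] = eig(pinv(Sw) * Sb);                  % nonsymmetric, so take
    [evals, idx] = sort(real(diag(E)), 'descend');% real parts below
    W = real(V(:, idx(1:numel(Xc)-1)));           % Fisher directions

    % Pairwise linear classifier in the subspace: weight vector parallel
    % to the line through the two class centers (here classes 1 and 2)
    Z1 = W' * Xc{1};  Z2 = W' * Xc{2};            % projected data
    wLin = mean(Z1, 2) - mean(Z2, 2);             % center difference
    s1 = wLin' * Z1;  s2 = wLin' * Z2;            % scores for the AUC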
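For item 6, any projection direction can be displayed as a face-like image by reshaping it back to the $ 25 \times 25$ pixel grid, for example:

    % Display a projection direction (PC or FLD) as a 25 x 25 image;
    % transpose the reshaped matrix if the images were stored row-wise.
    imagesc(reshape(pc1, 25, 25));
    colormap gray;  axis image off;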

Present your results in a clear, structured, and legible form.


Haeusler Stefan 2010-01-19