
Properties of Kernels [6 P]

For all problems, let $k$ be a positive definite kernel and $X$ a non-empty input space:

  1. Prove that if $k(x,x) = 0$ for all $x \in X$, the kernel is identically zero, i.e., $k(x,x') = 0 ~\forall x,x' \in X$.

  2. Give an example of a kernel $k$ which is positive definite, but not positive in the sense that $k(x,x') \geq 0$ for all $x,x' \in X$. Conversely, give an example of a kernel that is pointwise non-negative but not positive definite (see the first sketch after this list).

  3. Prove that the inhomogeneous polynomial $k(x,x') = \left(\langle x,x' \rangle + c \right)^d$ with $X \subset \mathbb{R}^N$, $d \in \mathbb{N}$, $c \geq 0$ is a positive definite kernel (a numerical sanity check is sketched in the second example after this list).

  4. Give an example of a kernel $k$ with two valid feature maps $\Phi_1, \Phi_2$, mapping into spaces $H_1, H_2$ of different dimensions (see the third sketch after this list).

  5. Show that a reproducing kernel $k$ (i.e., $\langle f, k(x,\cdot) \rangle = f(x)$ for all $f \in H$, where $H$ is a Hilbert space of functions $f: X \rightarrow \mathbb{R}$) is symmetric.

  6. Given a kernel $k$, construct a corresponding normalized kernel $\tilde{k}$ by normalizing the feature map $\tilde{\Phi}$ such that $\Vert \tilde{\Phi}(x) \Vert = 1$ for all $x \in X$. Use this result to show that $k(x,x') = \cos(\angle(x,x'))$ is a positive definite kernel on a dot product space $X$ (see the fourth sketch after this list).
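
For problem 2, a minimal numeric illustration of both directions (a sketch assuming NumPy, not a substitute for your own examples): the linear kernel $\langle x,x' \rangle$ is positive definite yet takes negative values, while the indicator kernel $k(x,x') = 1$ if $\vert x-x' \vert \leq 1$ (else $0$) is pointwise non-negative yet its Gram matrix on $\{0,1,2\}$ has a negative eigenvalue.

    import numpy as np

    # Direction 1: the linear kernel is positive definite but not pointwise positive.
    def k_lin(x, y):
        return x * y

    print(k_lin(1.0, -1.0))  # -1.0: a negative kernel value

    # Direction 2: pointwise non-negative, but not positive definite.
    def k_ind(x, y):
        return float(abs(x - y) <= 1)

    pts = [0.0, 1.0, 2.0]
    G = np.array([[k_ind(x, y) for y in pts] for x in pts])
    print(np.linalg.eigvalsh(G))  # eigenvalues 1 - sqrt(2), 1, 1 + sqrt(2): one is negative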
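For problem 3, an empirical sanity check (assuming NumPy; this is not a proof): draw random points, build the Gram matrix of the inhomogeneous polynomial kernel, and verify that all eigenvalues are non-negative up to floating-point precision.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((20, 5))    # 20 random points in R^5
    c, d = 1.0, 3                       # some c >= 0 and degree d
    G = (X @ X.T + c) ** d              # Gram matrix of (<x,x'> + c)^d
    print(np.linalg.eigvalsh(G).min())  # >= 0 up to numerical precision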
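For problem 4, one standard construction (a sketch; the helper names phi1 and phi2 are illustrative): the homogeneous quadratic kernel $k(x,x') = \langle x,x' \rangle^2$ on $\mathbb{R}^2$ admits a 4-dimensional feature map (all ordered products $x_i x_j$) and a 3-dimensional one, $(x_1^2, \sqrt{2}\,x_1 x_2, x_2^2)$; both reproduce the same kernel values.

    import numpy as np

    def phi1(x):
        # 4-dimensional feature map: all ordered products x_i * x_j
        return np.array([x[0]*x[0], x[0]*x[1], x[1]*x[0], x[1]*x[1]])

    def phi2(x):
        # 3-dimensional feature map for the same kernel
        return np.array([x[0]**2, np.sqrt(2)*x[0]*x[1], x[1]**2])

    x, y = np.array([1.0, 2.0]), np.array([3.0, -1.0])
    k = (x @ y) ** 2                         # k(x,x') = <x,x'>^2
    print(np.isclose(phi1(x) @ phi1(y), k))  # True
    print(np.isclose(phi2(x) @ phi2(y), k))  # True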
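For problem 6, a sketch of the normalization in matrix form (assuming NumPy; normalize is an illustrative helper): $\tilde{k}(x,x') = k(x,x') / \sqrt{k(x,x)\,k(x',x')}$. For the linear kernel this is exactly $\cos(\angle(x,x'))$, and the resulting Gram matrix is numerically positive semi-definite.

    import numpy as np

    def normalize(K):
        # Normalized kernel matrix: K[i,j] / sqrt(K[i,i] * K[j,j])
        d = np.sqrt(np.diag(K))
        return K / np.outer(d, d)

    rng = np.random.default_rng(1)
    X = rng.standard_normal((15, 4))
    K = X @ X.T                             # linear kernel k(x,x') = <x,x'>
    K_cos = normalize(K)                    # = cos(angle(x, x'))
    print(np.linalg.eigvalsh(K_cos).min())  # >= 0 up to numerical precision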


Haeusler Stefan 2007-12-03