Abstract
Kernel Principal Component Analysis (PCA) has proven a powerful tool for nonlinear feature extraction and is often applied as a pre-processing step for classification algorithms. In denoising applications, kernel PCA provides the basis for dimensionality reduction prior to the so-called pre-image problem, in which denoised feature-space points are mapped back into input space. This problem is inherently ill-posed because the feature-space mapping is non-bijective. We present a semi-supervised denoising scheme based on kernel PCA and the pre-image problem, in which class labels on a subset of the data points are used to improve the denoising. Moreover, by warping the Reproducing Kernel Hilbert Space (RKHS) we also account for the intrinsic manifold structure, yielding a kernel PCA basis that also benefits from unlabeled data points. Our two main contributions are: (1) a generalization of kernel PCA that incorporates a loss term, leading to an iterative algorithm for finding orthonormal components biased by the class labels, and (2) a fixed-point iteration for solving the pre-image problem based on a manifold-warped RKHS. We demonstrate the viability of the proposed methods on both synthetic data and images from The Amsterdam Library of Object Images (Geusebroek et al., 2005) [7].
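For readers unfamiliar with the pre-image step the abstract refers to, the sketch below illustrates the standard fixed-point pre-image iteration for a Gaussian kernel (in the spirit of Mika et al.), which the paper's second contribution generalizes to a manifold-warped RKHS. This is a minimal illustration, not the paper's exact algorithm: the function name `preimage_fixed_point`, the coefficient vector `weights` (the expansion coefficients of the denoised feature-space point over the training samples), and the omission of kernel centering are assumptions made here for brevity.

```python
import numpy as np

def rbf_kernel(X, Z, gamma):
    """Gaussian (RBF) kernel matrix between rows of X and rows of Z."""
    sq = (X ** 2).sum(1)[:, None] + (Z ** 2).sum(1)[None, :] - 2.0 * X @ Z.T
    return np.exp(-gamma * sq)

def preimage_fixed_point(X, weights, gamma, z0, n_iter=100, tol=1e-8):
    """Classic Gaussian-kernel fixed-point pre-image iteration.

    X       : (n, d) training points.
    weights : (n,) expansion coefficients of the denoised feature-space
              point, i.e. P*phi(x) = sum_i weights[i] * phi(X[i]).
    z0      : (d,) initial guess (e.g. the noisy input point).

    Each update is a kernel-weighted mean of the training points.
    """
    z = z0.copy()
    for _ in range(n_iter):
        k = rbf_kernel(X, z[None, :], gamma).ravel()   # k(x_i, z)
        num = weights * k
        denom = num.sum()
        if abs(denom) < 1e-12:        # degenerate step: restart from z0
            z_new = z0.copy()
        else:
            z_new = (num @ X) / denom
        if np.linalg.norm(z_new - z) < tol:
            return z_new
        z = z_new
    return z
```

In the paper's setting, the kernel above would be replaced by the warped (data-dependent) kernel, and `weights` would come from projecting onto the label-biased kernel PCA components rather than onto standard ones.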
Original language | English
---|---
Journal | Pattern Recognition Letters
Volume | 49
Pages (from-to) | 114-120
ISSN | 0167-8655
DOIs |
Publication status | Published - 2014
Keywords
- Semi-supervised denoising
- Kernel PCA
- Pre-image problem