Abstract
Kernel methods refer to a family of widely used nonlinear algorithms for machine learning tasks such as classification, regression, and feature extraction. By exploiting the so-called kernel trick, straightforward extensions of classical linear algorithms are enabled as long as the data only appear as inner products in the model formulation. This dissertation presents research on improving the performance of standard kernel methods such as kernel Principal Component Analysis and the Support Vector Machine. The goal of the thesis has been twofold.
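The kernel trick described above can be illustrated with a minimal sketch (not from the thesis): kernel ridge regression, where the linear ridge solution is rewritten so the data enter only through inner products, which are then replaced by an RBF kernel. All names and parameter values here are illustrative.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # k(a, b) = exp(-gamma * ||a - b||^2): an inner product in an implicit
    # (infinite-dimensional) feature space -- the "kernel trick".
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0])

# The dual ridge solution needs only the Gram matrix K, so swapping the
# linear inner product for k(.,.) yields a nonlinear regressor for free.
lam = 1e-2
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

X_test = np.array([[0.5]])
y_hat = rbf_kernel(X_test, X) @ alpha   # prediction via kernel evaluations only
```

The same substitution pattern underlies kernel PCA and the kernelized SVM treated in the thesis.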
The first part focuses on the use of kernel Principal Component Analysis for nonlinear denoising. In this context, stable solution of the inverse and inherently ill-posed pre-image problem constitutes the main challenge. It is proposed to stabilize the estimation by augmenting the cost function with either an ℓ1- or ℓ2-norm penalty, and solution schemes are derived for both approaches. The methods are experimentally validated on several biomedical data sets. Furthermore, frameworks for exploiting label information for improved denoising in the semi-supervised case are proposed.
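A rough sketch of the idea, assuming an RBF kernel: project a noisy point onto the leading kernel-PCA subspace and recover an input-space pre-image by minimizing the feature-space reconstruction error plus an ℓ2 penalty pulling the solution toward the input. The fixed-point update, the uncentered Gram matrix, and the penalty weight are simplifications for illustration, not the thesis's exact formulation.

```python
import numpy as np

def rbf(z, X, gamma):
    # k(z, x_i) = exp(-gamma * ||z - x_i||^2) for each training row x_i
    return np.exp(-gamma * ((z - X) ** 2).sum(axis=1))

rng = np.random.default_rng(1)
t = rng.uniform(0, 2 * np.pi, 200)
X = np.c_[np.cos(t), np.sin(t)] + 0.1 * rng.standard_normal((200, 2))  # noisy circle

gamma, q = 2.0, 4                                  # kernel width, components kept
K = np.exp(-gamma * ((X[:, None] - X[None]) ** 2).sum(-1))
w, V = np.linalg.eigh(K)                           # uncentered kernel PCA (simplified)
A = V[:, -q:] / np.sqrt(w[-q:])                    # normalized expansion coefficients

def denoise(x0, lam=0.1, steps=100):
    # beta: expansion coefficients of the projection of phi(x0) onto the
    # leading kernel subspace, expressed over the training points.
    beta = A @ (A.T @ rbf(x0, X, gamma))
    z = x0.copy()
    for _ in range(steps):
        # Stationarity of ||phi(z) - P phi(x0)||^2 + lam * ||z - x0||^2
        # for the RBF kernel gives a regularized fixed-point update:
        kz = beta * rbf(z, X, gamma)
        z_new = (2 * gamma * kz @ X + lam * x0) / (2 * gamma * kz.sum() + lam)
        if np.linalg.norm(z_new - z) < 1e-10:
            return z_new
        z = z_new
    return z

z = denoise(np.array([1.2, 0.1]))   # denoised pre-image of a noisy point
```

With lam = 0 this reduces to the classical unregularized fixed-point pre-image iteration; the penalty term keeps the otherwise ill-posed estimate anchored near the observation.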
The second part of the thesis examines the effect of variance inflation in kernel methods. Variance inflation occurs in high-dimensional problems when the training data are insufficient to describe the entire signal manifold, leading to a potential mismatch between the subspaces spanned by the training and test data, respectively. It is shown how this effect extends from linear models to kernel learning, and means for restoring generalizability in both kernel Principal Component Analysis and the Support Vector Machine are proposed. Viability is demonstrated on a wide range of benchmark machine learning data sets.
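The linear analogue of this train/test mismatch can be demonstrated in a few lines (a sketch, not the thesis's kernel-space procedure): with far fewer training samples than dimensions, the leading PCA direction overfits the training set, so training projections show much larger variance than test projections, and a simple rescaling restores agreement.

```python
import numpy as np

rng = np.random.default_rng(2)
d, n_train, n_test = 200, 30, 1000
# Isotropic Gaussian data: every direction truly has unit variance.
Xtr = rng.standard_normal((n_train, d))
Xte = rng.standard_normal((n_test, d))

# PCA on the small training set (n_train << d)
mu = Xtr.mean(0)
U, s, Vt = np.linalg.svd(Xtr - mu, full_matrices=False)
pc = Vt[0]                                   # leading training direction

var_train = ((Xtr - mu) @ pc).var()
var_test = ((Xte - mu) @ pc).var()
# var_train >> var_test: the training data inflate the variance along the
# fitted direction, so test projections fall in a mismatched distribution.

scale = np.sqrt(var_train / var_test)
corrected = ((Xte - mu) @ pc) * scale        # renormalized test projections
```

The corrections proposed in the thesis play a similar role for kernel PCA and the SVM, where the subspace is spanned in feature space rather than input space.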
Original language: English
Place of Publication: Kgs. Lyngby
Publisher: Technical University of Denmark
Number of pages: 168
Publication status: Published - 2013
Series: PHD-2013
Number: 299
ISSN: 0909-3192
Projects
1 Finished
Kernel Methods for Machine Learning with life-science applications
Abrahamsen, T. J., Hansen, L. K., Winther, O., Kaski, S., Larsen, J. & Jensen, S. H.
Technical University of Denmark
01/08/2009 → 30/08/2013
Project: PhD