Publication: Article in proceedings (peer-reviewed) – Annual report year: 2011
Over the past few years kernel methods have gained a tremendous amount of attention, as existing linear algorithms can easily be extended to account for highly non-linear data in a computationally efficient manner. Unfortunately, most kernels require careful tuning of intrinsic parameters to correctly model the distribution of the underlying data. For large-scale problems, the multiplicative scaling in time complexity imposed by introducing free parameters in a cross-validation setup proves computationally infeasible, often leaving pure ad-hoc estimates as the only option. In this contribution we investigate a novel randomized approach for kernel parameter selection in large-scale multi-class data. We fit a minimum enclosing ball to the class means in a Reproducing Kernel Hilbert Space (RKHS), and use its radius as a quality measure of the space defined by the kernel parameter. We apply the developed algorithm to a computer vision paradigm where the objective is to recognize 72,000 objects among 1,000 classes. Compared to other distance metrics in the RKHS, we find that our randomized approach provides better results together with a highly competitive time complexity.
Title of host publication: 2011 IEEE International Workshop on Machine Learning for Signal Processing (MLSP)
State: Published - 2011
Event: 2011 IEEE International Workshop on Machine Learning for Signal Processing - Beijing, China
Conference: 2011 IEEE International Workshop on Machine Learning for Signal Processing
Period: 01/01/2011 → …