Abstract
Learning algorithms are often equipped with kernels that enable them to deal with non-linearities in the data, which typically improves performance in practice. However, traditional kernel-based methods suffer from poor scalability. As a workaround, explicit kernel maps have been proposed in the past. For the task of streaming Kernel Principal Component Analysis (KPCA), an explicit kernel map has previously been combined with a matrix sketching technique to obtain a scalable dimensionality reduction (DR) algorithm. This algorithm is limited by two issues, pertaining to the explicit kernel map and the matrix sketching technique, respectively. As a solution, two new scalable DR algorithms, ECM-SKPCA and Euler-SKPCA, are proposed. The efficacy of the proposed algorithms as scalable DR methods is demonstrated on classification tasks over multiple publicly available datasets. The results indicate that the proposed algorithms produce more effective features than the previous algorithm for the classification task. Furthermore, ECM-SKPCA is also shown to be much faster than all the other algorithms.
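The abstract describes the general recipe of combining an explicit kernel map with a streaming matrix sketch for scalable KPCA. The sketch below is a minimal, hedged illustration of that recipe only: the element-wise cosine map and the Frequent Directions sketch used here are illustrative stand-ins, not the paper's actual ECM-SKPCA or Euler-SKPCA algorithms.

```python
import numpy as np

def cosine_feature_map(X, alpha=1.0):
    """Hypothetical explicit cosine map: applies cos(alpha * pi * x) element-wise.
    Stands in for the explicit kernel map; the paper's exact map is not reproduced."""
    return np.cos(alpha * np.pi * X)

class FrequentDirections:
    """Streaming matrix sketch: maintains an l x d matrix B such that
    B^T B approximates A^T A over the rows seen so far."""
    def __init__(self, dim, sketch_size):
        self.l = sketch_size
        self.B = np.zeros((sketch_size, dim))

    def update(self, row):
        zero_rows = np.where(~self.B.any(axis=1))[0]
        if len(zero_rows) == 0:
            # Sketch is full: shrink singular values so about half the rows become zero.
            _, s, Vt = np.linalg.svd(self.B, full_matrices=False)
            delta = s[self.l // 2] ** 2
            s = np.sqrt(np.maximum(s ** 2 - delta, 0.0))
            self.B = s[:, None] * Vt
            zero_rows = np.where(~self.B.any(axis=1))[0]
        self.B[zero_rows[0]] = row

# Usage sketch: stream rows, map them explicitly, sketch the mapped rows,
# then read approximate principal directions off the sketch.
rng = np.random.default_rng(0)
X_stream = rng.standard_normal((500, 20))
fd = FrequentDirections(dim=20, sketch_size=10)
for x in X_stream:
    fd.update(cosine_feature_map(x))
_, _, Vt = np.linalg.svd(fd.B, full_matrices=False)
components = Vt[:5]                                # approximate top-5 directions
Z = cosine_feature_map(X_stream) @ components.T    # reduced features for classification
```

In this assumed setup the explicit map keeps each row in a fixed-dimensional feature space, so the sketch can be updated one row at a time with memory independent of the stream length; the reduced features `Z` would then feed a downstream classifier, mirroring the evaluation protocol described in the abstract.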
Original language | English |
---|---|
Journal | Intelligent Decision Technologies |
Volume | 17 |
Issue number | 2 |
Pages (from-to) | 457-470 |
ISSN | 1872-4981 |
DOIs | |
Publication status | Published - 2023 |
Keywords
- Dimensionality reduction
- Kernel
- Streaming data
- Explicit cosine map
- Classification