We propose a kernel-based quadratic classification method based on kernel principal component analysis (KPCA). Subspace methods have been widely used for multiclass classification problems, and they have been extended by the kernel trick. However, there are large computational complexities for the subspace methods that use the kernel trick because the problems are defined in the space spanned by all of the training samples. To reduce the computational complexity of the subspace methods for multiclass classification problems, we extend Oja's averaged learning subspace method and apply a subset approximation of KPCA. We also propose an efficient method for selecting the basis vectors for this. Due to these extensions, for many problems, our classification method exhibits a higher classification accuracy with fewer basis vectors than does the support vector machine (SVM) or conventional subspace methods.
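The classification principle the abstract describes — projecting a test sample onto a per-class KPCA subspace and picking the class with the largest projection norm — can be sketched as follows. This is an illustrative CLAFIC-style kernel subspace classifier written from the general description only, not the paper's subset-approximated algorithm or Oja's averaged learning rule; the function names, the RBF kernel choice, and the parameters `r` and `gamma` are assumptions for the sketch.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise squared distances -> RBF (Gaussian) kernel matrix.
    d2 = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def fit_class_subspace(X, r, gamma=1.0):
    # KPCA on one class's samples (uncentered, as in classic subspace methods).
    K = rbf_kernel(X, X, gamma)
    lam, A = np.linalg.eigh(K)                 # eigenvalues in ascending order
    lam, A = lam[::-1][:r], A[:, ::-1][:, :r]  # keep the top-r components
    # Scale coefficients so the feature-space basis vectors are orthonormal.
    A = A / np.sqrt(np.maximum(lam, 1e-12))
    return X, A

def subspace_score(model, x, gamma=1.0):
    # Squared norm of the projection of phi(x) onto the class subspace.
    X, A = model
    kx = rbf_kernel(X, x[None, :], gamma).ravel()
    return float(((A.T @ kx) ** 2).sum())

# Toy usage: two well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X0 = rng.normal(0.0, 0.5, (40, 2))
X1 = rng.normal(3.0, 0.5, (40, 2))
models = [fit_class_subspace(X0, r=5), fit_class_subspace(X1, r=5)]
x_test = np.array([2.9, 3.1])
pred = int(np.argmax([subspace_score(m, x_test) for m in models]))
print(pred)  # -> 1 (the blob centered at 3.0)
```

The subset approximation the paper proposes would additionally restrict the expansion to a selected subset of the training samples, so that each basis vector is a combination of far fewer kernel terms; that selection step is not reproduced here.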
Yoshikazu WASHIZAWA
The University of Electro-Communications, RIKEN
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Yoshikazu WASHIZAWA, "Learning Subspace Classification Using Subset Approximated Kernel Principal Component Analysis" in IEICE TRANSACTIONS on Information,
vol. E99-D, no. 5, pp. 1353-1363, May 2016, doi: 10.1587/transinf.2015EDP7334.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2015EDP7334/_p
@ARTICLE{e99-d_5_1353,
author={Yoshikazu WASHIZAWA},
journal={IEICE TRANSACTIONS on Information},
title={Learning Subspace Classification Using Subset Approximated Kernel Principal Component Analysis},
year={2016},
volume={E99-D},
number={5},
pages={1353--1363},
abstract={We propose a kernel-based quadratic classification method based on kernel principal component analysis (KPCA). Subspace methods have been widely used for multiclass classification problems, and they have been extended by the kernel trick. However, there are large computational complexities for the subspace methods that use the kernel trick because the problems are defined in the space spanned by all of the training samples. To reduce the computational complexity of the subspace methods for multiclass classification problems, we extend Oja's averaged learning subspace method and apply a subset approximation of KPCA. We also propose an efficient method for selecting the basis vectors for this. Due to these extensions, for many problems, our classification method exhibits a higher classification accuracy with fewer basis vectors than does the support vector machine (SVM) or conventional subspace methods.},
doi={10.1587/transinf.2015EDP7334},
ISSN={1745-1361},
month={May},}
TY - JOUR
TI - Learning Subspace Classification Using Subset Approximated Kernel Principal Component Analysis
T2 - IEICE TRANSACTIONS on Information
SP - 1353
EP - 1363
AU - Yoshikazu WASHIZAWA
PY - 2016
DO - 10.1587/transinf.2015EDP7334
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E99-D
IS - 5
JA - IEICE TRANSACTIONS on Information
Y1 - May 2016
AB - We propose a kernel-based quadratic classification method based on kernel principal component analysis (KPCA). Subspace methods have been widely used for multiclass classification problems, and they have been extended by the kernel trick. However, there are large computational complexities for the subspace methods that use the kernel trick because the problems are defined in the space spanned by all of the training samples. To reduce the computational complexity of the subspace methods for multiclass classification problems, we extend Oja's averaged learning subspace method and apply a subset approximation of KPCA. We also propose an efficient method for selecting the basis vectors for this. Due to these extensions, for many problems, our classification method exhibits a higher classification accuracy with fewer basis vectors than does the support vector machine (SVM) or conventional subspace methods.
ER -