Most recent classification methods require tuning of hyper-parameters, such as the kernel function parameter and the regularization parameter. Cross-validation or the leave-one-out method is often used for this tuning; however, their computational cost is much higher than that of obtaining a single classifier. Quadratically constrained maximum a posteriori (QCMAP) classifiers, which are based on the Bayes classification rule, have no regularization parameter and exhibit higher classification accuracy than the support vector machine (SVM). In this paper, we propose a multiple kernel learning (MKL) method for QCMAP that tunes the kernel parameter automatically and improves classification performance. By introducing MKL, QCMAP has no parameter to be tuned. Experiments show that the proposed classifier achieves classification performance comparable to or higher than that of conventional MKL classifiers.
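As a rough illustration of the general MKL idea mentioned in the abstract (not the paper's QCMAP formulation), the Python sketch below forms a convex combination of Gaussian kernels with different widths; in an MKL method the combination weights would be learned jointly with the classifier, so no single kernel width has to be selected by cross-validation. All function names, widths, and weights here are illustrative assumptions.

    # Minimal sketch, assuming a convex combination of Gaussian (RBF) kernels.
    # The weights `beta` are placeholders; an MKL method would learn them.
    import numpy as np

    def gaussian_kernel(X, Y, sigma):
        """Gram matrix of the Gaussian kernel with width sigma."""
        sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-sq_dists / (2.0 * sigma ** 2))

    def combined_kernel(X, Y, sigmas, beta):
        """Weighted sum of Gaussian kernels; beta >= 0 and sums to one."""
        K = np.zeros((X.shape[0], Y.shape[0]))
        for s, b in zip(sigmas, beta):
            K += b * gaussian_kernel(X, Y, s)
        return K

    # Toy usage: three candidate widths, uniform initial weights.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((5, 2))
    sigmas = [0.5, 1.0, 2.0]
    beta = np.full(len(sigmas), 1.0 / len(sigmas))
    K = combined_kernel(X, X, sigmas, beta)
    print(K.shape)  # (5, 5)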
Yoshikazu WASHIZAWA
The University of Electro-Communications, RIKEN
Tatsuya YOKOTA
RIKEN, Tokyo Institute of Technology
Yukihiko YAMASHITA
Tokyo Institute of Technology
Yoshikazu WASHIZAWA, Tatsuya YOKOTA, Yukihiko YAMASHITA, "Multiple Kernel Learning for Quadratically Constrained MAP Classification" in IEICE TRANSACTIONS on Information,
vol. E97-D, no. 5, pp. 1340-1344, May 2014, doi: 10.1587/transinf.E97.D.1340.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.E97.D.1340/_p
@ARTICLE{e97-d_5_1340,
author={Yoshikazu WASHIZAWA and Tatsuya YOKOTA and Yukihiko YAMASHITA},
journal={IEICE TRANSACTIONS on Information},
title={Multiple Kernel Learning for Quadratically Constrained MAP Classification},
year={2014},
volume={E97-D},
number={5},
pages={1340-1344},
keywords={},
doi={10.1587/transinf.E97.D.1340},
ISSN={1745-1361},
month={May},
}
TY - JOUR
TI - Multiple Kernel Learning for Quadratically Constrained MAP Classification
T2 - IEICE TRANSACTIONS on Information
SP - 1340
EP - 1344
AU - Yoshikazu WASHIZAWA
AU - Tatsuya YOKOTA
AU - Yukihiko YAMASHITA
PY - 2014
DO - 10.1587/transinf.E97.D.1340
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E97-D
IS - 5
JA - IEICE TRANSACTIONS on Information
Y1 - May 2014
ER -