For the last few decades, learning with multiple kernels, represented by the ensemble kernel regressor and the multiple kernel regressor, has attracted much attention in the field of kernel-based machine learning. Although their efficacy has been investigated numerically in many works, their theoretical grounds have not been sufficiently investigated, since no theoretical framework for evaluating them has been available. In this paper, we introduce a unified framework for evaluating kernel regressors with multiple kernels. On the basis of this framework, we analyze the generalization errors of the ensemble kernel regressor and the multiple kernel regressor, and give a sufficient condition for the ensemble kernel regressor to outperform the multiple kernel regressor in terms of the generalization error in the noise-free case. We also show by examples that, when the sufficient condition does not hold, either kernel regressor can be better than the other, which underscores the importance of the sufficient condition.
Akira TANAKA
Hokkaido University
Hirofumi TAKEBAYASHI
Hokkaido University
Ichigaku TAKIGAWA
Hokkaido University
Hideyuki IMAI
Hokkaido University
Mineichi KUDO
Hokkaido University
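To make the comparison concrete, here is a minimal sketch (not the authors' code) of the two estimators discussed in the abstract, using kernel ridge regression as the base learner. The Gaussian kernels, the regularization parameter lam, and the toy sinc data are illustrative assumptions, not taken from the paper.

import numpy as np

# Gaussian (RBF) Gram matrix between row-sample matrices X and Y.
def gauss_kernel(X, Y, gamma):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

# Dual coefficients of kernel ridge regression: alpha = (K + lam*I)^{-1} y.
def kernel_ridge_fit(K, y, lam):
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(40, 1))    # training inputs (toy data)
y = np.sinc(X[:, 0])                        # noise-free targets, matching the paper's setting
Xt = np.linspace(-3.0, 3.0, 200)[:, None]   # test inputs

gammas = [0.5, 2.0]   # two candidate kernels (hypothetical choices)
lam = 1e-6            # small ridge parameter (hypothetical choice)

# Ensemble kernel regressor: train one regressor per kernel, then average the outputs.
f_ens = np.mean(
    [gauss_kernel(Xt, X, g) @ kernel_ridge_fit(gauss_kernel(X, X, g), y, lam)
     for g in gammas],
    axis=0,
)

# Multiple kernel regressor: train a single regressor with the combined (averaged) kernel.
K_comb = np.mean([gauss_kernel(X, X, g) for g in gammas], axis=0)
Kt_comb = np.mean([gauss_kernel(Xt, X, g) for g in gammas], axis=0)
f_mkl = Kt_comb @ kernel_ridge_fit(K_comb, y, lam)

# The two predictors generally differ; the paper gives a sufficient condition
# under which the ensemble regressor has the smaller generalization error.
print("max |f_ens - f_mkl| on the test grid:", np.abs(f_ens - f_mkl).max())

Note the structural difference this sketch exposes: with equal combination weights, the multiple kernel regressor solves a single linear system built from the combined Gram matrix, whereas the ensemble regressor solves one system per kernel and averages the resulting predictors.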
Akira TANAKA, Hirofumi TAKEBAYASHI, Ichigaku TAKIGAWA, Hideyuki IMAI, Mineichi KUDO, "Ensemble and Multiple Kernel Regressors: Which Is Better?" in IEICE TRANSACTIONS on Fundamentals, vol. E98-A, no. 11, pp. 2315-2324, November 2015, doi: 10.1587/transfun.E98.A.2315.
Abstract: For the last few decades, learning with multiple kernels, represented by the ensemble kernel regressor and the multiple kernel regressor, has attracted much attention in the field of kernel-based machine learning. Although their efficacy has been investigated numerically in many works, their theoretical grounds have not been sufficiently investigated, since no theoretical framework for evaluating them has been available. In this paper, we introduce a unified framework for evaluating kernel regressors with multiple kernels. On the basis of this framework, we analyze the generalization errors of the ensemble kernel regressor and the multiple kernel regressor, and give a sufficient condition for the ensemble kernel regressor to outperform the multiple kernel regressor in terms of the generalization error in the noise-free case. We also show by examples that, when the sufficient condition does not hold, either kernel regressor can be better than the other, which underscores the importance of the sufficient condition.
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/transfun.E98.A.2315/_p
@ARTICLE{e98-a_11_2315,
author={Akira TANAKA and Hirofumi TAKEBAYASHI and Ichigaku TAKIGAWA and Hideyuki IMAI and Mineichi KUDO},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Ensemble and Multiple Kernel Regressors: Which Is Better?},
year={2015},
volume={E98-A},
number={11},
pages={2315-2324},
abstract={For the last few decades, learning with multiple kernels, represented by the ensemble kernel regressor and the multiple kernel regressor, has attracted much attention in the field of kernel-based machine learning. Although their efficacy has been investigated numerically in many works, their theoretical grounds have not been sufficiently investigated, since no theoretical framework for evaluating them has been available. In this paper, we introduce a unified framework for evaluating kernel regressors with multiple kernels. On the basis of this framework, we analyze the generalization errors of the ensemble kernel regressor and the multiple kernel regressor, and give a sufficient condition for the ensemble kernel regressor to outperform the multiple kernel regressor in terms of the generalization error in the noise-free case. We also show by examples that, when the sufficient condition does not hold, either kernel regressor can be better than the other, which underscores the importance of the sufficient condition.},
keywords={},
doi={10.1587/transfun.E98.A.2315},
ISSN={1745-1337},
month={November},
}
TY - JOUR
TI - Ensemble and Multiple Kernel Regressors: Which Is Better?
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 2315
EP - 2324
AU - Akira TANAKA
AU - Hirofumi TAKEBAYASHI
AU - Ichigaku TAKIGAWA
AU - Hideyuki IMAI
AU - Mineichi KUDO
PY - 2015
DO - 10.1587/transfun.E98.A.2315
JO - IEICE TRANSACTIONS on Fundamentals
SN - 1745-1337
VL - E98-A
IS - 11
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - 2015/11//
AB - For the last few decades, learning with multiple kernels, represented by the ensemble kernel regressor and the multiple kernel regressor, has attracted much attention in the field of kernel-based machine learning. Although their efficacy has been investigated numerically in many works, their theoretical grounds have not been sufficiently investigated, since no theoretical framework for evaluating them has been available. In this paper, we introduce a unified framework for evaluating kernel regressors with multiple kernels. On the basis of this framework, we analyze the generalization errors of the ensemble kernel regressor and the multiple kernel regressor, and give a sufficient condition for the ensemble kernel regressor to outperform the multiple kernel regressor in terms of the generalization error in the noise-free case. We also show by examples that, when the sufficient condition does not hold, either kernel regressor can be better than the other, which underscores the importance of the sufficient condition.
ER -