Kernel-based learning algorithms have been successfully applied in various problem domains, given appropriate kernel functions. In this paper, we discuss the problem of designing kernel functions for binary regression and show that using a bell-shaped cosine function as the kernel function is optimal in a certain sense. The rationale behind this result is the Karhunen-Loève expansion: the optimal approximation to a set of functions is given by the principal component of the correlation operator of those functions.
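The rationale stated in the abstract — that the best single-function approximation to a family of functions is the principal component of their correlation operator (the leading term of the Karhunen-Loève expansion) — can be illustrated with a small numerical sketch. The paper's actual derivation is not reproduced here; the family of circularly shifted step functions, the grid size, and the centering step are illustrative assumptions. For such a shift-invariant family the sampled correlation operator is a circulant matrix, whose eigenvectors are Fourier modes, so the principal component comes out as a (phase-shifted) cosine — consistent in spirit with the bell-shaped cosine kernel the abstract describes.

```python
import numpy as np

# Illustrative family of binary functions: all circular shifts of a
# step function sampled on a grid of n points.
n = 64
base = np.zeros(n)
base[: n // 2] = 1.0
F = np.stack([np.roll(base, s) for s in range(n)])  # one function per row

# Center the functions so the constant (DC) mode does not dominate.
F = F - F.mean(axis=1, keepdims=True)

# Sampled correlation operator of the family. Because the family is
# closed under circular shifts, R is circulant.
R = F.T @ F / n

# Karhunen-Loeve expansion: the principal component is the eigenvector
# of R with the largest eigenvalue (eigh returns ascending eigenvalues).
w, V = np.linalg.eigh(R)
pc = V[:, -1]
```

Since the eigenvectors of a circulant matrix are Fourier modes, `pc` lies (up to phase) in the span of `cos(2*pi*t/n)` and `sin(2*pi*t/n)` — the fundamental-frequency cosine is the top principal component of this shift-invariant family.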
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Masashi SUGIYAMA, Hidemitsu OGAWA, "Constructing Kernel Functions for Binary Regression" in IEICE TRANSACTIONS on Information, vol. E89-D, no. 7, pp. 2243-2249, July 2006, doi: 10.1093/ietisy/e89-d.7.2243.
Abstract: Kernel-based learning algorithms have been successfully applied in various problem domains, given appropriate kernel functions. In this paper, we discuss the problem of designing kernel functions for binary regression and show that using a bell-shaped cosine function as a kernel function is optimal in some sense. The rationale of this result is based on the Karhunen-Loeve expansion, i.e., the optimal approximation to a set of functions is given by the principal component of the correlation operator of the functions.
URL: https://global.ieice.org/en_transactions/information/10.1093/ietisy/e89-d.7.2243/_p
@ARTICLE{e89-d_7_2243,
author={Masashi SUGIYAMA and Hidemitsu OGAWA},
journal={IEICE TRANSACTIONS on Information},
title={Constructing Kernel Functions for Binary Regression},
year={2006},
volume={E89-D},
number={7},
pages={2243-2249},
abstract={Kernel-based learning algorithms have been successfully applied in various problem domains, given appropriate kernel functions. In this paper, we discuss the problem of designing kernel functions for binary regression and show that using a bell-shaped cosine function as a kernel function is optimal in some sense. The rationale of this result is based on the Karhunen-Loeve expansion, i.e., the optimal approximation to a set of functions is given by the principal component of the correlation operator of the functions.},
keywords={},
doi={10.1093/ietisy/e89-d.7.2243},
ISSN={1745-1361},
month={July},}
TY - JOUR
TI - Constructing Kernel Functions for Binary Regression
T2 - IEICE TRANSACTIONS on Information
SP - 2243
EP - 2249
AU - Masashi SUGIYAMA
AU - Hidemitsu OGAWA
PY - 2006
DO - 10.1093/ietisy/e89-d.7.2243
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E89-D
IS - 7
JA - IEICE TRANSACTIONS on Information
Y1 - July 2006
AB - Kernel-based learning algorithms have been successfully applied in various problem domains, given appropriate kernel functions. In this paper, we discuss the problem of designing kernel functions for binary regression and show that using a bell-shaped cosine function as a kernel function is optimal in some sense. The rationale of this result is based on the Karhunen-Loeve expansion, i.e., the optimal approximation to a set of functions is given by the principal component of the correlation operator of the functions.
ER -