The solution of ordinary kernel ridge regression, based on the squared loss function and the squared-norm regularizer, can easily be interpreted as a stochastic linear estimator by considering an autocorrelation prior for the unknown true function. As is well known, a stochastic affine estimator is one of the simplest extensions of the stochastic linear estimator. However, the kernel regression problem corresponding to it has not yet been revealed. In this paper, we give a formulation of a kernel regression problem whose solution reduces to a stochastic affine estimator, and we also give interpretations of this formulation.
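For reference, the following is a minimal sketch, in standard textbook notation, of the objects the abstract refers to; the paper's own notation and its affine formulation may differ.

% Ordinary kernel ridge regression over an RKHS H_k with kernel k:
%   \min_{f \in H_k} \sum_{i=1}^{n} \bigl( y_i - f(x_i) \bigr)^2 + \lambda \, \| f \|_{H_k}^2 .
% Its solution is linear in the output vector \mathbf{y} = (y_1, \ldots, y_n)^\top:
\hat{f}(x) = \mathbf{k}(x)^\top \, (K + \lambda I_n)^{-1} \, \mathbf{y},
\qquad K_{ij} = k(x_i, x_j), \quad
\mathbf{k}(x) = \bigl( k(x, x_1), \ldots, k(x, x_n) \bigr)^\top .
% A stochastic affine estimator is, roughly, an estimator that is affine rather
% than linear in \mathbf{y}, i.e. it allows a \mathbf{y}-independent offset term:
%   \hat{f}(x) = \mathbf{a}(x)^\top \mathbf{y} + b(x),
% where \mathbf{a}(x) and b(x) do not depend on \mathbf{y}; the linear case is b(x) \equiv 0.

The paper formulates a kernel regression problem whose solution takes this affine form and discusses how the formulation can be interpreted.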
Akira TANAKA (Hokkaido University)
Masanari NAKAMURA (Hokkaido University)
Hideyuki IMAI (Hokkaido University)
Akira TANAKA, Masanari NAKAMURA, Hideyuki IMAI, "Kernel-Based Regressors Equivalent to Stochastic Affine Estimators" in IEICE TRANSACTIONS on Information,
vol. E105-D, no. 1, pp. 116-122, January 2022, doi: 10.1587/transinf.2021EDP7156.
Abstract: The solution of ordinary kernel ridge regression, based on the squared loss function and the squared-norm regularizer, can easily be interpreted as a stochastic linear estimator by considering an autocorrelation prior for the unknown true function. As is well known, a stochastic affine estimator is one of the simplest extensions of the stochastic linear estimator. However, the kernel regression problem corresponding to it has not yet been revealed. In this paper, we give a formulation of a kernel regression problem whose solution reduces to a stochastic affine estimator, and we also give interpretations of this formulation.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2021EDP7156/_p
@ARTICLE{e105-d_1_116,
author={Akira TANAKA and Masanari NAKAMURA and Hideyuki IMAI},
journal={IEICE TRANSACTIONS on Information},
title={Kernel-Based Regressors Equivalent to Stochastic Affine Estimators},
year={2022},
volume={E105-D},
number={1},
pages={116-122},
abstract={The solution of ordinary kernel ridge regression, based on the squared loss function and the squared-norm regularizer, can easily be interpreted as a stochastic linear estimator by considering an autocorrelation prior for the unknown true function. As is well known, a stochastic affine estimator is one of the simplest extensions of the stochastic linear estimator. However, the kernel regression problem corresponding to it has not yet been revealed. In this paper, we give a formulation of a kernel regression problem whose solution reduces to a stochastic affine estimator, and we also give interpretations of this formulation.},
keywords={},
doi={10.1587/transinf.2021EDP7156},
ISSN={1745-1361},
month={January},}
TY - JOUR
TI - Kernel-Based Regressors Equivalent to Stochastic Affine Estimators
T2 - IEICE TRANSACTIONS on Information
SP - 116
EP - 122
AU - Akira TANAKA
AU - Masanari NAKAMURA
AU - Hideyuki IMAI
PY - 2022
DO - 10.1587/transinf.2021EDP7156
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E105-D
IS - 1
JA - IEICE TRANSACTIONS on Information
Y1 - January 2022
AB - The solution of ordinary kernel ridge regression, based on the squared loss function and the squared-norm regularizer, can easily be interpreted as a stochastic linear estimator by considering an autocorrelation prior for the unknown true function. As is well known, a stochastic affine estimator is one of the simplest extensions of the stochastic linear estimator. However, the kernel regression problem corresponding to it has not yet been revealed. In this paper, we give a formulation of a kernel regression problem whose solution reduces to a stochastic affine estimator, and we also give interpretations of this formulation.
ER -