The search functionality is under construction.

Keyword Search Results

[Keyword] kernel regression (2 hits)

1-2 of 2 hits
  • Kernel-Based Regressors Equivalent to Stochastic Affine Estimators

    Akira TANAKA  Masanari NAKAMURA  Hideyuki IMAI  

    PAPER-Artificial Intelligence, Data Mining

      Publicized: 2021/10/05
      Vol: E105-D No:1
      Page(s): 116-122

    The solution of ordinary kernel ridge regression, based on the squared loss function and a squared-norm regularizer, can easily be interpreted as a stochastic linear estimator by considering an autocorrelation prior on the unknown true function. A stochastic affine estimator is, as is well known, one of the simplest extensions of the stochastic linear estimator; however, the kernel regression problem corresponding to it has not been identified so far. In this paper, we give a formulation of a kernel regression problem whose solution reduces to a stochastic affine estimator, and we also give interpretations of the formulation.
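
    For context, the linear-estimator interpretation rests on the closed form of kernel ridge regression: with Gram matrix K, regularization weight lam, and observations y, the estimate at a point x is f(x) = k(x)^T (K + lam*I)^{-1} y, a fixed linear map of y, and the affine extension adds an input-independent offset to that map. A minimal sketch of this closed form follows; the Gaussian kernel and all parameter names are illustrative assumptions, not taken from the paper.

    import numpy as np

    def gaussian_kernel(A, B, gamma=1.0):
        # Illustrative RBF kernel; the paper does not commit to a specific kernel.
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-gamma * d2)

    def kernel_ridge_fit(X, y, lam=1e-2, gamma=1.0):
        # Closed-form coefficients: alpha = (K + lam*I)^{-1} y.
        K = gaussian_kernel(X, X, gamma)
        return np.linalg.solve(K + lam * np.eye(len(X)), y)

    def kernel_ridge_predict(X_train, alpha, X_new, gamma=1.0):
        # f(x) = k(x)^T alpha, i.e. a linear transform of the observations y;
        # the stochastic affine estimator adds a constant offset to this map.
        return gaussian_kernel(X_new, X_train, gamma) @ alpha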

  • Ensemble and Multiple Kernel Regressors: Which Is Better?

    Akira TANAKA  Hirofumi TAKEBAYASHI  Ichigaku TAKIGAWA  Hideyuki IMAI  Mineichi KUDO  

    PAPER-Neural Networks and Bioengineering

      Vol: E98-A No:11
      Page(s): 2315-2324

    For the last few decades, learning with multiple kernels, represented by the ensemble kernel regressor and the multiple kernel regressor, has attracted much attention in the field of kernel-based machine learning. Although their efficacy has been investigated numerically in many works, their theoretical grounding has not been examined sufficiently, since no theoretical framework for evaluating them has been available. In this paper, we introduce a unified framework for evaluating kernel regressors with multiple kernels. On the basis of this framework, we analyze the generalization errors of the ensemble kernel regressor and the multiple kernel regressor, and we give a sufficient condition for the ensemble kernel regressor to outperform the multiple kernel regressor in terms of the generalization error in the noise-free case. We also show by examples that, when the sufficient condition does not hold, either regressor can be better than the other, which underscores the importance of the condition.
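
    To make the two schemes concrete: an ensemble kernel regressor averages the outputs of regressors each trained with a single kernel, while a multiple kernel regressor trains one regressor on a combination of the candidate kernels. The sketch below instantiates both with kernel ridge regression and uniform combination weights; these choices are assumptions for illustration and need not match the paper's exact definitions.

    import numpy as np

    def krr_predict(K_train, K_cross, y, lam=1e-2):
        # Kernel ridge prediction from precomputed Gram matrices (kernel-agnostic).
        alpha = np.linalg.solve(K_train + lam * np.eye(len(y)), y)
        return K_cross @ alpha

    def ensemble_kernel_regressor(Ks_train, Ks_cross, y, lam=1e-2):
        # Ensemble scheme: average the single-kernel predictions
        # (uniform weights are an illustrative assumption).
        preds = [krr_predict(Kt, Kc, y, lam) for Kt, Kc in zip(Ks_train, Ks_cross)]
        return np.mean(preds, axis=0)

    def multiple_kernel_regressor(Ks_train, Ks_cross, y, lam=1e-2):
        # Multiple-kernel scheme: one regressor on the combined kernel
        # (a uniform average of the candidate kernels, again an assumption).
        Kt = sum(Ks_train) / len(Ks_train)
        Kc = sum(Ks_cross) / len(Ks_cross)
        return krr_predict(Kt, Kc, y, lam)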