
Keyword Search Result

[Keyword] regularized subspace information criterion (3 hits)

  • A New Meta-Criterion for Regularized Subspace Information Criterion

    Yasushi HIDAKA, Masashi SUGIYAMA

    PAPER-Pattern Recognition
    Vol: E90-D No:11  Page(s): 1779-1786

    In order to obtain better generalization performance in supervised learning, model parameters should be determined appropriately, i.e., so that the generalization error is minimized. However, since the generalization error is inaccessible in practice, the model parameters are usually determined so that an estimator of the generalization error is minimized. The regularized subspace information criterion (RSIC) is such a generalization error estimator for model selection. RSIC includes an additional regularization parameter, which must itself be determined appropriately for good model selection. A meta-criterion for determining this regularization parameter has been proposed and shown to be useful in practice. In this paper, we point out several drawbacks of the existing meta-criterion and propose an alternative meta-criterion that resolves them. Through simulations, we show that the new meta-criterion further improves model selection performance.
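
    The two-level selection described in this abstract can be sketched in a few lines. The Python sketch below is illustrative only: a Cp-style penalized residual stands in for RSIC, and a cross-validation-based score stands in for the paper's meta-criterion; the actual RSIC and meta-criterion formulas are given in the paper. The inner level picks the model's regularization strength, the outer level picks the criterion's own regularization parameter.

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 8))
        y = X @ rng.normal(size=8) + 0.3 * rng.normal(size=60)

        def criterion(lam, gamma, X, y):
            # Hypothetical stand-in for RSIC: a Cp-style generalization
            # error estimate for ridge regression with strength `lam`.
            # `gamma` plays the role of RSIC's own regularization
            # parameter (here it merely scales the complexity penalty).
            n, d = X.shape
            A = X.T @ X + lam * np.eye(d)
            w = np.linalg.solve(A, X.T @ y)
            dof = np.trace(X @ np.linalg.solve(A, X.T))  # effective dim.
            return np.sum((y - X @ w) ** 2) / n + gamma * dof / n

        def meta_criterion(gamma, X, y, lam_grid):
            # Hypothetical stand-in for the meta-criterion: 5-fold
            # prediction error of the model that the criterion (with
            # this `gamma`) ends up selecting. NOT the paper's formula.
            folds = np.array_split(rng.permutation(len(y)), 5)
            err = 0.0
            for f in folds:
                tr = np.setdiff1d(np.arange(len(y)), f)
                lam = min(lam_grid,
                          key=lambda l: criterion(l, gamma, X[tr], y[tr]))
                A = X[tr].T @ X[tr] + lam * np.eye(X.shape[1])
                w = np.linalg.solve(A, X[tr].T @ y[tr])
                err += np.mean((y[f] - X[f] @ w) ** 2)
            return err / len(folds)

        # Outer level selects gamma; inner level selects lambda.
        lam_grid = np.logspace(-3, 1, 9)
        gamma_grid = np.logspace(-2, 1, 7)
        best_gamma = min(gamma_grid,
                         key=lambda g: meta_criterion(g, X, y, lam_grid))
        best_lam = min(lam_grid,
                       key=lambda l: criterion(l, best_gamma, X, y))
        print(f"gamma = {best_gamma:.3g}, lambda = {best_lam:.3g}")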

  • Analytic Optimization of Adaptive Ridge Parameters Based on Regularized Subspace Information Criterion

    Shun GOKITA, Masashi SUGIYAMA, Keisuke SAKURAI

    PAPER-Neural Networks and Bioengineering
    Vol: E90-A No:11  Page(s): 2584-2592

    In order to obtain better learning results in supervised learning, it is important to choose model parameters appropriately. Model selection is usually carried out by preparing a finite set of model candidates, estimating the generalization error of each candidate, and choosing the best one. Increasing the number of candidates may improve the optimization quality, but it also increases the computational cost. In this paper, we focus on a generalization error estimator called the regularized subspace information criterion and derive an analytic form of the optimal model parameter over a set of infinitely many model candidates. This allows us to maximize the optimization quality while keeping the computational cost moderate.
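
    To make the cost argument concrete: assuming "adaptive ridge" means a separate ridge parameter per coefficient (an assumption here; see the paper for the exact model), a grid search must enumerate |grid|^d combinations, which explodes with the input dimension d. The sketch below shows the exhaustive version for d = 2, with a crude hold-out error as a stand-in for RSIC; the paper's analytic optimizer removes this enumeration entirely.

        import numpy as np

        def adaptive_ridge_fit(X, y, c):
            # Ridge fit with a separate regularization parameter per
            # coefficient: w = (X'X + diag(c))^{-1} X'y.
            A = X.T @ X + np.diag(c)
            return np.linalg.solve(A, X.T @ y)

        rng = np.random.default_rng(1)
        X = rng.normal(size=(40, 2))
        y = X @ np.array([1.5, 0.0]) + 0.2 * rng.normal(size=40)
        Xtr, ytr, Xte, yte = X[:30], y[:30], X[30:], y[30:]

        # Exhaustive search over |grid|**2 candidate pairs; a hold-out
        # error stands in for RSIC here. With d coefficients the grid
        # grows as |grid|**d, which is what an analytic solution avoids.
        grid = np.logspace(-3, 2, 12)
        best = min(
            ((c1, c2) for c1 in grid for c2 in grid),
            key=lambda c: np.mean(
                (yte - Xte @ adaptive_ridge_fit(Xtr, ytr, np.array(c))) ** 2),
        )
        print("grid-selected ridge parameters:", best)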

  • Analytic Optimization of Shrinkage Parameters Based on Regularized Subspace Information Criterion

    Masashi SUGIYAMA, Keisuke SAKURAI

    PAPER-Neural Networks and Bioengineering
    Vol: E89-A No:8  Page(s): 2216-2225

    To obtain better generalization capability in supervised learning, model parameters should be optimized, i.e., determined in such a way that the generalization error is minimized. However, since the generalization error is inaccessible in practice, model parameters are usually determined so that an estimate of the generalization error is minimized. A standard procedure for model parameter optimization is to prepare a finite set of candidate parameter values, estimate the generalization error for each candidate, and choose the best one. Increasing the number of candidates may improve the optimization quality, but it also increases the computational cost. In this paper, we give methods for analytically finding the optimal model parameter value from a set of infinitely many candidates. This maximally enhances the optimization quality while keeping the computational cost reasonable.
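
    The flavor of such a closed-form optimum can be seen in a textbook toy problem (this is not the paper's RSIC-based derivation): when an unbiased Gaussian estimate is shrunk by a factor c, the squared-error risk is quadratic in c, so the optimum over the continuum of candidate values is available analytically rather than by grid search.

        import numpy as np

        # Toy model: theta_hat ~ N(theta, sigma^2), estimator c * theta_hat.
        # Risk E[(c*theta_hat - theta)^2]
        #   = c^2 (theta^2 + sigma^2) - 2 c theta^2 + theta^2
        # is quadratic in c, so the minimizer over all c is closed-form:
        #   c* = theta^2 / (theta^2 + sigma^2).
        theta, sigma = 2.0, 1.0
        c_star = theta**2 / (theta**2 + sigma**2)

        # Numerical check against a dense grid of candidates.
        rng = np.random.default_rng(2)
        theta_hat = theta + sigma * rng.normal(size=50_000)
        grid = np.linspace(0.0, 1.0, 1001)
        risks = [np.mean((c * theta_hat - theta) ** 2) for c in grid]
        print(f"analytic c* = {c_star:.3f}, "
              f"grid argmin = {grid[np.argmin(risks)]:.3f}")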