Evaluating the generalization performance of learning machines without additional test samples is one of the most important issues in the machine learning community. The subspace information criterion (SIC) is one method for this purpose, and it has been shown to be an unbiased estimator of the generalization error with finite samples. Although the mean of SIC agrees with the true generalization error even in small-sample cases, the scatter of SIC can be large under some severe conditions. In this paper, we therefore investigate the causes of this loss of precision and discuss how the precision of SIC can be improved.
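The idea behind an SIC-style estimator can be sketched for linear learners. This is a toy illustration, not the paper's exact construction: it assumes a correctly specified linear model, a known noise variance, and uses ordinary least squares as the unbiased reference estimator; the names `L_ridge`, `L_ols`, and `sic` are hypothetical.

```python
import numpy as np

# Toy regression setup (all values illustrative)
rng = np.random.default_rng(0)
n, d = 50, 5
sigma2 = 0.25  # assumed-known noise variance
X = rng.standard_normal((n, d))
theta_star = rng.standard_normal(d)          # true parameter
y = X @ theta_star + rng.normal(0.0, np.sqrt(sigma2), n)

# Linear learners of the form theta_hat = L @ y
lam = 1.0
L_ridge = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T)  # ridge learner
L_ols = np.linalg.pinv(X)  # OLS: unbiased reference (E[L_ols @ y] = theta_star)

def sic(L, L_u, y, sigma2):
    """SIC-style unbiased estimate of the generalization error
    E||L @ y - theta_star||^2, given an unbiased learner L_u."""
    D = L - L_u
    diff = D @ y
    # Unbiased estimate of the squared bias of L:
    bias2_hat = diff @ diff - sigma2 * np.trace(D @ D.T)
    # Variance of the learner L (exact for linear learners):
    variance = sigma2 * np.trace(L @ L.T)
    return bias2_hat + variance

est = sic(L_ridge, L_ols, y, sigma2)
```

Averaged over many noise realizations, `est` matches the true generalization error of the ridge learner, which is the unbiasedness property the abstract refers to; for a single draw, however, `est` can scatter widely (and even go negative), which is the precision issue the paper addresses.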
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Masashi SUGIYAMA, "Improving Precision of the Subspace Information Criterion," IEICE TRANSACTIONS on Fundamentals, vol. E86-A, no. 7, pp. 1885-1895, July 2003.
Abstract: Evaluating the generalization performance of learning machines without additional test samples is one of the most important issues in the machine learning community. The subspace information criterion (SIC) is one method for this purpose, and it has been shown to be an unbiased estimator of the generalization error with finite samples. Although the mean of SIC agrees with the true generalization error even in small-sample cases, the scatter of SIC can be large under some severe conditions. In this paper, we therefore investigate the causes of this loss of precision and discuss how the precision of SIC can be improved.
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/e86-a_7_1885/_p
@ARTICLE{e86-a_7_1885,
author={Masashi SUGIYAMA},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Improving Precision of the Subspace Information Criterion},
year={2003},
volume={E86-A},
number={7},
pages={1885-1895},
abstract={Evaluating the generalization performance of learning machines without additional test samples is one of the most important issues in the machine learning community. The subspace information criterion (SIC) is one method for this purpose, and it has been shown to be an unbiased estimator of the generalization error with finite samples. Although the mean of SIC agrees with the true generalization error even in small-sample cases, the scatter of SIC can be large under some severe conditions. In this paper, we therefore investigate the causes of this loss of precision and discuss how the precision of SIC can be improved.},
month={July}
}
TY - JOUR
TI - Improving Precision of the Subspace Information Criterion
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 1885
EP - 1895
AU - Masashi SUGIYAMA
PY - 2003
JO - IEICE TRANSACTIONS on Fundamentals
VL - E86-A
IS - 7
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - 2003/07//
AB - Evaluating the generalization performance of learning machines without additional test samples is one of the most important issues in the machine learning community. The subspace information criterion (SIC) is one method for this purpose, and it has been shown to be an unbiased estimator of the generalization error with finite samples. Although the mean of SIC agrees with the true generalization error even in small-sample cases, the scatter of SIC can be large under some severe conditions. In this paper, we therefore investigate the causes of this loss of precision and discuss how the precision of SIC can be improved.
ER -