Our purpose is to estimate conditional probabilities of output labels in multiclass classification problems. AdaBoost provides highly accurate classifiers and has the potential to estimate conditional probabilities. However, the conditional probabilities estimated by AdaBoost tend to overfit the training samples. We propose loss functions for boosting that provide shrinkage estimators. The regularization effect is realized by shrinking the probabilities toward the uniform distribution. Numerical experiments indicate that boosting algorithms based on the proposed loss functions yield significantly better conditional-probability estimates than existing boosting algorithms.
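The core idea stated in the abstract, regularizing a class-probability estimate by pulling it toward the uniform distribution, can be illustrated with a generic linear-shrinkage sketch. Note that the paper derives its estimators from specific boosting loss functions; the function and the shrinkage weight `lam` below are purely illustrative assumptions, not the paper's method:

```python
import numpy as np

def shrink_toward_uniform(p_hat, lam):
    """Shrink estimated class probabilities toward the uniform distribution.

    p_hat : array of shape (n_samples, n_classes); each row sums to 1
    lam   : shrinkage weight in [0, 1]; lam=0 keeps p_hat unchanged,
            lam=1 returns the uniform distribution
    """
    p_hat = np.asarray(p_hat, dtype=float)
    n_classes = p_hat.shape[1]
    uniform = np.full(n_classes, 1.0 / n_classes)
    # Convex combination: rows still sum to 1, and overconfident
    # estimates are pulled back toward 1/n_classes.
    return (1.0 - lam) * p_hat + lam * uniform

# An overconfident estimate is moderated toward (1/3, 1/3, 1/3)
p = shrink_toward_uniform([[0.98, 0.01, 0.01]], lam=0.3)
```

Since the result is a convex combination of two probability vectors, the output remains a valid probability distribution for any `lam` in [0, 1].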
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Takafumi KANAMORI, "Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability" in IEICE TRANSACTIONS on Information,
vol. E90-D, no. 12, pp. 2033-2042, December 2007, doi: 10.1093/ietisy/e90-d.12.2033.
Abstract: Our purpose is to estimate conditional probabilities of output labels in multiclass classification problems. AdaBoost provides highly accurate classifiers and has the potential to estimate conditional probabilities. However, the conditional probabilities estimated by AdaBoost tend to overfit the training samples. We propose loss functions for boosting that provide shrinkage estimators. The regularization effect is realized by shrinking the probabilities toward the uniform distribution. Numerical experiments indicate that boosting algorithms based on the proposed loss functions yield significantly better conditional-probability estimates than existing boosting algorithms.
URL: https://global.ieice.org/en_transactions/information/10.1093/ietisy/e90-d.12.2033/_p
@ARTICLE{e90-d_12_2033,
author={Takafumi KANAMORI},
journal={IEICE TRANSACTIONS on Information},
title={Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability},
year={2007},
volume={E90-D},
number={12},
pages={2033-2042},
abstract={Our purpose is to estimate conditional probabilities of output labels in multiclass classification problems. AdaBoost provides highly accurate classifiers and has the potential to estimate conditional probabilities. However, the conditional probabilities estimated by AdaBoost tend to overfit the training samples. We propose loss functions for boosting that provide shrinkage estimators. The regularization effect is realized by shrinking the probabilities toward the uniform distribution. Numerical experiments indicate that boosting algorithms based on the proposed loss functions yield significantly better conditional-probability estimates than existing boosting algorithms.},
keywords={},
doi={10.1093/ietisy/e90-d.12.2033},
ISSN={1745-1361},
month={December},}
TY - JOUR
TI - Multiclass Boosting Algorithms for Shrinkage Estimators of Class Probability
T2 - IEICE TRANSACTIONS on Information
SP - 2033
EP - 2042
AU - Takafumi KANAMORI
PY - 2007
DO - 10.1093/ietisy/e90-d.12.2033
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E90-D
IS - 12
JA - IEICE TRANSACTIONS on Information
Y1 - December 2007
AB - Our purpose is to estimate conditional probabilities of output labels in multiclass classification problems. AdaBoost provides highly accurate classifiers and has the potential to estimate conditional probabilities. However, the conditional probabilities estimated by AdaBoost tend to overfit the training samples. We propose loss functions for boosting that provide shrinkage estimators. The regularization effect is realized by shrinking the probabilities toward the uniform distribution. Numerical experiments indicate that boosting algorithms based on the proposed loss functions yield significantly better conditional-probability estimates than existing boosting algorithms.
ER -