This paper discusses recent developments in pattern recognition, focusing on the boosting approach in machine learning. Statistical properties, such as Bayes risk consistency, are discussed for several loss functions in a probabilistic framework. A number of loss functions have been proposed for different purposes and targets, and a unified derivation is given by a generator function U, which naturally defines an entropy, a divergence, and a loss function. The class of U-loss functions is associated with boosting learning algorithms for loss minimization; it includes AdaBoost and LogitBoost, a twin generated from the Kullback-Leibler divergence, as well as boosting for the (partial) area under the ROC curve. We extend boosting to unsupervised learning, typically density estimation employing the U-loss function. Finally, a future perspective on machine learning is discussed.
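To make the abstract's U-framework concrete, here is a minimal sketch in the notation commonly used for U-divergences in the boosting literature; the symbols C_U, H_U, D_U and the exact regularity conditions on U are illustrative assumptions, not taken verbatim from the paper. For a strictly convex, increasing generator U with inverse link \xi = (U')^{-1}:

\[
C_U(f,g) = \int \bigl\{ U(\xi(g(x))) - f(x)\,\xi(g(x)) \bigr\}\,dx, \qquad
H_U(f) = C_U(f,f), \qquad
D_U(f,g) = C_U(f,g) - H_U(f) \;\ge\; 0.
\]

Under this reading, U(t) = \exp(t) makes D_U the (extended) Kullback-Leibler divergence and yields the exponential loss minimized by AdaBoost, while U(t) = \log(1 + \exp(t)) yields the logistic loss of LogitBoost, whose divergence is the KL divergence between Bernoulli models; this is one way to view the two algorithms as the "twin" generated from the Kullback-Leibler divergence mentioned in the abstract.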
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Osamu KOMORI and Shinto EGUCHI, "Boosting Learning Algorithm for Pattern Recognition and Beyond," in IEICE TRANSACTIONS on Information,
vol. E94-D, no. 10, pp. 1863-1869, October 2011, doi: 10.1587/transinf.E94.D.1863.
Abstract: This paper discusses recent developments in pattern recognition, focusing on the boosting approach in machine learning. Statistical properties, such as Bayes risk consistency, are discussed for several loss functions in a probabilistic framework. A number of loss functions have been proposed for different purposes and targets, and a unified derivation is given by a generator function U, which naturally defines an entropy, a divergence, and a loss function. The class of U-loss functions is associated with boosting learning algorithms for loss minimization; it includes AdaBoost and LogitBoost, a twin generated from the Kullback-Leibler divergence, as well as boosting for the (partial) area under the ROC curve. We extend boosting to unsupervised learning, typically density estimation employing the U-loss function. Finally, a future perspective on machine learning is discussed.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.E94.D.1863/_p
@ARTICLE{e94-d_10_1863,
  author={Osamu KOMORI and Shinto EGUCHI},
  journal={IEICE TRANSACTIONS on Information},
  title={Boosting Learning Algorithm for Pattern Recognition and Beyond},
  year={2011},
  volume={E94-D},
  number={10},
  pages={1863-1869},
  abstract={This paper discusses recent developments in pattern recognition, focusing on the boosting approach in machine learning. Statistical properties, such as Bayes risk consistency, are discussed for several loss functions in a probabilistic framework. A number of loss functions have been proposed for different purposes and targets, and a unified derivation is given by a generator function U, which naturally defines an entropy, a divergence, and a loss function. The class of U-loss functions is associated with boosting learning algorithms for loss minimization; it includes AdaBoost and LogitBoost, a twin generated from the Kullback-Leibler divergence, as well as boosting for the (partial) area under the ROC curve. We extend boosting to unsupervised learning, typically density estimation employing the U-loss function. Finally, a future perspective on machine learning is discussed.},
  doi={10.1587/transinf.E94.D.1863},
  ISSN={1745-1361},
  month={October},
}
TY - JOUR
TI - Boosting Learning Algorithm for Pattern Recognition and Beyond
T2 - IEICE TRANSACTIONS on Information
SP - 1863
EP - 1869
AU - Osamu KOMORI
AU - Shinto EGUCHI
PY - 2011
DO - 10.1587/transinf.E94.D.1863
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E94-D
IS - 10
JA - IEICE TRANSACTIONS on Information
Y1 - October 2011
AB - This paper discusses recent developments in pattern recognition, focusing on the boosting approach in machine learning. Statistical properties, such as Bayes risk consistency, are discussed for several loss functions in a probabilistic framework. A number of loss functions have been proposed for different purposes and targets, and a unified derivation is given by a generator function U, which naturally defines an entropy, a divergence, and a loss function. The class of U-loss functions is associated with boosting learning algorithms for loss minimization; it includes AdaBoost and LogitBoost, a twin generated from the Kullback-Leibler divergence, as well as boosting for the (partial) area under the ROC curve. We extend boosting to unsupervised learning, typically density estimation employing the U-loss function. Finally, a future perspective on machine learning is discussed.
ER -