Machine learning is used in a wide range of fields, and demand for practical implementations is increasing. Within machine learning, Random Forest is a multi-class classifier that achieves high classification performance through bagging and random feature selection, and it is capable of fast training and classification. However, as a form of ensemble learning, Random Forest determines the final classification by majority vote over multiple decision trees, so many trees must be built. Classification performance improves as trees are added, at the cost of memory, and degrades when the number of trees is reduced. For this reason, the algorithm is poorly suited to small-scale hardware such as embedded systems. We have therefore proposed Boosted Random Forest, which introduces a boosting algorithm into Random Forest training to produce smaller, high-performance decision trees. When evaluated on databases from the UCI Machine Learning Repository, Boosted Random Forest matched or exceeded the performance of an ordinary Random Forest while reducing memory use by 47%. It is thus well suited to implementing Random Forests on embedded hardware with limited memory.
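As a rough illustration of the idea the abstract describes — combining boosting-style sample re-weighting with the per-tree random feature selection of a Random Forest — the following is a minimal, hypothetical sketch. It uses one-level trees (decision stumps) and binary ±1 labels for brevity; the paper's actual method trains full multi-class decision trees, and none of the names or details below come from the paper itself.

```python
import math
import random

def train_boosted_forest(X, y, n_trees=10, seed=0):
    """Hypothetical sketch: AdaBoost-style weighting over random stumps.

    X is a list of feature rows; y holds +1/-1 labels (binary here for
    simplicity; Boosted Random Forest as published is multi-class).
    """
    rng = random.Random(seed)
    n = len(X)
    w = [1.0 / n] * n            # sample weights, updated after each tree
    forest = []                  # (feature, threshold, polarity, alpha)
    for _ in range(n_trees):
        f = rng.randrange(len(X[0]))   # random feature selection, as in RF
        best = None
        for t in sorted({row[f] for row in X}):
            for pol in (1, -1):
                err = sum(wi for wi, row, yi in zip(w, X, y)
                          if (pol if row[f] >= t else -pol) != yi)
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        err = min(max(err, 1e-10), 1 - 1e-10)
        alpha = 0.5 * math.log((1 - err) / err)   # boosting tree weight
        forest.append((f, t, pol, alpha))
        # Re-weight: misclassified samples gain weight for the next tree.
        w = [wi * math.exp(-alpha * yi * (pol if row[f] >= t else -pol))
             for wi, row, yi in zip(w, X, y)]
        s = sum(w)
        w = [wi / s for wi in w]
    return forest

def predict(forest, row):
    # Weighted vote over all trees, rather than the plain majority
    # vote of an ordinary Random Forest.
    score = sum(alpha * (pol if row[f] >= t else -pol)
                for f, t, pol, alpha in forest)
    return 1 if score >= 0 else -1
```

Because each tree is weighted by its accuracy and later trees focus on samples earlier trees misclassified, fewer (and shallower) trees can reach a given accuracy — which is the memory-saving effect the abstract reports.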
Yohei MISHINA
Chubu University
Ryuei MURATA
Chubu University
Yuji YAMAUCHI
Chubu University
Takayoshi YAMASHITA
Chubu University
Hironobu FUJIYOSHI
Chubu University
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Yohei MISHINA, Ryuei MURATA, Yuji YAMAUCHI, Takayoshi YAMASHITA, Hironobu FUJIYOSHI, "Boosted Random Forest" in IEICE TRANSACTIONS on Information,
vol. E98-D, no. 9, pp. 1630-1636, September 2015, doi: 10.1587/transinf.2014OPP0004.
Abstract: Machine learning is used in a wide range of fields, and demand for practical implementations is increasing. Within machine learning, Random Forest is a multi-class classifier that achieves high classification performance through bagging and random feature selection, and it is capable of fast training and classification. However, as a form of ensemble learning, Random Forest determines the final classification by majority vote over multiple decision trees, so many trees must be built. Classification performance improves as trees are added, at the cost of memory, and degrades when the number of trees is reduced. For this reason, the algorithm is poorly suited to small-scale hardware such as embedded systems. We have therefore proposed Boosted Random Forest, which introduces a boosting algorithm into Random Forest training to produce smaller, high-performance decision trees. When evaluated on databases from the UCI Machine Learning Repository, Boosted Random Forest matched or exceeded the performance of an ordinary Random Forest while reducing memory use by 47%. It is thus well suited to implementing Random Forests on embedded hardware with limited memory.
URL: https://global.ieice.org/en_transactions/information/10.1587/transinf.2014OPP0004/_p
@ARTICLE{e98-d_9_1630,
author={Yohei MISHINA and Ryuei MURATA and Yuji YAMAUCHI and Takayoshi YAMASHITA and Hironobu FUJIYOSHI},
journal={IEICE TRANSACTIONS on Information},
title={Boosted Random Forest},
year={2015},
volume={E98-D},
number={9},
pages={1630-1636},
abstract={Machine learning is used in a wide range of fields, and demand for practical implementations is increasing. Within machine learning, Random Forest is a multi-class classifier that achieves high classification performance through bagging and random feature selection, and it is capable of fast training and classification. However, as a form of ensemble learning, Random Forest determines the final classification by majority vote over multiple decision trees, so many trees must be built. Classification performance improves as trees are added, at the cost of memory, and degrades when the number of trees is reduced. For this reason, the algorithm is poorly suited to small-scale hardware such as embedded systems. We have therefore proposed Boosted Random Forest, which introduces a boosting algorithm into Random Forest training to produce smaller, high-performance decision trees. When evaluated on databases from the UCI Machine Learning Repository, Boosted Random Forest matched or exceeded the performance of an ordinary Random Forest while reducing memory use by 47%. It is thus well suited to implementing Random Forests on embedded hardware with limited memory.},
keywords={},
doi={10.1587/transinf.2014OPP0004},
ISSN={1745-1361},
month={September}
}
TY - JOUR
TI - Boosted Random Forest
T2 - IEICE TRANSACTIONS on Information
SP - 1630
EP - 1636
AU - Yohei MISHINA
AU - Ryuei MURATA
AU - Yuji YAMAUCHI
AU - Takayoshi YAMASHITA
AU - Hironobu FUJIYOSHI
PY - 2015
DO - 10.1587/transinf.2014OPP0004
JO - IEICE TRANSACTIONS on Information
SN - 1745-1361
VL - E98-D
IS - 9
JA - IEICE TRANSACTIONS on Information
Y1 - September 2015
AB - Machine learning is used in a wide range of fields, and demand for practical implementations is increasing. Within machine learning, Random Forest is a multi-class classifier that achieves high classification performance through bagging and random feature selection, and it is capable of fast training and classification. However, as a form of ensemble learning, Random Forest determines the final classification by majority vote over multiple decision trees, so many trees must be built. Classification performance improves as trees are added, at the cost of memory, and degrades when the number of trees is reduced. For this reason, the algorithm is poorly suited to small-scale hardware such as embedded systems. We have therefore proposed Boosted Random Forest, which introduces a boosting algorithm into Random Forest training to produce smaller, high-performance decision trees. When evaluated on databases from the UCI Machine Learning Repository, Boosted Random Forest matched or exceeded the performance of an ordinary Random Forest while reducing memory use by 47%. It is thus well suited to implementing Random Forests on embedded hardware with limited memory.
ER -