The family of Quasi-Additive (QA) algorithms is a natural generalization of perceptron learning, an on-line learning scheme that maintains two parameter vectors: one is an accumulation of the input vectors, and the other is the weight vector used for prediction, related to the former by a nonlinear function. We show that the two vectors form a dually flat structure from the information-geometric point of view, and that this representation makes the convergence properties easier to discuss.
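To make the two-vector description concrete, here is a minimal sketch of a mistake-driven quasi-additive update in the standard QA form: a vector z accumulates the (signed) inputs, and the prediction weights are obtained componentwise as w = f(z) for a nonlinear link f. The function name, the mistake-driven trigger, and the default link are assumptions made for this illustration, not code from the paper.

```python
import numpy as np

def qa_train(X, y, link=np.tanh, eta=1.0, epochs=10):
    """Illustrative sketch of a quasi-additive (QA) learner.

    Two coupled parameter vectors are maintained:
      z -- an accumulation of the (signed) input vectors
      w -- the prediction weights, w = link(z) for a nonlinear link
    """
    z = np.zeros(X.shape[1])              # accumulated inputs
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            w = link(z)                   # weight vector coupled to z by the link
            if y_i * (w @ x_i) <= 0:      # mistake-driven update condition
                z += eta * y_i * x_i      # accumulate the misclassified input
    return link(z)

# Identity link recovers the classical perceptron on separable toy data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sign(X @ rng.normal(size=5))
w_hat = qa_train(X, y, link=lambda z: z)
print("training accuracy:", np.mean(np.sign(X @ w_hat) == y))
```

In the paper's information-geometric reading, z and w play the role of dual coordinates: when the componentwise link is monotone, w = f(z) is the gradient of a convex potential, which is what gives rise to the dually flat structure.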
Kazushi IKEDA, "Geometric Properties of Quasi-Additive Learning Algorithms" in IEICE TRANSACTIONS on Fundamentals,
vol. E89-A, no. 10, pp. 2812-2817, October 2006, doi: 10.1093/ietfec/e89-a.10.2812.
URL: https://global.ieice.org/en_transactions/fundamentals/10.1093/ietfec/e89-a.10.2812/_p
@ARTICLE{e89-a_10_2812,
author={Kazushi IKEDA},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Geometric Properties of Quasi-Additive Learning Algorithms},
year={2006},
volume={E89-A},
number={10},
pages={2812-2817},
doi={10.1093/ietfec/e89-a.10.2812},
ISSN={1745-1337},
month={October},
}
TY - JOUR
TI - Geometric Properties of Quasi-Additive Learning Algorithms
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 2812
EP - 2817
AU - IKEDA, Kazushi
PY - 2006
DO - 10.1093/ietfec/e89-a.10.2812
JO - IEICE TRANSACTIONS on Fundamentals
SN - 1745-1337
VL - E89-A
IS - 10
Y1 - 2006/10//
ER -