Several researchers have investigated how accurately a deep neural network can approximate a function that represents the generating pattern of data. However, prior work has confirmed only whether at least one function close to the data-generating function exists in the function class of a deep neural network as its parameter values vary. We therefore propose a new criterion for evaluating approximation accuracy: the existence ratio of functions close to the data-generating function within the function class. Moreover, through numerical simulations we show that, under the proposed criterion, a deep neural network with more layers approximates the data-generating function more accurately than one with fewer layers.
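The "existence ratio" idea can be illustrated with a small Monte Carlo sketch. The following is a hypothetical illustration, not the authors' code: the target function, network sizes, weight distribution, tolerance, and sup-norm distance are all assumptions chosen for the example. It draws random parameter vectors for a small ReLU network and counts the fraction whose induced function lies within a tolerance of an assumed target function.

```python
import numpy as np

# Hypothetical sketch of the "existence ratio" criterion (illustration only,
# not the paper's experimental setup): the fraction of randomly drawn
# parameter vectors whose network function is eps-close to an assumed
# target, here f(x) = sin(pi * x) on [-1, 1].

rng = np.random.default_rng(0)
xs = np.linspace(-1.0, 1.0, 101)          # evaluation grid
target = np.sin(np.pi * xs)               # assumed data-generating function

def mlp(x, weights, biases):
    """Forward pass of a small fully connected ReLU network on 1-D input."""
    h = x[:, None]
    for W, b in zip(weights[:-1], biases[:-1]):
        h = np.maximum(h @ W + b, 0.0)            # hidden layers: ReLU
    return (h @ weights[-1] + biases[-1])[:, 0]   # linear output layer

def existence_ratio(layer_sizes, eps=0.5, trials=2000):
    """Fraction of random N(0, 1) parameter draws whose network is
    eps-close to the target in the sup norm over the grid."""
    hits = 0
    for _ in range(trials):
        weights = [rng.normal(0.0, 1.0, (m, n))
                   for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
        biases = [rng.normal(0.0, 1.0, n) for n in layer_sizes[1:]]
        if np.max(np.abs(mlp(xs, weights, biases) - target)) < eps:
            hits += 1
    return hits / trials

shallow = existence_ratio([1, 8, 1])        # one hidden layer
deep = existence_ratio([1, 4, 4, 4, 1])     # three hidden layers
print(f"shallow ratio: {shallow:.4f}  deep ratio: {deep:.4f}")
```

With this crude sampling scheme both ratios can be very small or zero; the paper's claim is about the behavior of such a ratio under its own criterion, which this sketch does not reproduce.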
Yasushi ESAKI
Waseda University
Yuta NAKAHARA
Waseda University
Toshiyasu MATSUSHIMA
Waseda University
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Yasushi ESAKI, Yuta NAKAHARA, Toshiyasu MATSUSHIMA, "The Ratio of the Desired Parameters of Deep Neural Networks" in IEICE TRANSACTIONS on Fundamentals,
vol. E105-A, no. 3, pp. 433-435, March 2022, doi: 10.1587/transfun.2021TAL0003.
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/transfun.2021TAL0003/_p
@ARTICLE{e105-a_3_433,
author={Yasushi ESAKI and Yuta NAKAHARA and Toshiyasu MATSUSHIMA},
journal={IEICE TRANSACTIONS on Fundamentals},
title={The Ratio of the Desired Parameters of Deep Neural Networks},
year={2022},
volume={E105-A},
number={3},
pages={433--435},
doi={10.1587/transfun.2021TAL0003},
ISSN={1745-1337},
month={March}
}
TY - JOUR
TI - The Ratio of the Desired Parameters of Deep Neural Networks
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 433
EP - 435
AU - Yasushi ESAKI
AU - Yuta NAKAHARA
AU - Toshiyasu MATSUSHIMA
PY - 2022
DO - 10.1587/transfun.2021TAL0003
JO - IEICE TRANSACTIONS on Fundamentals
SN - 1745-1337
VL - E105-A
IS - 3
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - March 2022
ER -