This paper presents a quantized gradient descent algorithm for distributed nonconvex optimization in multiagent systems that accounts for the limited bandwidth of communication channels. Each agent encodes its estimate using a zoom-in parameter and sends the quantized intermediate variable to its neighboring agents. Each agent then updates its estimate by decoding the received information. We show that all agents achieve consensus and that their estimates converge to a critical point of the optimization problem. A numerical example of nonconvex logistic regression shows a trade-off between the convergence rate of the estimates and the communication bandwidth.
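The abstract describes an encode/send/decode/update cycle with a shrinking ("zoom-in") quantizer range. Below is a minimal Python sketch of that style of scheme, not the paper's exact algorithm: the uniform quantizer, the zoom-in factor gamma, the step sizes eps and alpha0, and the local cost functions are all assumptions introduced for illustration.

import numpy as np

# Hypothetical sketch of quantized distributed gradient descent with a
# "zoom-in" (shrinking) quantizer scale. All parameter names and constants
# here are illustrative assumptions, not values from the paper.

def uniform_quantize(v, levels=16):
    # Finite-level uniform quantizer on [-1, 1]; inputs outside saturate.
    return np.round(np.clip(v, -1.0, 1.0) * levels) / levels

def quantized_gradient_descent(grads, weights, x0, steps=2000,
                               s0=1.0, gamma=0.995, eps=0.3, alpha0=0.1):
    n, d = x0.shape
    x = x0.copy()             # x[i]: agent i's estimate
    xhat = np.zeros((n, d))   # decoder states; identical at every agent,
                              # since all agents apply the same decoding rule
    s = s0                    # current quantizer scale
    for k in range(steps):
        # Encode: each agent quantizes its scaled innovation and sends it.
        q = uniform_quantize((x - xhat) / s)
        # Decode: every agent reconstructs its neighbors' estimates.
        xhat = xhat + s * q
        # Update: consensus on decoded values plus a diminishing gradient step.
        g = np.stack([grads[i](x[i]) for i in range(n)])
        consensus = weights @ xhat - weights.sum(axis=1, keepdims=True) * xhat
        x = x + eps * consensus - (alpha0 / (k + 1)) * g
        s *= gamma            # zoom in: finer resolution as estimates settle
    return x

# Demo on a 4-agent ring with smooth nonconvex local costs
# f_i(x) = ||x - c_i||^2 / (1 + ||x - c_i||^2)  (an assumption for the demo).
rng = np.random.default_rng(0)
n, d = 4, 2
W = np.zeros((n, n))
for i in range(n):
    W[i, (i + 1) % n] = W[i, (i - 1) % n] = 0.25
centers = rng.normal(size=(n, d))
grads = [lambda x, c=c: 2 * (x - c) / (1 + np.sum((x - c) ** 2)) ** 2
         for c in centers]
print(quantized_gradient_descent(grads, W, rng.normal(size=(n, d))))

In this sketch, fewer quantization levels reduce the per-message bit budget but force a slower zoom-in (gamma closer to 1) to avoid saturating the quantizer, which mirrors the bandwidth/convergence-rate trade-off noted in the abstract.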
Junya YOSHIDA (Osaka University)
Naoki HAYASHI (Osaka University)
Shigemasa TAKAI (Osaka University)
Junya YOSHIDA, Naoki HAYASHI, Shigemasa TAKAI, "Quantized Gradient Descent Algorithm for Distributed Nonconvex Optimization" in IEICE TRANSACTIONS on Fundamentals,
vol. E106-A, no. 10, pp. 1297-1304, October 2023, doi: 10.1587/transfun.2023EAP1020.
URL: https://global.ieice.org/en_transactions/fundamentals/10.1587/transfun.2023EAP1020/_p
@ARTICLE{e106-a_10_1297,
author={Junya YOSHIDA and Naoki HAYASHI and Shigemasa TAKAI},
journal={IEICE TRANSACTIONS on Fundamentals},
title={Quantized Gradient Descent Algorithm for Distributed Nonconvex Optimization},
year={2023},
volume={E106-A},
number={10},
pages={1297-1304},
doi={10.1587/transfun.2023EAP1020},
ISSN={1745-1337},
month={October}
}
TY - JOUR
TI - Quantized Gradient Descent Algorithm for Distributed Nonconvex Optimization
T2 - IEICE TRANSACTIONS on Fundamentals
SP - 1297
EP - 1304
AU - Junya YOSHIDA
AU - Naoki HAYASHI
AU - Shigemasa TAKAI
PY - 2023
DO - 10.1587/transfun.2023EAP1020
JO - IEICE TRANSACTIONS on Fundamentals
SN - 1745-1337
VL - E106-A
IS - 10
JA - IEICE TRANSACTIONS on Fundamentals
Y1 - 2023/10//
ER -