
Quantized Gradient Descent Algorithm for Distributed Nonconvex Optimization

Junya YOSHIDA, Naoki HAYASHI, Shigemasa TAKAI

Summary:

This paper presents a quantized gradient descent algorithm for distributed nonconvex optimization in multiagent systems that accounts for the bandwidth limitation of communication channels. Each agent encodes its estimation variable with a zoom-in parameter and sends the quantized intermediate variable to its neighboring agents. Each agent then updates its estimate by decoding the received information. We show that all agents achieve consensus and that their estimates converge to a critical point of the optimization problem. A numerical example of nonconvex logistic regression shows a trade-off between the convergence rate of the estimation and the communication bandwidth.
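The encode/decode scheme described in the summary can be illustrated with a minimal sketch. This is not the paper's algorithm: it assumes a uniform quantizer whose range shrinks geometrically (a common form of zoom-in quantization), a hypothetical scalar nonconvex cost, a four-agent ring with illustrative mixing weights, and hand-picked parameter values. Each agent transmits only the quantized innovation between its estimate and a decoded copy that its neighbors can reconstruct, so the quantizer range can shrink as the estimates settle.

```python
import numpy as np

# Illustrative sketch of zoom-in quantized distributed gradient descent.
# Each agent keeps a local estimate x[i] and a decoded copy xhat[i] that
# every neighbor reconstructs identically; only the quantized innovation
# x[i] - xhat[i] is transmitted. All parameters here are assumptions,
# not values from the paper.

def quantize(v, scale, levels=16):
    """Uniform quantizer on [-scale, scale] with `levels` cells."""
    step = 2.0 * scale / levels
    return step * np.clip(np.round(v / step), -levels // 2, levels // 2)

def f_grad(x):
    # Gradient of the nonconvex toy cost f(x) = x**4/4 - x**2/2
    # (critical points at x = -1, 0, 1).
    return x**3 - x

# Doubly stochastic mixing matrix for a ring of four agents.
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

x = np.array([2.0, -1.5, 0.5, 1.0])    # initial estimates
xhat = np.zeros(4)                     # decoded copies (shared encoder/decoder state)
scale, gamma, alpha = 4.0, 0.98, 0.05  # quantizer range, zoom-in factor, step size

for _ in range(800):
    xhat = xhat + quantize(x - xhat, scale)        # transmit quantized innovation
    x = x + (W @ xhat - xhat) - alpha * f_grad(x)  # consensus + local gradient step
    scale *= gamma                                 # zoom in: shrink quantizer range

print(x)  # agents reach consensus near a critical point of f
```

The zoom-in factor `gamma` plays the role of the trade-off knob the summary mentions: a faster-shrinking range (or fewer quantization levels) saves bandwidth but can slow or destabilize convergence if the range shrinks faster than the estimates settle.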

Publication
IEICE TRANSACTIONS on Fundamentals Vol.E106-A No.10 pp.1297-1304
Publication Date
2023/10/01
Publicized
2023/04/13
Online ISSN
1745-1337
DOI
10.1587/transfun.2023EAP1020
Type of Manuscript
PAPER
Category
Systems and Control

Authors

Junya YOSHIDA
  Osaka University
Naoki HAYASHI
  Osaka University
Shigemasa TAKAI
  Osaka University

Keyword