Publication Information


Title
Japanese: 
English: Convergence Rate Bounds for the Mirror Descent Method: IQCs, the Popov Criterion and Bregman Divergence
Author
Japanese: Li Mengmou, Khaled Laib, 畑中 健志, Ioannis Lestas.  
English: Mengmou Li, Khaled Laib, Takeshi Hatanaka, Ioannis Lestas.  
Language English 
Journal/Book name
Japanese: 
English: Automatica
Volume, Number, Page vol. 171, article no. 111973
Published date Jan. 2025 
Publisher
Japanese: 
English: 
Conference name
Japanese: 
English: 
Conference site
Japanese: 
English: 
DOI https://doi.org/10.1016/j.automatica.2024.111973
Abstract This paper presents a comprehensive convergence analysis for the mirror descent (MD) method, a widely used algorithm in convex optimization. The key feature of this algorithm is that it provides a generalization of classical gradient-based methods via the use of generalized distance-like functions, which are formulated using the Bregman divergence. Establishing convergence rate bounds for this algorithm is in general a non-trivial problem due to the lack of monotonicity properties in the composite nonlinearities involved. In this paper, we show that the Bregman divergence from the optimal solution, which is commonly used as a Lyapunov function for this algorithm, is a special case of Lyapunov functions that follow when the Popov criterion is applied to an appropriate reformulation of the MD dynamics. This is then used as a basis to construct an integral quadratic constraint (IQC) framework through which convergence rate bounds with reduced conservatism can be deduced. We also illustrate via examples that the convergence rate bounds derived can be tight.
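To make the setting of the abstract concrete, the following is a minimal sketch of mirror descent with a Bregman divergence, written in Python. It uses the negative-entropy mirror map on the probability simplex (entropic mirror descent) and a hypothetical quadratic objective, and it tracks the Bregman divergence from the (numerically estimated) optimum, the classical Lyapunov-function candidate mentioned in the abstract. The objective, step size, and mirror map are illustrative assumptions, not taken from the paper, and the paper's IQC/Popov analysis is not reproduced here.

```python
import numpy as np

def neg_entropy_bregman(x, y):
    # Bregman divergence induced by the negative-entropy mirror map;
    # on the probability simplex this reduces to the KL divergence sum_i x_i log(x_i / y_i).
    return np.sum(x * np.log(x / y))

def mirror_descent_simplex(grad, x0, step=0.1, iters=200):
    # Entropic mirror descent (exponentiated gradient) on the probability simplex.
    x = x0.copy()
    trajectory = [x.copy()]
    for _ in range(iters):
        x = x * np.exp(-step * grad(x))   # mirror step taken in the dual space
        x = x / x.sum()                   # Bregman projection back onto the simplex
        trajectory.append(x.copy())
    return np.array(trajectory)

# Hypothetical example: minimize f(x) = 0.5 * ||x - c||^2 over the simplex.
c = np.array([0.2, 0.5, 0.3])
grad = lambda x: x - c
x0 = np.ones(3) / 3

traj = mirror_descent_simplex(grad, x0)
x_star = traj[-1]  # treat the final iterate as a proxy for the optimal solution

# The Bregman divergence from the optimum, D(x*, x_k), is the standard Lyapunov-function
# candidate for mirror descent; for a sufficiently small step it decreases along the iterates.
lyap = [neg_entropy_bregman(x_star, xk) for xk in traj[:-1]]
print("Lyapunov candidate at iterations 0, 50, 100:", lyap[0], lyap[50], lyap[100])
```

The printed values should decay toward zero, which is the qualitative behaviour whose rate the paper bounds via the Popov criterion and the IQC framework.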
