Publication Information


Title: Convergence Rate Bounds for the Mirror Descent Method: IQCs, the Popov Criterion and Bregman Divergence
Authors: Mengmou Li, Khaled Laib, Takeshi Hatanaka, Ioannis Lestas
Language: English
Journal: Automatica
Volume, Pages: vol. 171, article no. 111973
Publication date: January 2025
DOI: https://doi.org/10.1016/j.automatica.2024.111973
Abstract: This paper presents a comprehensive convergence analysis for the mirror descent (MD) method, a widely used algorithm in convex optimization. The key feature of this algorithm is that it provides a generalization of classical gradient-based methods via the use of generalized distance-like functions, which are formulated using the Bregman divergence. Establishing convergence rate bounds for this algorithm is in general a non-trivial problem due to the lack of monotonicity properties in the composite nonlinearities involved. In this paper, we show that the Bregman divergence from the optimal solution, which is commonly used as a Lyapunov function for this algorithm, is a special case of Lyapunov functions that follow when the Popov criterion is applied to an appropriate reformulation of the MD dynamics. This is then used as a basis to construct an integral quadratic constraint (IQC) framework through which convergence rate bounds with reduced conservatism can be deduced. We also illustrate via examples that the convergence rate bounds derived can be tight.
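For readers unfamiliar with the algorithm the paper analyzes, below is a minimal sketch of mirror descent with the negative-entropy mirror map (whose Bregman divergence is the KL divergence), applied to a toy linear objective on the probability simplex. This illustrates the general method only; the cost vector, step size, and function names are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

def mirror_descent_simplex(grad, x0, step, iters):
    """Mirror descent on the probability simplex with the
    negative-entropy mirror map (exponentiated gradient update).
    The induced Bregman divergence is the KL divergence."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        # Dual-space gradient step followed by the mirror map back
        # to the primal space; renormalization is the Bregman
        # projection onto the simplex for this mirror map.
        x = x * np.exp(-step * grad(x))
        x /= x.sum()
    return x

# Toy example (illustrative): minimize f(x) = <c, x> over the simplex.
# The optimum puts all mass on the coordinate with the smallest cost.
c = np.array([0.9, 0.1, 0.5])
x_star = mirror_descent_simplex(lambda x: c, np.ones(3) / 3,
                                step=0.5, iters=200)
```

For a linear objective the iterates concentrate exponentially fast on the minimizing coordinate; the paper's contribution is bounding such convergence rates for general convex problems via IQCs and the Popov criterion.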
