AN ADAPTIVE VARIANT OF THE FRANK–WOLFE METHOD FOR RELATIVELY SMOOTH CONVEX OPTIMIZATION PROBLEMS
- Authors: Vyguzov A.A. (1, 2), Stonyakina F.S. (1, 2, 3)
Affiliations:
1. Moscow Institute of Physics and Technology
2. Innopolis University
3. V.I. Vernadsky Crimean Federal University, Republic of Crimea
- Issue: Vol. 65, No. 3 (2025)
- Pages: 364–375
- Section: Optimal control
- URL: https://bakhtiniada.ru/0044-4669/article/view/293545
- DOI: https://doi.org/10.31857/S0044466925030105
- EDN: https://elibrary.ru/HSRUGM
- ID: 293545
About the authors
A. Vyguzov
Moscow Institute of Physics and Technology; Innopolis University
Email: al.vyguzov@yandex.ru
Dolgoprudny, 141701 Russia; Innopolis, 420500 Russia
F. Stonyakina
Moscow Institute of Physics and Technology; Innopolis University; V.I. Vernadsky Crimean Federal University, Republic of Crimea
Email: fedyor@mail.ru
Dolgoprudny, 141701 Russia; Innopolis, 420500 Russia; Simferopol, 295007 Russia
