ADAPTIVE PRIMAL–DUAL METHODS WITH AN INEXACT ORACLE FOR RELATIVELY SMOOTH OPTIMIZATION PROBLEMS AND THEIR APPLICATIONS TO RECOVERING LOW-RANK MATRICES
- Authors: Savchuk O.S.1,2, Stonyakin F.S.1,2,3, Vyguzov A.A.1,3, Alkousa M.S.3, Gasnikov A.V.1,3
Affiliations:
- NRU MIPT
- V.I. Vernadsky Crimean Federal University
- Innopolis University
- Issue: Vol. 65, No. 7 (2025)
- Pages: 1156–1177
- Section: General numerical methods
- URL: https://bakhtiniada.ru/0044-4669/article/view/304082
- DOI: https://doi.org/10.31857/S0044466925070077
- EDN: https://elibrary.ru/JXZDQW
- ID: 304082
About the authors
O. Savchuk
NRU MIPT; V.I. Vernadsky Crimean Federal University
Email: oleg.savchuk19@mail.ru
Dolgoprudny, Russia; Simferopol, Russia
F. Stonyakin
NRU MIPT; V.I. Vernadsky Crimean Federal University; Innopolis University
Email: fedvor@mail.ru
Dolgoprudny, Russia; Simferopol, Russia; Innopolis, Russia
A. Vyguzov
NRU MIPT; Innopolis University
Email: al.vyguzov@yandex.ru
Dolgoprudny, Russia; Innopolis, Russia
M. Alkousa
Innopolis University
Email: m.alkousa@innopolis.ru
Innopolis, Russia
A. Gasnikov
NRU MIPT; Innopolis University
Email: gasnikov@yandex.ru
Dolgoprudny, Russia; Innopolis, Russia
References
- Nesterov Yu. Lectures on Convex Optimization. Vol. 137. Springer, 2018.
- Beck A. First-Order Methods in Optimization. SIAM, 2017.
- Lu H. Relative continuity for non-Lipschitz nonsmooth convex optimization using stochastic (or deterministic) mirror descent // INFORMS Journal on Optimization. 2019. 1(4):288–303.
- Bauschke H., Bolte J., Teboulle M. A descent lemma beyond Lipschitz gradient continuity: first-order methods revisited and applications // Math. Oper. Res. 2017. 42(2):330–348.
- Lu H., Freund R., Nesterov Yu. Relatively smooth convex optimization by first-order methods, and applications // SIAM J. Optim. 2018. 28(1):333–354.
- Nesterov Yu. Implementable tensor methods in unconstrained convex optimization // Math. Program. 2021. 186:157–183.
- Nesterov Yu. Inexact accelerated high-order proximal-point methods // Math. Program. 2023. P. 1–26.
- Bertero M., Boccacci P., Desidera G., Vicidomini G. Image deblurring with Poisson data: from cells to galaxies // Inverse Probl. 2009. 25(12):123006.
- Csiszár I. Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems // Ann. Stat. 1991. 19(4):2032–2066.
- Kiefer J., Wolfowitz J. Optimum designs in regression problems // Ann. Math. Stat. 1959. 30(2):271–294.
- Atwood C.L. Optimal and efficient designs of experiments // Ann. Math. Stat. 1969. P. 1570–1602.
- Zhou Y., Liang Y., Shen L. A simple convergence analysis of Bregman proximal gradient algorithm // Comput. Optim. Appl. 2019. 73:903–912.
- Dragomir R.A. Bregman Gradient Methods for Relatively-Smooth Optimization. PhD thesis, UT1 Capitole, 2021.
- Hanzely F., Richtárik P., Xiao L. Accelerated Bregman proximal gradient methods for relatively smooth convex optimization // Comput. Optim. Appl. 2021. 79:405–440.
- Dragomir R.A., d'Aspremont A., Bolte J. Quartic first-order methods for low-rank minimization // J. Optim. Theory Appl. 2021. 189:341–363.
- Burer S., Monteiro R. Local minima and convergence in low-rank semidefinite programming // Math. Program. 2005. 103(3):427–444.
- Bottou L., Curtis F., Nocedal J. Optimization methods for large-scale machine learning // SIAM Review. 2018. 60(2):223–311.
- Bergstra J., Bengio Y. Random search for hyper-parameter optimization // JMLR. 2012. 13(2).
- Gasnikov A.V. Modern Numerical Optimization Methods. The Universal Gradient Descent Method. Moscow: MCCME, 2021.
- Stonyakin F., Tyurin A., Gasnikov A., Dvurechensky P., Agafonov A., Dvinskikh D., Alkousa M., Pasechnyuk D., Artamonov S., Piskunova V. Inexact model: a framework for optimization and variational inequalities // Optim. Methods Softw. 2021. 36(6):1155–1201.
- Tyurin A.I. Primal-dual fast gradient method with a model // Computer Research and Modeling. 2020. 12(2):263–274.
- Nesterov Yu. Primal-dual subgradient methods for convex problems // Math. Program. 2009. 120(1):221–259.
- Gasnikov A.V., Tyurin A.I. Fast gradient descent for convex minimization problems with an oracle producing a (δ, L)-model of the function at the requested point // Zh. Vychisl. Matem. i Matem. Fiz. 2019. 59(7):1137–1150.
- Ben-Tal A., Nemirovski A. Lectures on Modern Convex Optimization. 2012.
- Dragomir R.A., Taylor A., d'Aspremont A., Bolte J. Optimal complexity and certification of Bregman first-order methods // Math. Program. 2021. 194.
- Bogolubsky L., Dvurechensky P., Gasnikov A., Gusev G., Nesterov Yu., Raigorodskii A., Tikhonov A., Zhukovskii M. Learning supervised PageRank with gradient-based and gradient-free optimization methods // NeurIPS. 2016. 29.
- Devolder O., Glineur F., Nesterov Yu. First-order methods of smooth convex optimization with inexact oracle // Math. Program. 2014. 146:37–75.
- Kamzolov D., Dvurechensky P., Gasnikov A. Universal intermediate gradient method for convex problems with inexact oracle // Optim. Methods Softw. 2021. 36(6):1289–1316.
- Nesterov Yu. Universal gradient methods for convex optimization problems // Math. Program. 2015. 152(1):381–404.
- Nesterov Yu.E., Gasnikov A.V., Dvurechensky P.E. Stochastic gradient methods with inexact oracle // Proceedings of MIPT. 2016. 8(1):41–91.
- Hendrikx H., Xiao L., Bubeck S., Bach F., Massoulié L. Statistically preconditioned accelerated gradient method for distributed optimization. 2020.
- Boyd S., Vandenberghe L. Convex Optimization. Cambridge University Press, 2004.
- Savchuk O.S., Alkousa M.S., Shushko A.S., Vyguzov A.A., Stonyakin F.S., Pasechnyuk D.A., Gasnikov A.V. Accelerated Bregman gradient methods for relatively smooth and relatively Lipschitz continuous minimization problems // arXiv preprint arXiv:2411.16743. 2024.
- Nesterov Yu. Complexity bounds for primal-dual methods minimizing the model of objective function // Math. Program. 2018. 171(1):311–330.
- Jaggi M. Revisiting Frank–Wolfe: projection-free sparse convex optimization // ICML. 2013. V. 28.
- Harchaoui Z., Juditsky A., Nemirovski A. Conditional gradient algorithms for norm-regularized smooth convex optimization // Math. Program. 2015. 152.
- Samaria F., Harter A. Parameterisation of a stochastic model for human face identification // Proceedings of the 1994 IEEE Workshop on Applications of Computer Vision. 1994. P. 138–142.
- Bayandina A., Dvurechensky P., Gasnikov A., Stonyakin F., Titov A. Mirror descent and convex optimization problems with non-smooth inequality constraints // Large-Scale and Distributed Optimization. Springer, 2018. P. 181–213.
- Aivazian G.V., Stonyakin F.S., Pasechnyuk D.A., Alkousa M.S., Raigorodsky A.M., Baran I.V. Adaptive variant of the Frank–Wolfe algorithm for convex optimization problems // Programming and Computer Software. 2023. 49(6):493–504.
Supplementary files
