Convergence of the operator extrapolation method




Keywords: variational inequality, pseudo-monotonicity, sequential weak continuity, Bregman divergence, operator extrapolation, adaptivity, weak convergence


One of the active areas of modern applied nonlinear analysis is the study of variational inequalities and the development of methods for approximating their solutions. Many important problems of operations research, optimal control theory, and mathematical physics can be written in the form of variational inequalities. Non-smooth optimization problems can be solved effectively if they are reformulated as saddle point problems and modern approximate algorithms for variational inequalities are applied to the resulting saddle point problems. With the advent of generative adversarial networks (GANs), strong interest in the use and investigation of iterative algorithms for solving variational inequalities arose in the machine learning community. This paper is devoted to the study of two new approximate algorithms with Bregman projection for solving variational inequalities in a Hilbert space. The first algorithm, which we call the operator extrapolation algorithm, is obtained by replacing the Euclidean metric in the Malitsky–Tam method with the Bregman divergence. An attractive feature of the algorithm is that it requires only one computation of the Bregman projection onto the feasible set per iteration. The second algorithm is an adaptive version of the first, in which the rule for updating the step size requires neither knowledge of the Lipschitz constants nor the calculation of operator values at additional points. For variational inequalities with pseudo-monotone, Lipschitz-continuous, and sequentially weakly continuous operators acting in a Hilbert space, weak convergence theorems are proved.
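For illustration, the Euclidean special case of such an operator extrapolation step (taking the Bregman divergence to be the squared Euclidean distance, so that the Bregman projection reduces to the metric projection) can be sketched as follows. The skew-symmetric operator, the feasible set, and the step size below are illustrative choices, not examples from the paper; the update is a Malitsky–Tam-type forward-reflected step with a constant step size lam < 1/(2L), where L is the Lipschitz constant of the operator.

```python
import numpy as np

def operator_extrapolation(F, proj, x0, lam=0.2, iters=2000):
    """Sketch of the Euclidean operator extrapolation (forward-reflected) step:
        x_{k+1} = P_C( x_k - lam * (2*F(x_k) - F(x_{k-1})) ).
    Note: only ONE projection and ONE new operator evaluation per iteration."""
    x_prev = np.asarray(x0, dtype=float)
    F_prev = F(x_prev)
    x = proj(x_prev - lam * F_prev)      # plain projected step to initialize
    for _ in range(iters):
        Fx = F(x)
        x_next = proj(x - lam * (2.0 * Fx - F_prev))
        x_prev, F_prev, x = x, Fx, x_next
    return x

# Monotone (skew-symmetric) test operator F(x) = M x with Lipschitz constant L = 1;
# the unique solution of the corresponding VI on C = R^2 is x* = 0.
M = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
F = lambda x: M @ x
proj = lambda y: y                        # feasible set C = R^2: projection is identity
x_star = operator_extrapolation(F, proj, np.array([1.0, 1.0]), lam=0.2)
print(np.linalg.norm(x_star))             # small: the iterates approach x* = 0
```

Note that a plain projected-gradient iteration diverges on this rotation operator, while the extrapolated step converges for lam < 1/(2L) = 0.5. The adaptive version described in the abstract would replace the constant lam with a step size updated from the already computed quantities ||x_k - x_{k-1}|| and ||F(x_k) - F(x_{k-1})||, so no Lipschitz constant and no extra operator evaluations are needed.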


Nemirovski, A. (2004). Prox-method with rate of convergence O(1/T) for variational inequalities with Lipschitz continuous monotone operators and smooth convex-concave saddle point problems. SIAM J. Optimization, 15, pp. 229-251.

Gidel, G., Berard, H., Vincent, P., Lacoste-Julien, S. (2018). A Variational Inequality Perspective on Generative Adversarial Networks. arXiv preprint arXiv:1802.10551.

Korpelevich, G. M. (1976). An extragradient method for finding saddle points and for other problems. Matecon, 12, No. 4, pp. 747-756.

Tseng, P. (2000). A modified forward-backward splitting method for maximal monotone mappings. SIAM J. Control and Optimization, 38, pp. 431-446.

Verlan, D. A., Semenov, V. V., Chabak, L. M. (2015). A Strongly Convergent Modified Extragradient Method for Variational Inequalities with Non-Lipschitz Operators. J. Automation and Inform. Sci., 47, No. 7, pp. 31-46.

Nesterov, Yu. (2007). Dual extrapolation and its applications to solving variational inequalities and related problems. Math. Programming, 109, Iss. 2-3, pp. 319-344.

Denisov, S. V., Semenov, V. V., Stetsyuk, P. I. (2019). Bregman Extragradient Method with Monotone Rule of Step Adjustment. Cybern. Syst. Analysis, 55, Iss. 3, pp. 377-383.

Beck, A. (2017). First-Order Methods in Optimization. Philadelphia: SIAM, 479 p.

Popov, L. D. (1980). A modification of the Arrow-Hurwicz method for search of saddle points. Mathematical notes of the Academy of Sciences of the USSR, 28, Iss. 5, pp. 845-848.

Malitsky, Yu. V. & Semenov, V. V. (2014). An Extragradient Algorithm for Monotone Variational Inequalities. Cybern. Syst. Analysis, 50, Iss. 2, pp. 271-277.

Lyashko, S. I. & Semenov, V. V. (2016). A New Two-Step Proximal Algorithm of Solving the Problem of Equilibrium Programming. In: Goldengorin, B. (ed.) Optimization and Its Applications in Control and Data Sciences. Springer Optimization and Its Applications, vol. 115. Springer, pp. 315-325.

Semenov, V. V. (2017). A Version of the Mirror Descent Method to Solve Variational Inequalities. Cybern. Syst. Analysis, 53, Iss. 2, pp. 234-243.

Chabak, L., Semenov, V. & Vedel, Y. (2019). A New Non-Euclidean Proximal Method for Equilibrium Problems. In: Chertov O., Mylovanov T., Kondratenko Y., Kacprzyk J., Kreinovich V., Stefanuk V. (eds.) Recent Developments in Data Science and Intelligent Analysis of Information. ICDSIAI 2018. Advances in Intelligent Systems and Computing, vol. 836. Springer, Cham, pp. 50-58.

Malitsky, Y. & Tam, M. K. (2020). A Forward-Backward Splitting Method for Monotone Inclusions Without Cocoercivity. SIAM Journal on Optimization, 30, No. 2, pp. 1451-1472.

Csetnek, E. R., Malitsky, Y. & Tam, M. K. (2019). Shadow Douglas-Rachford Splitting for Monotone Inclusions. Appl. Math. Optim., 80, pp. 665-678.



How to Cite

Semenov V., Siryk D., & Kharkov O. (2021). Convergence of the operator extrapolation method. Reports of the National Academy of Sciences of Ukraine, (4), 28–35.



Information Science and Cybernetics