## About

86 Publications · 5,655 Reads


343 Citations


## Publications


The paper is devoted to subgradient methods with switching between productive and nonproductive steps for problems of minimization of quasiconvex functions under functional inequality constraints. For the problem of minimizing a convex function with quasiconvex inequality constraints, a result is obtained on the convergence of the subgradient metho...
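The switching technique described above can be illustrated by a minimal one-dimensional sketch of the standard scheme (a Polyak-type step-size is assumed; the paper's exact method may differ):

```python
def switching_subgradient(g, sub_f, sub_g, x0, eps, n_iter=1000):
    """Minimize a convex f subject to g(x) <= 0 (1-D for readability)."""
    x = x0
    productive = []                      # iterates where the constraint is nearly satisfied
    for _ in range(n_iter):
        if g(x) <= eps:                  # "productive" step along a subgradient of f
            h = sub_f(x)
            productive.append(x)
        else:                            # "nonproductive" step reduces constraint violation
            h = sub_g(x)
        if h == 0:
            break
        x = x - (eps / (h * h)) * h      # Polyak-type step-size eps / |h|^2
    if not productive:
        return x
    return sum(productive) / len(productive)   # averaged productive iterate

# toy problem: minimize f(x) = |x - 2| subject to |x| - 1 <= 0; solution x* = 1
g = lambda x: abs(x) - 1
sub_f = lambda x: 1.0 if x >= 2 else -1.0      # subgradient of |x - 2|
sub_g = lambda x: 1.0 if x >= 0 else -1.0      # subgradient of |x| - 1
x_hat = switching_subgradient(g, sub_f, sub_g, x0=5.0, eps=0.05)
```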

The article is devoted to some adaptive methods for variational inequalities with relatively smooth and relatively strongly monotone operators. Starting from the recently proposed proximal variant of the extragradient method for this class of problems, we investigate in detail the method with adaptively selected parameter values. An estimate of the...

A variant of the Frank-Wolfe method for convex optimization problems is proposed, with adaptive selection of the step parameter based on information about the smoothness of the objective function (the Lipschitz constant of the gradient). Theoretical estimates of the quality of the solution provided by the method are obtained in terms of adaptively se...
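A minimal sketch of a Frank-Wolfe step with backtracking ("adaptive") estimation of the gradient's Lipschitz constant, on an assumed toy problem (minimizing the distance to a feasible point `TARGET` over the unit 2-simplex); the paper's exact adaptive rule is not reproduced here:

```python
TARGET = (0.2, 0.8)                               # feasible, so the optimal value is 0

def f(p):
    return sum((p[i] - TARGET[i]) ** 2 for i in range(2))

def grad(p):
    return [2 * (p[i] - TARGET[i]) for i in range(2)]

def adaptive_frank_wolfe(x, L=0.1, n_iter=1000):
    for _ in range(n_iter):
        g = grad(x)
        i = 0 if g[0] <= g[1] else 1              # LMO over the 2-simplex: best vertex
        d = [(1.0 if j == i else 0.0) - x[j] for j in range(2)]
        gap = -(g[0] * d[0] + g[1] * d[1])        # Frank-Wolfe gap
        if gap <= 1e-12:
            break
        nd2 = d[0] ** 2 + d[1] ** 2
        while True:                               # backtracking on L
            gamma = min(1.0, gap / (L * nd2))
            y = [x[j] + gamma * d[j] for j in range(2)]
            if f(y) <= f(x) - gamma * gap + 0.5 * L * gamma ** 2 * nd2:
                break
            L *= 2.0
        x, L = y, max(0.1, L / 2.0)               # allow L to shrink again
    return x

x_hat = adaptive_frank_wolfe([1.0, 0.0])
```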

In this paper we propose a generalized condition for a sharp minimum, somewhat similar to the inexact oracle proposed recently by Devolder-Glineur-Nesterov. The proposed approach makes it possible to extend the class of applicability of subgradient methods with the Polyak step-size, to the situation of inexact information about the value of the min...
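The Polyak step-size with an inexact minimal value can be sketched as follows (a standard one-dimensional illustration, not the paper's generalized condition): the achievable accuracy degrades with the error in the estimate of the minimum.

```python
def polyak_subgradient(f, sub_f, x0, f_star_est, n_iter=500):
    """Subgradient method with the Polyak step-size, using an
    (inexact) estimate f_star_est of the minimal value of f."""
    x, best = x0, x0
    for _ in range(n_iter):
        h = sub_f(x)
        if h == 0:
            return x
        step = (f(x) - f_star_est) / (h * h)   # Polyak step with inexact f*
        x = x - step * h
        if f(x) < f(best):
            best = x
    return best

# toy: f(x) = |x|, true minimum 0; only an underestimate -0.01 is known
f = lambda x: abs(x)
sub_f = lambda x: 1.0 if x > 0 else -1.0
x_hat = polyak_subgradient(f, sub_f, x0=3.0, f_star_est=-0.01)
```

The iterates settle at accuracy of the order of the error in the estimate (here about 0.01).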

The article is devoted to a substantial extension of the recently proposed class of relatively strongly convex optimization problems in high-dimensional spaces. An analogue of the notion of relative strong convexity is introduced for variational inequalities (relative strong monotonicity), and convergence rate estimates are investigated for some numerica...

In this paper, we introduce some adaptive methods for solving variational inequalities with relatively strongly monotone operators. Firstly, we focus on a modification of the adaptive numerical method recently proposed for the smooth case [1] to the generalized smooth (with Hölder condition) saddle point problem, which has convergence rate estimates...

It is well-known that accelerated first-order gradient methods possess optimal complexity estimates for the class of convex smooth minimization problems. In many practical situations it makes sense to work with inexact gradient information. However, this can lead to an accumulation of corresponding inexactness in the theoretical estimates of the ra...
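The effect of gradient inexactness can be seen in a minimal sketch of Nesterov's accelerated method run with a perturbed gradient (the fixed bias below is an assumed stand-in for inexactness, not the paper's error model): the iteration converges to the perturbed stationary point rather than the true minimizer, so the bias sets the accuracy floor.

```python
def agd_inexact(grad, L, x0, n_iter=200):
    """Nesterov's accelerated gradient method with an inexact gradient oracle."""
    x = y = x0
    t = 1.0
    for _ in range(n_iter):
        x_new = y - grad(y) / L                        # gradient step from the extrapolated point
        t_new = 0.5 * (1 + (1 + 4 * t * t) ** 0.5)     # standard momentum sequence
        y = x_new + (t - 1) / t_new * (x_new - x)      # extrapolation
        x, t = x_new, t_new
    return x

L = 2.0
bias = 1e-3                                # size of the gradient inexactness
grad = lambda x: 2 * x + bias              # inexact gradient of f(x) = x^2 (true minimizer: 0)
x_hat = agd_inexact(grad, L, x0=1.0)
```

Here the method converges to about `-bias / 2` instead of the exact minimizer 0.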

We introduce a notion of inexact model of a convex objective function, which allows for errors both in the function and in its gradient. For this situation, a gradient method with an adaptive adjustment of some parameters of the model is proposed and an estimate for the convergence rate is found. This estimate is optimal on a class of sufficiently...
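A minimal sketch of such adaptive parameter adjustment, assuming a (delta, L)-type model inequality and the standard doubling/halving rule (a simplification of the methods discussed above):

```python
def adaptive_gradient(f, grad, x0, L0=1.0, delta=1e-9, n_iter=100):
    """Gradient method that adapts the model parameter L by doubling
    until the inexact model upper bound is satisfied."""
    x, L = list(x0), L0
    for _ in range(n_iter):
        g = grad(x)
        gn2 = sum(gi * gi for gi in g)
        while True:
            y = [x[i] - g[i] / L for i in range(len(x))]
            # model inequality: f(y) <= f(x) - |g|^2 / (2L) + delta
            if f(y) <= f(x) - gn2 / (2 * L) + delta:
                break
            L *= 2.0
        x = y
        L = max(L0, L / 2.0)     # try a smaller L on the next step
    return x

# toy quadratic: f(x) = x0^2 + 4*x1^2 (gradient Lipschitz constant 8)
f = lambda x: x[0] ** 2 + 4 * x[1] ** 2
grad = lambda x: [2 * x[0], 8 * x[1]]
x_hat = adaptive_gradient(f, grad, [1.0, 1.0])
```

The backtracking loop terminates because the inequality always holds once L exceeds the true Lipschitz constant of the gradient.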

The article is devoted to the development of numerical methods for solving saddle point problems and variational inequalities with simplified requirements for the smoothness conditions of functionals. Recently, some notable methods for optimization problems with strongly monotone operators were proposed. Our focus here is on newly proposed techniqu...

The article is devoted to the development of numerical methods for solving variational inequalities with relatively strongly monotone operators. We consider two classes of variational inequalities related to some analogs of the Lipschitz condition of the operator that appeared several years ago. One of these classes is associated with the relative...

It is well-known that accelerated gradient first-order methods possess optimal complexity estimates for the class of convex smooth minimization problems. In many practical situations it makes sense to work with inexact gradient information. However, this can lead to an accumulation of corresponding inexactness in the theoretical estimates of the ra...

Recently there were proposed some innovative convex optimization concepts, namely, relative smoothness [1] and relative strong convexity [2,3]. These approaches have significantly expanded the class of applicability of gradient-type methods with optimal estimates of the convergence rate, which are invariant regardless of the dimensionality of the p...

In this paper, we propose a general algorithmic framework for the first-order methods in optimization in a broad sense, including minimization problems, saddle-point problems and variational inequalities (VIs). This framework allows obtaining many known methods as a special case, the list including accelerated gradient method, composite optimizatio...

This textbook is based on lectures given by the authors at MIPT (Moscow), HSE (Moscow), FEFU (Vladivostok), V.I. Vernadsky KFU (Simferopol), ASU (Republic of Adygea), and the University of Grenoble-Alpes (Grenoble, France). First of all, the authors focused on the program of a two-semester course of lectures on convex optimization, which is given t...

Based on some recently proposed methods for solving variational inequalities with non-smooth operators, we propose an analogue of the Mirror Prox method for the corresponding class of problems under the assumption of relative smoothness and relative strong monotonicity of the operator.

In this paper, we consider gradient-type methods for convex positively homogeneous optimization problems with relative accuracy. An analogue of the accelerated universal gradient-type method for positively homogeneous optimization problems with relative accuracy is investigated. The second approach is related to subgradient methods with B. T. Polya...

The article is devoted to the development of numerical methods for solving saddle point problems and variational inequalities with simplified requirements for the smoothness conditions of functionals. Recently, some notable methods for optimization problems with strongly monotone operators were proposed. Our focus here is on newly proposed tec...

We consider the problem of learning the optimal policy for infinite-horizon Markov decision processes (MDPs). For this purpose, some variant of Stochastic Mirror Descent is proposed for convex programming problems with Lipschitz-continuous functionals. An important detail is the ability to use inexact values of functional constraints. We analyze th...

We consider an interesting class of composite optimization problems with a gradient dominance condition and introduce a corresponding analogue of the recently proposed concept of an inexact oracle. This concept is applied to some classes of smooth functionals.

We propose some adaptive mirror descent methods for convex programming problems with delta-subgradients and prove some theoretical results.

Recently, it has been shown how, on the basis of the usual accelerated gradient method for solving problems of smooth convex optimization, accelerated methods for more complex problems (with a structure) and problems that are solved using various local information about the behavior of a function (stochastic gradient, Hessian, etc.) can be obtained...

The article is devoted to the development of algorithmic methods for strongly convex-concave saddle-point problems in the case when one of the groups of variables has a large dimension, and the other is sufficiently small (up to a hundred). The proposed technique is based on reducing problems of this type to a problem of minimizing a convex (maximi...

Recently some specific classes of non-smooth and non-Lipschitz convex optimization problems were considered by Yu. Nesterov and H. Lu. We consider convex programming problems with similar smoothness conditions for the objective function and functional constraints. We introduce a new concept of an inexact model and propose some analogues of switchi...

For problems of unconstrained optimization, the concept of inexact oracle proposed by O. Devolder, F. Glineur and Yu.E. Nesterov is well known. We introduce an analog of the concept of inexact oracle (model of a function) for abstract equilibrium problems, variational inequalities, and saddle-point problems. This allows us to propose an analog of N...

Recently some specific classes of non-smooth and non-Lipschitz convex optimization problems were singled out by Yu. Nesterov and H. Lu. We consider convex programming problems with similar smoothness conditions for the objective function and functional constraints. We introduce a new concept of an inexact model and propose some analogues of swi...

This chapter is devoted to the black-box subgradient algorithms with the minimal requirements for the storage of auxiliary results, which are necessary to execute these algorithms. To provide historical perspective, this survey starts with the original result of Shor which opened this field with the application to the classical transportation problem...

In this paper we propose a general algorithmic framework for first-order methods in optimization in a broad sense, including minimization problems, saddle-point problems and variational inequalities. This framework allows obtaining many known methods as special cases, the list including accelerated gradient method, composite optimization methods,...

Some adaptive analogue of the Mirror Prox method for variational inequalities is proposed. In this work we consider the adaptation not only to the value of the Lipschitz constant, but also to the magnitude of the oracle error. This approach, in particular, allows us to prove a complexity near $O(1/\varepsilon)$ for variational inequalities for a special class of mono...

Network utility maximization is the most important problem in network traffic management. Given the growth of modern communication networks, we consider the utility maximization problem in a network with a large number of connections (links) that are used by a huge number of users. To solve this problem an adaptive mirror descent algorithm for many con...

We propose several adaptive algorithmic methods for problems of non-smooth convex optimization. The first of them is based on a special artificial inexactness. Namely, the concept of inexact ($ \delta, \Delta, L$)-model of objective functional in optimization is introduced and some gradient-type methods with adaptation of inexactness parameters are...

Network utility maximization is the most important problem in network traffic management. Given the growth of modern communication networks, we consider the utility maximization problem in a network with a large number of connections (links) that are used by a huge number of users. To solve this problem an adaptive mirror descent algorithm for many...

In this paper, some analogs of the Devolder–Glineur–Nesterov $(\delta, L, \mu)$-oracle are introduced for optimization problems. At the same time, various types of conditions of relative smoothness and relative strong convexity of the objective function are highlighted. Examples of convex and strongly convex optimization problems admitting the existence of inex...

Based on the ideas of arXiv:1710.06612, we consider the problem of minimization of the Lipschitz-continuous non-smooth functional $f$ with non-positive convex (generally, non-smooth) Lipschitz-continuous functional constraint. We propose some novel strategies of step-sizes and adaptive stopping rules in Mirror Descent algorithms for the considered...

An analogue of the Adaptive Mirror Prox method for variational inequalities is proposed. In this work we consider the adaptation not only to the level of operator smoothness, but also to the magnitude of the oracle error. This approach, in particular, allows us to prove a complexity near $O(1/ \varepsilon)$ for variational inequalities for a specia...

Under consideration are some adaptive mirror descent algorithms for the problems of minimization of a convex objective functional with several convex Lipschitz (generally, nonsmooth) functional constraints. It is demonstrated that the methods are applicable to the objective functionals of various levels of smoothness: The Lipschitz condition holds...

We consider optimization methods for convex minimization problems under inexact information on the objective function. We introduce an inexact model of the objective, which as particular cases includes the inexact oracle [16] and the relative smoothness condition [36]. We analyze a gradient method which uses this inexact model and obtain convergence rates for...

Based on G. Lan's accelerated gradient sliding and the general relation between the smoothness and strong convexity parameters of a function under the Legendre transformation, we show that under rather general conditions the best known bounds for the bilinear convex-concave smooth composite saddle point problem keep true for non-bilinear convex-concave smooth...

We consider the classical optimization problem of minimizing a strongly convex, non-smooth, Lipschitz-continuous function with one Lipschitz-continuous constraint. We develop the approach in [10] and propose two methods for the considered problem with adaptive stopping rules. The main idea of the methods is using the dichotomy method and solving an...
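The dichotomy (bisection) subroutine that the described approach relies on can be sketched in a few lines (a generic sketch of bisection on a monotone one-dimensional function, not the paper's auxiliary problem):

```python
def dichotomy(phi, lo, hi, tol=1e-8):
    """Find t with phi(t) ~ 0 for a nondecreasing phi
    with phi(lo) <= 0 <= phi(hi), by repeated halving."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if phi(mid) <= 0:
            lo = mid        # root lies in the right half
        else:
            hi = mid        # root lies in the left half
    return 0.5 * (lo + hi)

t = dichotomy(lambda t: t * t - 2.0, 0.0, 2.0)   # root of t^2 - 2 on [0, 2]
```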

An adaptive analog of Nesterov’s method for variational inequalities with a strongly monotone operator is proposed. The main idea of the method is an adaptive choice of constants in the maximized concave functionals at each iteration. In this case there is no need in specifying exact values of the constants, since this method makes it possible to f...

We consider optimization methods for convex minimization problems under inexact information on the objective function. We introduce an inexact model of the objective, which as particular cases includes the $(\delta,L)$ inexact oracle and the relative smoothness condition. We analyze a gradient method which uses this inexact model and obtain convergence rates...

This chapter is devoted to the black-box subgradient algorithms with the minimal requirements for the storage of auxiliary results, which are necessary to execute these algorithms. It starts with the original result of N.Z. Shor which opened this field with the application to the classical transportation problem. To discuss the fundamentals of non-sm...

In this paper we propose a general algorithmic framework for first-order methods in optimization in a broad sense, including minimization problems, saddle-point problems and variational inequalities. This framework allows obtaining many known methods as special cases, the list including accelerated gradient method, composite optimization methods,...

We consider the following class of online optimization problems with functional constraints. Assume that a finite set of convex Lipschitz-continuous non-smooth functionals is given on a closed subset of an n-dimensional vector space. The problem is to minimize the arithmetic mean of the functionals with a convex Lipschitz-continuous non-smooth constraint....

An adaptive proximal method for a special class of abstract variational inequalities is proposed. For example, the so-called mixed variational inequalities and composite saddle problems are considered. Some estimates of the necessary number of iterations are obtained to achieve a given quality of the variational inequality solution.

In the article we obtain some estimates of the rate of convergence for the method recently proposed by Yu. E. Nesterov for minimizing a convex Lipschitz-continuous function of two variables on a square with a fixed side. The method consists in solving auxiliary problems of one-dimensional minimization along the separating segments and doe...

In this paper some adaptive mirror descent algorithms for problems of minimizing a convex objective functional with several convex Lipschitz (generally, non-smooth) functional constraints are considered. It is shown that the methods are applicable to objective functionals of various levels of smoothness: the Lipschitz condition is valid either f...

Theoretical estimates of the convergence rate of many well-known gradient-type optimization methods are based on quadratic interpolation, provided that the Lipschitz condition for the gradient is satisfied. In this article we obtain a possibility of constructing an analogue of such interpolation in the class of locally Lipschitz quasi-convex functi...
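The quadratic interpolation referred to above is, in the standard smooth case, the upper bound implied by an $L$-Lipschitz gradient (the descent lemma); the article constructs an analogue of this bound for locally Lipschitz quasi-convex functions:

$$
f(y) \le f(x) + \langle \nabla f(x),\, y - x \rangle + \frac{L}{2}\,\|y - x\|^2 \quad \text{for all } x, y.
$$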

We consider the problem of minimization of a convex function on a simple set with convex non-smooth inequality constraint and describe first-order methods to solve such problems in different situations: smooth or non-smooth objective function; convex or strongly convex objective and constraint; deterministic or randomized information about the obje...

We consider the following class of online optimization problems with functional constraints. Assume that a finite set of convex Lipschitz-continuous non-smooth functionals is given on a closed subset of an $n$-dimensional vector space. The problem is to minimize the arithmetic mean of the functionals with a convex Lipschitz-continuous non-smooth constraint...

Based on the property of functional separability of elements, the paper distinguishes a special class of separated normed cones, which includes convex cones in normed spaces as well as in spaces with an asymmetric norm. It is shown that separated normed cones, generally speaking, admit no linear injective isometric...

A special class of separated normed cones, which includes convex cones in normed spaces and in spaces with an asymmetric norm, is distinguished on the basis of the functional separability of elements. It is shown that, generally, separated normed cones admit no linear injective isometric embedding in any normed space. An analog of the Banach–Mazur...

We introduce an inexact oracle model for variational inequalities (VI) with monotone operator, propose a numerical method which solves such VI's and analyze its convergence rate. As a particular case, we consider VI's with Hölder continuous operator and show that our algorithm is universal. This means that without knowing the Hölder parameter $...

The paper is devoted to new modifications of recently proposed adaptive methods of Mirror Descent for convex minimization problems in the case of several convex functional constraints. Methods for problems of two classes are considered. The first class comprises problems with a Lipschitz-continuous (generally speaking, nonsmooth) objective functional. The se...

A novel analog of Nemirovski’s proximal mirror method with an adaptive choice of constants in the minimized prox-mappings at each iteration for variational inequalities with a Lipschitz continuous field is proposed. Estimates of the number of iterations needed to attain the desired quality of solution of the variational inequality are obtained. It...
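In the Euclidean setup the proximal mirror scheme reduces to the classical extragradient iteration, which can be sketched on an assumed toy monotone field (the rotation-type operator of the saddle function $f(u,v) = uv$); plain gradient steps spiral outward on this field, while the predict-then-correct iteration converges to the solution $(0,0)$:

```python
def extragradient(F, z0, step, n_iter=2000):
    """Euclidean extragradient method for a monotone Lipschitz operator F."""
    z = list(z0)
    for _ in range(n_iter):
        fz = F(z)
        w = [z[i] - step * fz[i] for i in range(2)]   # prediction step
        fw = F(w)
        z = [z[i] - step * fw[i] for i in range(2)]   # correction step
    return z

# monotone operator of the saddle function f(u, v) = u * v: F(u, v) = (v, -u)
F = lambda z: [z[1], -z[0]]
z_hat = extragradient(F, [1.0, 1.0], step=0.1)
```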

An adaptive analogue of the Yu. E. Nesterov method for variational inequalities with a strongly monotone vector field is proposed. The main idea of the proposed method is the adaptive choice of constants in the maximized concave functional at each iteration. These constants are related to the Lipschitz constant of the field. The method does not need to specify the...

The paper is devoted to a special Mirror Descent algorithm for problems of convex minimization with functional constraints. The objective function may not satisfy the Lipschitz condition, but it must necessarily have a Lipschitz-continuous gradient. We assume that the functional constraint can be non-smooth but satisfies the Lipschitz condition...

We consider the problem of minimization of a convex function on a simple set with convex non-smooth inequality constraint and describe first-order methods to solve such problems in different situations: smooth or non-smooth objective function; convex or strongly convex objective and constraint; deterministic or randomized information about the obje...

In this paper we develop the theory of anti-compact sets we introduced earlier. We describe the class of Fréchet spaces where anti-compact sets exist. They are exactly the spaces that have a countable set of continuous linear functionals. In such spaces we prove an analogue of the Hahn–Banach theorem on extension of a continuous linear functional f...

We introduce anti-compact sets (anti-compacts) in Fréchet spaces. We thoroughly investigate the properties of anti-compacts and the scale of Banach spaces generated by anti-compacts. Special attention is paid to systems of anti-compact ellipsoids in Hilbert spaces. The existence of a system of anti-compacts is proved for any separable Fréchet space...

We prove an analogue of the Hahn-Banach theorem on the extension of a linear functional with a convex estimate for each abstract convex cone with the cancellation law. Also we consider the special class of the so-called strict convex normed cones (SCNC). For such structures we obtain an appropriate analogue of the Hahn-Banach separation theorem. On...

We consider the problem of transfer of the Denjoy-Young-Saks theorem on derivates to infinite-dimensional Banach spaces and the problem of nondifferentiability of indefinite Pettis integral in infinite-dimensional Banach spaces. Our approach is based on the concept of an anticompact set proposed by us earlier. We prove an analog of the Denjoy-Young...

In this paper, we propose a new limiting form of the Radon–Nikodym property for the Bochner integral. We prove that the limiting form holds for an arbitrary Fréchet space as opposed to the ordinary Radon–Nikodym property. We consider some applications in linear and nonlinear analysis.

The general properties of compact subdifferentials (K-subdifferentials) for mappings of a segment to a locally convex space are studied. Different forms of the general theorem of finite increments and the mean value theorem for compact subdifferentials are considered in detail, with closed and open estimates.

For mappings acting from an interval into a locally convex space, we study properties of strong compact variation and strong compact absolute continuity connected with an expansion of the space into subspaces generated by the compact sets. A description of strong K-absolutely continuous mappings in terms of indefinite Bochner integral is obtained....

The notions of compact convex variation and compact convex subdifferential for the mappings from a segment into a locally convex space (LCS) are studied. In the case of an arbitrary complete LCS, each indefinite Bochner integral has compact variation and each strongly absolutely continuous and compact subdifferentiable a.e. mapping is an indefi...