Fedor S. Stonyakin

V.I. Vernadsky Crimean Federal University · Algebra and Functional Analysis Department

About

75
Publications
2,559
Reads
207
Citations

Publications

Chapter
It is well-known that accelerated first-order gradient methods possess optimal complexity estimates for the class of convex smooth minimization problems. In many practical situations it makes sense to work with inexact gradient information. However, this can lead to an accumulation of corresponding inexactness in the theoretical estimates of the ra...
Preprint
We introduce a notion of inexact model of a convex objective function, which allows for errors both in the function and in its gradient. For this situation, a gradient method with an adaptive adjustment of some parameters of the model is proposed and an estimate for the convergence rate is found. This estimate is optimal on a class of sufficiently...
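As a rough illustration of a gradient method with adaptive adjustment of a model parameter, here is a minimal Python sketch: the quadratic upper model with error tolerance delta is checked at each step, and the estimate L is doubled on failure and relaxed on success. The halving/doubling rule and function names are illustrative assumptions, not the method analyzed in the paper.

```python
import numpy as np

def adaptive_gradient(f, grad, x0, delta=1e-9, L0=1.0, n_iters=100):
    """Gradient descent with adaptive adjustment of the smoothness parameter L.

    At each step the quadratic upper model (allowing an error delta) is
    checked; if it fails, L is doubled, otherwise L is relaxed for the
    next iteration.  A sketch only, not the authors' exact method.
    """
    x, L = np.asarray(x0, dtype=float), L0
    for _ in range(n_iters):
        g = grad(x)
        while True:
            y = x - g / L
            # model check: f(y) <= f(x) + <g, y-x> + (L/2)||y-x||^2 + delta
            if f(y) <= f(x) + g @ (y - x) + 0.5 * L * np.dot(y - x, y - x) + delta:
                break
            L *= 2.0
        x, L = y, max(L / 2.0, 1e-12)
    return x

# Usage: minimize f(x) = ||x||^2 starting away from the optimum.
x_star = adaptive_gradient(lambda x: x @ x, lambda x: 2.0 * x, [5.0, -3.0])
```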
Chapter
The article is devoted to the development of numerical methods for solving saddle point problems and variational inequalities with simplified requirements for the smoothness conditions of functionals. Recently, some notable methods for optimization problems with strongly monotone operators were proposed. Our focus here is on newly proposed techniqu...
Preprint
The article is devoted to the development of numerical methods for solving variational inequalities with relatively strongly monotone operators. We consider two classes of variational inequalities related to some analogs of the Lipschitz condition of the operator that appeared several years ago. One of these classes is associated with the relative...
Preprint
It is well-known that accelerated first-order gradient methods possess optimal complexity estimates for the class of convex smooth minimization problems. In many practical situations it makes sense to work with inexact gradient information. However, this can lead to an accumulation of corresponding inexactness in the theoretical estimates of the ra...
Preprint
Recently, some innovative convex optimization concepts were proposed, namely relative smoothness [1] and relative strong convexity [2,3]. These approaches have significantly expanded the applicability of gradient-type methods with optimal estimates of the convergence rate, which are invariant with respect to the dimensionality of the p...
Article
In this paper, we propose a general algorithmic framework for the first-order methods in optimization in a broad sense, including minimization problems, saddle-point problems and variational inequalities (VIs). This framework allows obtaining many known methods as a special case, the list including accelerated gradient method, composite optimizatio...
Preprint
Full-text available
This textbook is based on lectures given by the authors at MIPT (Moscow), HSE (Moscow), FEFU (Vladivostok), V.I. Vernadsky KFU (Simferopol), ASU (Republic of Adygea), and the University of Grenoble-Alpes (Grenoble, France). First of all, the authors focused on the program of a two-semester course of lectures on convex optimization, which is given t...
Preprint
Building on some recently proposed methods for solving variational inequalities with non-smooth operators, we propose an analogue of the Mirror Prox method for the corresponding class of problems under the assumption of relative smoothness and relative strong monotonicity of the operator.
Preprint
In this paper, we consider gradient-type methods for convex positively homogeneous optimization problems with relative accuracy. Estimates for the quality of solution with relative accuracy are obtained for adaptive non-accelerated and accelerated gradient-type methods for an analogue of the recently proposed concept of inexact oracle. An analogue...
Preprint
The article is devoted to the development of numerical methods for solving saddle point problems and variational inequalities with simplified requirements for the smoothness conditions of functionals. Recently, some notable methods for optimization problems with strongly monotone operators were proposed. Our focus here is on newly proposed tec...
Preprint
We consider the problem of learning the optimal policy for infinite-horizon Markov decision processes (MDPs). For this purpose, some variant of Stochastic Mirror Descent is proposed for convex programming problems with Lipschitz-continuous functionals. An important detail is the ability to use inexact values of functional constraints. We analyze th...
Article
Full-text available
We consider an interesting class of composite optimization problems with a gradient dominance condition and introduce a corresponding analogue of the recently proposed concept of an inexact oracle. This concept is applied to some classes of smooth functionals.
Preprint
We propose some adaptive mirror descent methods for convex programming problems with delta-subgradients and prove some theoretical results.
Article
Recently, it has been shown how, on the basis of the usual accelerated gradient method for solving problems of smooth convex optimization, accelerated methods for more complex problems (with a structure) and problems that are solved using various local information about the behavior of a function (stochastic gradient, Hessian, etc.) can be obtained...
Preprint
Full-text available
The article is devoted to the development of algorithmic methods for strongly convex-concave saddle-point problems in the case when one of the groups of variables has a large dimension, and the other is sufficiently small (up to a hundred). The proposed technique is based on reducing problems of this type to a problem of minimizing a convex (maximi...
Chapter
Recently some specific classes of non-smooth and non-Lipschitz convex optimization problems were considered by Yu. Nesterov and H. Lu. We consider convex programming problems with similar smoothness conditions for the objective function and functional constraints. We introduce a new concept of an inexact model and propose some analogues of switchi...
Article
For problems of unconstrained optimization, the concept of inexact oracle proposed by O. Devolder, F. Glineur and Yu.E. Nesterov is well known. We introduce an analog of the concept of inexact oracle (model of a function) for abstract equilibrium problems, variational inequalities, and saddle-point problems. This allows us to propose an analog of N...
Preprint
Recently some specific classes of non-smooth and non-Lipschitz convex optimization problems were considered by Yu. Nesterov along with H. Lu. We consider convex programming problems with similar smoothness conditions for the objective function and functional constraints. We introduce a new concept of an inexact model and propose some analogues of swi...
Chapter
This chapter is devoted to the black-box subgradient algorithms with minimal requirements for the storage of auxiliary results necessary to execute these algorithms. To provide historical perspective, this survey starts with the original result of Shor, which opened this field with the application to the classical transportation problem...
Preprint
Full-text available
In this paper we propose a general algorithmic framework for first-order methods in optimization in a broad sense, including minimization problems, saddle-point problems and variational inequalities. This framework allows obtaining many known methods as a special case, the list including accelerated gradient method, composite optimization methods,...
Chapter
Full-text available
An adaptive analogue of the Mirror Prox method for variational inequalities is proposed. In this work we consider the adaptation not only to the value of the Lipschitz constant, but also to the magnitude of the oracle error. This approach, in particular, allows us to prove a complexity near $O(1/\varepsilon)$ for variational inequalities for a special class of mono...
Article
Network utility maximization is the most important problem in network traffic management. Given the growth of modern communication networks, we consider the utility maximization problem in a network with a large number of connections (links) that are used by a huge number of users. To solve this problem an adaptive mirror descent algorithm for many con...
Preprint
The concept of an inexact $(\delta, \Delta, L)$-model of the objective functional in optimization is introduced. Some gradient-type methods with adaptation of the inexactness parameters are proposed. The applicability of the methods to some non-smooth optimization problems is discussed.
Preprint
Full-text available
Network utility maximization is the most important problem in network traffic management. Given the growth of modern communication networks, we consider the utility maximization problem in a network with a large number of connections (links) that are used by a huge number of users. To solve this problem an adaptive mirror descent algorithm for many...
Article
Full-text available
In this paper, some analogs of the Devolder–Glineur–Nesterov $(\delta, L, \mu)$-oracle are introduced for optimization problems. At the same time, various types of conditions of relative smoothness and relative strong convexity of the objective function are highlighted. Examples of convex and strongly convex optimization problems admitting the existence of inex...
Preprint
Based on the ideas of arXiv:1710.06612, we consider the problem of minimization of the Lipschitz-continuous non-smooth functional $f$ with non-positive convex (generally, non-smooth) Lipschitz-continuous functional constraint. We propose some novel strategies of step-sizes and adaptive stopping rules in Mirror Descent algorithms for the considered...
Preprint
Full-text available
An analogue of the Adaptive Mirror Prox method for variational inequalities is proposed. In this work we consider the adaptation not only to the level of operator smoothness, but also to the magnitude of the oracle error. This approach, in particular, allows us to prove a complexity near $O(1/ \varepsilon)$ for variational inequalities for a specia...
Article
Under consideration are some adaptive mirror descent algorithms for the problems of minimization of a convex objective functional with several convex Lipschitz (generally, nonsmooth) functional constraints. It is demonstrated that the methods are applicable to the objective functionals of various levels of smoothness: The Lipschitz condition holds...
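The switching-subgradient idea behind such mirror descent schemes can be sketched in a few lines of Python: when the constraint is nearly satisfied, take a step along a subgradient of the objective ("productive" step); otherwise, step along a subgradient of the constraint. This is a simplified Euclidean sketch of the classical scheme, with an illustrative step-size eps/||d||^2, not the paper's adaptive variants.

```python
import numpy as np

def switching_mirror_descent(f_sub, g, g_sub, x0, eps, n_iters=2000):
    """Switching-subgradient scheme (Euclidean prox) for
        min f(x)  s.t.  g(x) <= 0,
    with f and g Lipschitz and possibly non-smooth.  'Productive' steps
    use a subgradient of f, 'non-productive' steps a subgradient of g.
    A simplified sketch of the classical scheme.
    """
    x = np.asarray(x0, dtype=float)
    productive = []
    for _ in range(n_iters):
        if g(x) <= eps:                       # productive step
            d = f_sub(x)
            productive.append(x.copy())
        else:                                 # non-productive step
            d = g_sub(x)
        norm = np.linalg.norm(d)
        if norm == 0:
            break
        x = x - (eps / norm**2) * d           # step-size eps / ||d||^2
    return np.mean(productive, axis=0)        # average of productive iterates

# Usage: minimize f(x) = |x1| + |x2| subject to g(x) = x1 + x2 - 1 <= 0
x_hat = switching_mirror_descent(
    lambda x: np.sign(x), lambda x: x[0] + x[1] - 1.0,
    lambda x: np.ones(2), np.array([3.0, 3.0]), eps=1e-2)
```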
Chapter
We consider optimization methods for convex minimization problems under inexact information on the objective function. We introduce an inexact model of the objective which, as particular cases, includes the inexact oracle [16] and the relative smoothness condition [36]. We analyze a gradient method which uses this inexact model and obtain convergence rates for...
Preprint
Based on G. Lan's accelerated gradient sliding and a general relation between the smoothness and strong convexity parameters of a function under the Legendre transformation, we show that under rather general conditions the best known bounds for the bilinear convex-concave smooth composite saddle point problem remain true for non-bilinear convex-concave smooth...
Chapter
We consider the classical optimization problem of minimizing a strongly convex, non-smooth, Lipschitz-continuous function with one Lipschitz-continuous constraint. We develop the approach in [10] and propose two methods for the considered problem with adaptive stopping rules. The main idea of the methods is using the dichotomy method and solving an...
Article
An adaptive analog of Nesterov’s method for variational inequalities with a strongly monotone operator is proposed. The main idea of the method is an adaptive choice of constants in the maximized concave functionals at each iteration. In this case there is no need in specifying exact values of the constants, since this method makes it possible to f...
Preprint
Full-text available
We consider optimization methods for convex minimization problems under inexact information on the objective function. We introduce an inexact model of the objective which, as particular cases, includes the $(\delta,L)$ inexact oracle and the relative smoothness condition. We analyze a gradient method which uses this inexact model and obtain convergence rates...
Preprint
Full-text available
This chapter is devoted to the black-box subgradient algorithms with the minimal requirements for the storage of auxiliary results, which are necessary to execute these algorithms. It starts with the original result of N.Z. Shor, which opened this field with the application to the classical transportation problem. To discuss the fundamentals of non-sm...
Preprint
Full-text available
In this paper we propose a general algorithmic framework for first-order methods in optimization in a broad sense, including minimization problems, saddle-point problems and variational inequalities. This framework allows obtaining many known methods as a special case, the list including accelerated gradient method, composite optimization methods,...
Preprint
An adaptive proximal method for a special class of abstract variational inequalities is proposed. For example, the so-called mixed variational inequalities and composite saddle-point problems are considered. Some estimates of the necessary number of iterations are obtained to achieve a given quality of the variational inequality solution.
Conference Paper
We consider the following class of online optimization problems with functional constraints. Assume that a finite set of convex Lipschitz-continuous non-smooth functionals is given on a closed set in an n-dimensional vector space. The problem is to minimize the arithmetic mean of the functionals with a convex Lipschitz-continuous non-smooth constraint....
Preprint
Full-text available
In the article we have obtained some estimates of the rate of convergence for the method recently proposed by Yu. E. Nesterov for minimizing a convex Lipschitz-continuous function of two variables on a square with a fixed side. The method consists in solving auxiliary problems of one-dimensional minimization along the separating segments and doe...
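The square-halving idea can be sketched as follows: alternately halve the square along one axis by solving a one-dimensional problem on the separating mid-segment and keeping the half indicated by the sign of the partial derivative there. This is a simplified illustration of the halving idea (with an illustrative golden-section inner solver), not the paper's method or its analysis.

```python
import numpy as np

def halving_square(f, grad, lo, hi, n_halvings=40, inner_iters=60):
    """Minimize a convex function of two variables on an axis-aligned box by
    alternately halving it: solve a 1D problem on the separating mid-segment
    and keep the half indicated by the partial derivative's sign there."""
    lo, hi = np.asarray(lo, dtype=float), np.asarray(hi, dtype=float)
    phi = (np.sqrt(5) - 1) / 2
    def golden(h, a, b):                     # 1D golden-section search
        for _ in range(inner_iters):
            c, d = b - phi * (b - a), a + phi * (b - a)
            if h(c) < h(d):
                b = d
            else:
                a = c
        return (a + b) / 2
    for k in range(n_halvings):
        i = k % 2                            # axis to halve (0: x, 1: y)
        j = 1 - i
        mid = (lo[i] + hi[i]) / 2
        def on_seg(t, i=i, j=j, mid=mid):    # restrict f to the mid-segment
            p = np.empty(2); p[i] = mid; p[j] = t
            return f(p)
        t_star = golden(on_seg, lo[j], hi[j])
        p = np.empty(2); p[i] = mid; p[j] = t_star
        if grad(p)[i] > 0:                   # minimum lies in the lower half
            hi[i] = mid
        else:
            lo[i] = mid
    return (lo + hi) / 2

# Usage: minimize f(x, y) = (x - 0.3)^2 + (y + 0.2)^2 on the square [-1, 1]^2
p_star = halving_square(lambda p: (p[0] - 0.3)**2 + (p[1] + 0.2)**2,
                        lambda p: np.array([2*(p[0] - 0.3), 2*(p[1] + 0.2)]),
                        [-1.0, -1.0], [1.0, 1.0])
```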
Preprint
In this paper some adaptive mirror descent algorithms for problems of minimizing a convex objective functional with several convex Lipschitz (generally, non-smooth) functional constraints are considered. It is shown that the methods are applicable to objective functionals of various levels of smoothness: the Lipschitz condition is valid either f...
Preprint
Theoretical estimates of the convergence rate of many well-known gradient-type optimization methods are based on quadratic interpolation, provided that the Lipschitz condition for the gradient is satisfied. In this article we establish the possibility of constructing an analogue of such interpolation in the class of locally Lipschitz quasi-convex functi...
Preprint
Full-text available
We consider the following class of online optimization problems with functional constraints. Assume that a finite set of convex Lipschitz-continuous non-smooth functionals is given on a closed set in an $n$-dimensional vector space. The problem is to minimize the arithmetic mean of the functionals with a convex Lipschitz-continuous non-smooth constraint...
Article
A special class of separated normed cones, which includes convex cones in normed spaces and in spaces with an asymmetric norm, is distinguished on the basis of the functional separability of elements. It is shown that, generally, separated normed cones admit no linear injective isometric embedding in any normed space. An analog of the Banach–Mazur...
Preprint
Full-text available
We introduce an inexact oracle model for variational inequalities (VI) with monotone operator, propose a numerical method which solves such VI's and analyze its convergence rate. As a particular case, we consider VI's with Hölder continuous operator and show that our algorithm is universal. This means that without knowing the Hölder parameter $...
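For context, the basic (non-adaptive, Euclidean) extragradient iteration for a monotone VI can be sketched in Python. This is a fixed-step sketch of the classical extragradient method; the step size and the ergodic averaging are illustrative assumptions, and the paper's method is additionally adaptive to the operator's Hölder smoothness and oracle error.

```python
import numpy as np

def extragradient(F, proj, z0, step, n_iters=500):
    """Euclidean extragradient iteration for a monotone variational
    inequality: find z* with <F(z*), z - z*> >= 0 for all feasible z.
    Returns the ergodic average of the extrapolation points."""
    z = np.asarray(z0, dtype=float)
    avg = np.zeros_like(z)
    for _ in range(n_iters):
        w = proj(z - step * F(z))        # extrapolation step
        z = proj(z - step * F(w))        # update step using F at w
        avg += w
    return avg / n_iters

# Usage: saddle point of x*y on [-1, 1]^2, i.e. F(x, y) = (y, -x);
# the unique solution is (0, 0).
z_hat = extragradient(lambda z: np.array([z[1], -z[0]]),
                      lambda z: np.clip(z, -1.0, 1.0),
                      np.array([0.9, -0.7]), step=0.5)
```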
Preprint
Full-text available
The paper is devoted to new modifications of recently proposed adaptive Mirror Descent methods for convex minimization problems in the case of several convex functional constraints. Methods for two classes of problems are considered. The first class comprises problems with a Lipschitz-continuous (generally speaking, nonsmooth) objective functional. The se...
Article
Full-text available
A novel analog of Nemirovski’s proximal mirror method with an adaptive choice of constants in the minimized prox-mappings at each iteration for variational inequalities with a Lipschitz continuous field is proposed. Estimates of the number of iterations needed to attain the desired quality of solution of the variational inequality are obtained. It...
Article
An adaptive analogue of the Yu. E. Nesterov method for variational inequalities with a strongly monotone vector field is proposed. The main idea of the proposed method is the adaptive choice of constants in the maximized concave functionals at each iteration. These constants are related to the Lipschitz constant of the field. The method does not need to specify the...
Article
The paper is devoted to a special Mirror Descent algorithm for problems of convex minimization with functional constraints. The objective function may not satisfy the Lipschitz condition, but it must necessarily have a Lipschitz-continuous gradient. We assume that the functional constraint can be non-smooth but must satisfy the Lipschitz condition...
Chapter
We consider the problem of minimization of a convex function on a simple set with convex non-smooth inequality constraint and describe first-order methods to solve such problems in different situations: smooth or non-smooth objective function; convex or strongly convex objective and constraint; deterministic or randomized information about the obje...
Article
Full-text available
We consider the problem of minimization of a convex function on a simple set with convex non-smooth inequality constraint and describe first-order methods to solve such problems in different situations: smooth or non-smooth objective function; convex or strongly convex objective and constraint; deterministic or randomized information about the obje...
Article
Full-text available
In this paper we develop the theory of anti-compact sets we introduced earlier. We describe the class of Fréchet spaces where anti-compact sets exist. They are exactly the spaces that have a countable set of continuous linear functionals. In such spaces we prove an analogue of the Hahn–Banach theorem on extension of a continuous linear functional f...
Article
Full-text available
We introduce anti-compact sets (anti-compacts) in Fréchet spaces. We thoroughly investigate the properties of anti-compacts and the scale of Banach spaces generated by anti-compacts. Special attention is paid to systems of anti-compact ellipsoids in Hilbert spaces. The existence of a system of anti-compacts is proved for any separable Fréchet space...
Article
We prove an analogue of the Hahn-Banach theorem on the extension of a linear functional with a convex estimate for each abstract convex cone with the cancellation law. Also we consider the special class of the so-called strict convex normed cones (SCNC). For such structures we obtain an appropriate analogue of the Hahn-Banach separation theorem. On...
Article
We consider the problem of transfer of the Denjoy-Young-Saks theorem on derivates to infinite-dimensional Banach spaces and the problem of nondifferentiability of indefinite Pettis integral in infinite-dimensional Banach spaces. Our approach is based on the concept of an anticompact set proposed by us earlier. We prove an analog of the Denjoy-Young...
Article
In this paper, we propose a new limiting form of the Radon–Nikodym property for the Bochner integral. We prove that the limiting form holds for an arbitrary Fréchet space as opposed to an ordinary Radon–Nikodym property. We consider some applications in linear and nonlinear analysis.
Article
The general properties of compact subdifferentials (K-subdifferentials) for mappings of a segment to a locally convex space are studied. Different forms of the general theorem of finite increments and the mean value theorem for compact subdifferentials are considered in detail with closed and open estimates.
Article
For mappings acting from an interval into a locally convex space, we study properties of strong compact variation and strong compact absolute continuity connected with an expansion of the space into subspaces generated by the compact sets. A description of strong K-absolutely continuous mappings in terms of indefinite Bochner integral is obtained....
Article
The notions of compact convex variation and compact convex subdifferential for the mappings from a segment into a locally convex space (LCS) are studied. In the case of an arbitrary complete LCS, each indefinite Bochner integral has compact variation and each strongly absolutely continuous and compact subdifferentiable a.e. mapping is an indefi...
