Article

The perturbed proximal point algorithm and some of its applications

Applied Mathematics and Optimization (Impact Factor: 0.68). 02/1994; 29(2):125-159. DOI: 10.1007/BF01204180

ABSTRACT Following the work of R. T. Rockafellar on finding a zero of a maximal monotone operator, and of B. Lemaire on solving convex optimization problems, we present a perturbed version of the proximal point algorithm. We apply this new algorithm to convex optimization and to variational inclusions, in particular to variational inequalities.

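Concretely, the exact proximal point iteration computes x_{k+1} = (I + c_k T)^{-1}(x_k); perturbed and inexact variants, roughly speaking, allow the resolvent to be evaluated at a perturbed point or for a perturbed operator. The sketch below is only illustrative and is not taken from the paper: it uses an affine maximal monotone operator T(x) = Ax - b with A symmetric positive definite, so that the resolvent is a linear solve, and an arbitrary summable perturbation sequence e_k.

```python
import numpy as np


def perturbed_prox_point(A, b, x0, c=1.0, iters=200):
    """Illustrative perturbed proximal point iteration for T(x) = A @ x - b.

    The exact step is x_{k+1} = (I + c*T)^{-1}(x_k); here the resolvent is
    evaluated at x_k + e_k, where e_k is a summable perturbation sequence
    (the choice 0.5**k below is arbitrary and purely for illustration).
    """
    n = len(x0)
    x = np.asarray(x0, dtype=float)
    I = np.eye(n)
    for k in range(iters):
        e_k = (0.5 ** k) * np.ones(n)        # vanishing (summable) perturbation
        y = x + e_k
        # Resolvent J_{cT}(y): solve x + c*(A x - b) = y, i.e. (I + c A) x = y + c b.
        x = np.linalg.solve(I + c * A, y + c * b)
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((5, 5))
    A = M.T @ M + 0.1 * np.eye(5)            # symmetric positive definite
    b = rng.standard_normal(5)
    x = perturbed_prox_point(A, b, np.zeros(5))
    print(np.linalg.norm(A @ x - b))         # residual of the zero of T, close to 0
```

Because the perturbations are summable, the iterates still converge to the zero of T, which is the point of the perturbed scheme.
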
  • ABSTRACT: The theory of maximal set-valued monotone mappings provides a powerful framework for the study of convex programming and variational inequalities. Based on the notion of relatively maximal relaxed monotonicity, the approximation solvability of a general class of inclusion problems is discussed, generalizing most investigations of weak convergence via the proximal point algorithm in a real Hilbert space setting. The well-known method of multipliers for constrained convex programming is a special case of the proximal point algorithm. The results obtained can be used to generalize the Yosida approximation, which in turn can be applied to generalize first-order evolution equations to evolution inclusions. Furthermore, we observe that the Douglas-Rachford splitting method for finding a zero of the sum of two monotone operators is likewise a specialization of the proximal point algorithm, which allows a further generalization and unification of a wide range of convex programming algorithms.
    WSEAS Transactions on Mathematics 08/2011; 10(8):259-269. (A minimal sketch of the Douglas-Rachford iteration follows this list.)
  • ABSTRACT: A new application-oriented notion of relatively A-maximal monotonicity (RMM) is introduced and applied to the approximation solvability of a general class of inclusion problems, generalizing other existing results on linear convergence, including Rockafellar's theorem (1976) on the linear convergence of the proximal point algorithm in a real Hilbert space setting. The results obtained not only generalize most of the existing investigations, but also reduce smoothly to the case of maximal monotone mappings and the corresponding classical resolvent operators. Furthermore, our proof approach differs significantly from that of Rockafellar's celebrated work, where the Lipschitz continuity at zero of M^{-1}, the inverse of M : X → 2^X, is assumed in order to achieve linear convergence of the proximal point algorithm. Note that the relatively A-maximal monotonicity framework can be used to generalize the classical Yosida approximation (which in the literature is applied and studied mostly via the classical resolvent operator), which in turn can be applied to first-order evolution equations as well as evolution inclusions.
    Journal of Applied Mathematics and Computing 01/2012; 39(1-2).
  • ABSTRACT: In this article, we consider the proximal point algorithm for approximating zeros of maximal monotone mappings. Strong convergence theorems for zero points of maximal monotone mappings are established in the framework of Hilbert spaces. 2000 AMS Subject Classification: 47H05; 47H09; 47J25.
    Journal of Inequalities and Applications 01/2012; 2012(1). (Impact Factor: 0.77)
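The first related item above observes that Douglas-Rachford splitting for the sum of two monotone operators is itself a specialization of the proximal point algorithm. As a minimal, self-contained sketch of what that splitting computes in practice, the toy objective, step size t, and parameter lam below are illustrative choices, not taken from any of the papers listed:

```python
import numpy as np


def soft_threshold(v, tau):
    """Proximal map of tau * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)


def douglas_rachford(a, lam=0.5, t=1.0, iters=300):
    """Douglas-Rachford splitting for min_x 0.5*||x - a||^2 + lam*||x||_1.

    f(x) = 0.5*||x - a||^2 and g(x) = lam*||x||_1 both have closed-form
    proximal maps, so each iteration is a few vector operations.  This toy
    problem is chosen only so the result can be verified by hand.
    """
    y = np.zeros_like(a, dtype=float)
    x = y
    for _ in range(iters):
        x = (y + t * a) / (1.0 + t)               # prox_{t f}(y)
        z = soft_threshold(2.0 * x - y, t * lam)  # prox_{t g}(2x - y)
        y = y + z - x                             # update of the governing sequence
    return x


if __name__ == "__main__":
    a = np.array([2.0, -0.3, 0.8, -1.5])
    print(douglas_rachford(a))   # ~ componentwise soft-thresholding of a by lam
```

For this toy choice of f and g the minimizer is simply the soft-thresholding of a, so the iteration is easy to check numerically; the same update applies whenever the two proximal maps are available.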