Axel Böhm
University of Vienna · Faculty of Mathematics

About

12 Publications · 1,530 Reads · 124 Citations

Publications (12)
Preprint
We improve the understanding of the $\textit{golden ratio algorithm}$, which solves monotone variational inequalities (VI) and convex-concave min-max problems via the distinctive feature of adapting the step sizes to the local Lipschitz constants. Adaptive step sizes not only eliminate the need to pick hyperparameters, but they also remove the nece...
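For context, a minimal sketch of the fixed-step golden ratio algorithm that this entry builds on (the paper's subject is the adaptive-step variant, which replaces the fixed `step` with local Lipschitz estimates); the toy operator and all parameter values below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

PHI = (1 + np.sqrt(5)) / 2  # the golden ratio, ~1.618

def graal(F, x0, step, n_iters=1000):
    """Fixed-step golden ratio algorithm for a monotone operator F:
    xbar_k = ((PHI - 1) * x_k + xbar_{k-1}) / PHI,
    x_{k+1} = xbar_k - step * F(x_k)."""
    x, xbar = x0.copy(), x0.copy()
    for _ in range(n_iters):
        xbar = ((PHI - 1) * x + xbar) / PHI   # golden-ratio averaging
        x = xbar - step * F(x)                # forward step from the average
    return x

# Toy monotone operator: the saddle problem min_x max_y x*y,
# written as F(x, y) = (y, -x).
F = lambda z: np.array([z[1], -z[0]])
print(graal(F, np.array([1.0, 1.0]), step=0.5))  # -> near (0, 0)
```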
Preprint
We investigate a structured class of nonconvex-nonconcave min-max problems exhibiting so-called \emph{weak Minty} solutions, a notion which was only recently introduced, but is able to simultaneously capture different generalizations of monotonicity. We prove novel convergence results for a generalized version of the optimistic gradient method (OGD...
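For reference, a sketch of the plain optimistic gradient (OGDA) update that the abstract's generalized version builds on; the bilinear toy problem and step size are illustrative assumptions:

```python
import numpy as np

def ogda(F, x0, step, n_iters=1000):
    """Plain OGDA for an operator F:
    x_{k+1} = x_k - step * (2 F(x_k) - F(x_{k-1})),
    i.e. a gradient step plus a correction using the previous gradient."""
    x_prev = x0.copy()
    g_prev = F(x_prev)
    x = x_prev - step * g_prev           # first step: plain forward step
    for _ in range(n_iters):
        g = F(x)
        x = x - step * (2 * g - g_prev)  # optimistic (extrapolated) step
        g_prev = g
    return x

# Bilinear toy problem min_x max_y x*y, i.e. F(x, y) = (y, -x).
F = lambda z: np.array([z[1], -z[0]])
print(ogda(F, np.array([1.0, 1.0]), step=0.2))  # -> near (0, 0)
```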
Article
Full-text available
We study minimization of a structured objective function, being the sum of a smooth function and a composition of a weakly convex function with a linear operator. Applications include image reconstruction problems with regularizers that introduce less bias than the standard convex regularizers. We develop a variable smoothing algorithm, based on th...
Article
Full-text available
We aim to solve a structured convex optimization problem, where a nonsmooth function is composed with a linear operator. When opting for full splitting schemes, primal–dual type methods are usually employed, as they are effective and well studied. However, under the additional assumption of Lipschitz continuity of the nonsmooth function which...
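For orientation, a sketch of the classical full-splitting primal–dual baseline the abstract refers to (the primal–dual hybrid gradient / Chambolle–Pock method) for min_x f(x) + g(Ax); the step-size rule and the toy instance are illustrative assumptions:

```python
import numpy as np

def chambolle_pock(prox_f, prox_g_conj, A, x0, y0, n_iters=500):
    """Primal-dual hybrid gradient for min_x f(x) + g(Ax).
    Alternates a dual prox step on g*, a primal prox step on f,
    and an extrapolation of the primal iterate."""
    tau = sigma = 0.9 / np.linalg.norm(A, 2)    # tau*sigma*||A||^2 < 1
    x, y, x_bar = x0.copy(), y0.copy(), x0.copy()
    for _ in range(n_iters):
        y = prox_g_conj(y + sigma * (A @ x_bar), sigma)  # dual step
        x_new = prox_f(x - tau * (A.T @ y), tau)         # primal step
        x_bar = 2 * x_new - x                            # extrapolation, theta = 1
        x = x_new
    return x, y

# Example: f(x) = 0.5*||x - b||^2, g = ||.||_1, so prox of sigma*g* is a clip.
b = np.array([1.0, -2.0])
prox_f = lambda v, t: (v + t * b) / (1 + t)
prox_g_conj = lambda v, s: np.clip(v, -1.0, 1.0)
x, y = chambolle_pock(prox_f, prox_g_conj, np.eye(2), np.zeros(2), np.zeros(2))
print(x)
```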
Preprint
Full-text available
Minimax problems have attracted increased interest largely due to advances in machine learning, in particular generative adversarial networks. These are trained using variants of stochastic gradient descent for the two players. Although convex-concave problems are well understood with many efficient solution methods to choose from, theoretical guar...
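A minimal sketch of the training scheme the abstract mentions, simultaneous stochastic gradient descent-ascent for two players; everything here (noise model, step size, toy objective) is an illustrative assumption. On the bilinear toy problem this scheme is known to spiral away from the solution, which motivates the corrected methods sketched under the next entry:

```python
import numpy as np

rng = np.random.default_rng(0)

def sgda(grad_x, grad_y, x0, y0, step=0.05, n_iters=100):
    """Simultaneous stochastic gradient descent-ascent: each player takes a
    noisy gradient step on its own variable, as in standard GAN training."""
    x, y = x0, y0
    for _ in range(n_iters):
        gx = grad_x(x, y) + rng.normal(scale=0.01)  # stand-in minibatch noise
        gy = grad_y(x, y) + rng.normal(scale=0.01)
        x, y = x - step * gx, y + step * gy         # simultaneous update
    return x, y

# Two scalar players on f(x, y) = x * y: gradients are y and x.
# Note: the iterates cycle/diverge here rather than converge.
print(sgda(lambda x, y: y, lambda x, y: x, 1.0, 1.0))
```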
Preprint
Full-text available
Motivated by the training of Generative Adversarial Networks (GANs), we study methods for solving minimax problems with additional nonsmooth regularizers. We do so by employing \emph{monotone operator} theory, in particular the \emph{Forward-Backward-Forward (FBF)} method, which avoids the known issue of limit cycling by correcting each update by a...
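For context, a sketch of Tseng's forward-backward-forward iteration that the abstract names; the toy problem, identity prox, and step size are illustrative assumptions:

```python
import numpy as np

def fbf(F, prox_g, x0, step, n_iters=1000):
    """Tseng's FBF method for the inclusion 0 in F(x) + dg(x):
    a forward-backward step, then a correcting forward step that reuses
    F(x_k); this correction is what avoids limit cycling."""
    x = x0.copy()
    for _ in range(n_iters):
        Fx = F(x)
        y = prox_g(x - step * Fx, step)   # forward-backward step
        x = y - step * (F(y) - Fx)        # extra forward (correction) step
    return x

# Bilinear saddle problem with no regularizer (prox is the identity):
F = lambda z: np.array([z[1], -z[0]])
print(fbf(F, lambda v, s: v, np.array([1.0, 1.0]), step=0.3))  # -> near (0, 0)
```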
Preprint
Full-text available
We study minimization of a structured objective function, being the sum of a smooth function and a composition of a weakly convex function with a linear operator. Applications include image reconstruction problems with regularizers that introduce less bias than the standard convex regularizers. We develop a variable smoothing algorithm, based on th...
Preprint
Full-text available
In this work we show that various algorithms, ubiquitous in convex optimization (e.g. proximal-gradient, alternating projections and averaged projections) generate self-contracted sequences $\{x_{k}\}_{k\in\mathbb{N}}$. As a consequence, a novel universal bound for the \emph{length} ($\sum_{k\ge 0}\Vert x_{k+1}-x_k\Vert$) can be deduced. In additio...
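For context, the standard definition behind the abstract's length bound, stated for sequences:

```latex
\text{Self-contracted: } \|x_{k_3} - x_{k_2}\| \le \|x_{k_3} - x_{k_1}\|
\quad \text{for all } k_1 \le k_2 \le k_3,
\qquad
\text{length} = \sum_{k \ge 0} \|x_{k+1} - x_k\|.
```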
Preprint
Full-text available
We aim to solve a structured convex optimization problem, where a nonsmooth function is composed with a linear operator. When opting for full splitting schemes, primal-dual type methods are usually employed, as they are effective and well studied. However, under the additional assumption of Lipschitz continuity of the nonsmooth function which...
Article
Full-text available
We investigate the convergence properties of incremental mirror descent type subgradient algorithms for minimizing the sum of convex functions. In each step, we only evaluate the subgradient of a single component function and mirror it back to the feasible domain, which makes iterations very cheap to compute. The analysis is made for a randomized s...
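A hedged sketch of the mechanism the abstract describes, with the negative-entropy mirror map over the probability simplex chosen for concreteness (so the mirror step is an exponentiated-gradient update); the toy problem and parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def incremental_mirror_descent(subgrads, x0, step=0.1, n_iters=2000):
    """Randomized incremental mirror descent over the simplex for
    min sum_i f_i(x): each iteration samples ONE component, evaluates
    only its subgradient, and mirrors back to the feasible domain."""
    x = x0.copy()
    m = len(subgrads)
    for _ in range(n_iters):
        i = rng.integers(m)            # pick a single component at random
        g = subgrads[i](x)             # one cheap subgradient evaluation
        x = x * np.exp(-step * g)      # entropic mirror step ...
        x /= x.sum()                   # ... normalized back onto the simplex
    return x

# Toy problem: minimize sum_i <c_i, x> over the simplex.
cs = [np.array([1.0, 2.0, 3.0]), np.array([3.0, 1.0, 2.0])]
subgrads = [lambda x, c=c: c for c in cs]
print(incremental_mirror_descent(subgrads, np.ones(3) / 3))
```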
