August 2017
These lecture notes for a graduate course cover generalized derivative concepts useful in deriving necessary optimality conditions and numerical algorithms for nondifferentiable optimization problems arising in inverse problems, imaging, and PDE-constrained optimization. Topics treated include convex functions and subdifferentials, Fenchel duality, monotone operators and resolvents, Moreau–Yosida regularization, proximal point and (some) first-order splitting methods, Clarke subdifferentials, and semismooth Newton methods. The required background from functional analysis and the calculus of variations is also briefly summarized.
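To give a flavor of the first-order splitting methods mentioned above, here is a minimal sketch (not taken from the notes) of forward-backward splitting, i.e., proximal gradient descent, applied to the model problem min_x (1/2)||Ax - b||^2 + lam*||x||_1. The proximal map of the nonsmooth l1 term is componentwise soft-thresholding; the function names (soft_threshold, proximal_gradient), the test problem, and the parameter choices are illustrative assumptions only.

```python
# Illustrative sketch of forward-backward splitting (proximal gradient descent)
# for  min_x  (1/2)||A x - b||^2 + lam * ||x||_1.
# All names and parameters here are assumptions for illustration, not from the notes.
import numpy as np

def soft_threshold(v, tau):
    """Proximal map of tau*||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient(A, b, lam, n_iter=500):
    """Forward (gradient) step on the smooth part, backward (prox) step on the l1 part."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1/L with L = ||A||_2^2 (Lipschitz const.)
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                  # gradient of the smooth term
        x = soft_threshold(x - step * grad, step * lam)  # prox of step*lam*||.||_1
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100); x_true[:5] = 1.0      # sparse ground truth
    b = A @ x_true
    x = proximal_gradient(A, b, lam=0.1)
    print("recovered support:", np.flatnonzero(np.abs(x) > 1e-3))
```

The fixed step size 1/L used above is one standard choice for this splitting; the notes also discuss the proximal point method and semismooth Newton methods, which are not reproduced here.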