Carsten Witt’s research while affiliated with Technical University of Denmark and other places


Publications (160)


Figure captions:
  • Compact Genetic Algorithm (cGA)
  • Illustration of Cliff
  • Variance after $100\sqrt{n}K + 100K^2$ iterations, averaged over 100 runs, for increasing $n$ and different values of $K$. The black line indicates the minimum variance of $1 - 1/n$. The plot on the right-hand side shows the same curves divided by $K^{1/2}$
  • Box plots for the number of evaluations in 1000 runs on Cliff with $n = 15$ (top) and $n = 18$ (bottom) and exponentially increasing values of $K$. The plots also show values of $(3/2)^n$, $2^n$ and $n^{n/3}$ for comparison
  • Minimum, maximum and quartile frequencies of the cGA in a single run with $n = 18$ and $K \in \{16, 16^2, 16^3, 16^4\}$, recorded every $\tau$ iterations, for values of $\tau$ stated below each figure. The gray dotted line indicates 2/3


The Compact Genetic Algorithm Struggles on Cliff Functions
  • Article
  • Full-text available

November 2024 · 11 Reads · 1 Citation · Algorithmica

Dirk Sudholt · Carsten Witt

Estimation of distribution algorithms (EDAs) are general-purpose optimizers that maintain a probability distribution over a given search space. This probability distribution is updated through sampling from the distribution and a reinforcement learning process which rewards solution components that have been shown to be part of good-quality samples. The compact genetic algorithm (cGA) is a non-elitist EDA able to deal with difficult multimodal fitness landscapes that are hard to solve by elitist algorithms. We investigate the cGA on the Cliff function, for which it was recently shown that non-elitist evolutionary algorithms and artificial immune systems optimize it in expected polynomial time. We point out that the cGA faces major difficulties when solving the Cliff function and investigate its dynamics both experimentally and theoretically. Our experimental results indicate that the cGA requires exponential time for all values of the update strength $1/K$. We show theoretically that, under sensible assumptions, there is a negative drift when sampling around the location of the cliff. Experiments further suggest that there is a phase transition for $K$ where the expected optimization time drops from $n^{\Theta(n)}$ to $2^{\Theta(n)}$.
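The update rule summarized above is compact enough to sketch in a few lines. The following Python is a minimal illustration under stated assumptions (the function names and the default cliff height d = n/3 are ours, not the paper's): it implements the textbook cGA with update strength 1/K and frequency borders 1/n and 1 - 1/n on the Cliff function.

```python
import random

def cliff(x, d=None):
    """Cliff benchmark: OneMax up to n - d ones, then the fitness drops
    by about d, creating a local optimum in front of the global one."""
    n, ones = len(x), sum(x)
    d = d if d is not None else n // 3   # assumed default cliff height
    return ones if ones <= n - d else ones - d + 0.5

def cga(n, K, fitness, max_iters=10**6):
    """Minimal compact GA: the frequency vector p is the maintained
    probability distribution; each iteration samples two solutions and
    rewards components of the better one with update strength 1/K."""
    p = [0.5] * n
    for _ in range(max_iters):
        x = [int(random.random() < pi) for pi in p]
        y = [int(random.random() < pi) for pi in p]
        winner, loser = (x, y) if fitness(x) >= fitness(y) else (y, x)
        if sum(winner) == n:             # all-ones string is the optimum
            return winner
        for i in range(n):
            if winner[i] != loser[i]:
                step = 1 / K if winner[i] == 1 else -1 / K
                # Borders 1/n and 1 - 1/n keep frequencies from fixating.
                p[i] = min(max(p[i] + step, 1 / n), 1 - 1 / n)
    return None                          # budget exhausted
```

A call such as cga(15, 16**2, cliff) mirrors the small problem sizes and update strengths used in the experiments shown in the figures above.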




Stagnation Detection in Highly Multimodal Fitness Landscapes

July 2024 · 3 Reads · Algorithmica

Stagnation detection has been proposed as a mechanism for randomized search heuristics to escape from local optima by automatically increasing the size of the neighborhood to find the so-called gap size, i.e., the distance to the next improvement. Its usefulness has mostly been considered in simple multimodal landscapes with few local optima that could be crossed one after another. In multimodal landscapes with a more complex arrangement of optima of similar gap size, stagnation detection suffers from the fact that the neighborhood size is frequently reset to 1 without using gap sizes that were promising in the past. In this paper, we investigate a new mechanism called radius memory which can be added to stagnation detection to control the search radius more carefully by giving preference to values that were successful in the past. We implement this idea in an algorithm called SD-RLS$^{\text{m}}$ and show, compared to previous variants of stagnation detection, that it yields speed-ups for linear functions under uniform constraints and for the minimum spanning tree problem. Moreover, its running time does not significantly deteriorate on unimodal functions and a generalization of the Jump benchmark. Finally, we present experimental results carried out to study SD-RLS$^{\text{m}}$ and compare it with other algorithms.
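A simplified sketch of the stagnation detection mechanism may clarify the role of the radius memory. This is not the exact SD-RLS$^{\text{m}}$ phase scheme from the paper, only an illustration under our own simplifying assumptions (threshold of roughly $\binom{n}{r}\ln n$ failures, invented names):

```python
import math
import random

def sd_rls_sketch(fitness, n, max_evals=10**6):
    """Randomized local search with stagnation detection (simplified).
    Exactly r uniformly chosen bits are flipped; if radius r yields no
    improvement for about C(n, r) * ln(n) evaluations, the gap size is
    assumed to be larger than r and the radius is increased."""
    x = [random.randint(0, 1) for _ in range(n)]
    fx = fitness(x)
    r, fails = 1, 0
    for _ in range(max_evals):
        y = x[:]
        for i in random.sample(range(n), r):
            y[i] ^= 1                    # flip exactly r bits
        fy = fitness(y)
        if fy > fx:
            x, fx, fails = y, fy, 0
            # Plain stagnation detection would reset r to 1 here; the
            # radius memory of SD-RLS^m instead gives preference to the
            # radius that was successful in the past, as done here by
            # simply keeping r.
        else:
            fails += 1
            if fails > math.comb(n, r) * math.log(n):
                r, fails = min(r + 1, n), 0   # widen the neighborhood
    return x, fx
```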


Sliding Window 3-Objective Pareto Optimization for Problems with Chance Constraints

June 2024 · 4 Reads

Constrained single-objective problems have been frequently tackled by evolutionary multi-objective algorithms where the constraint is relaxed into an additional objective. Recently, it has been shown that Pareto optimization approaches using bi-objective models can be significantly sped up using sliding windows (Neumann and Witt, ECAI 2023). In this paper, we extend the sliding window approach to 3-objective formulations for tackling chance constrained problems. On the theoretical side, we show that our new sliding window approach improves previous runtime bounds obtained in (Neumann and Witt, GECCO 2023) while maintaining the same approximation guarantees. Our experimental investigations for the chance constrained dominating set problem show that our new sliding window approach allows one to solve much larger instances in a much more efficient way than the 3-objective approach presented in (Neumann and Witt, GECCO 2023).
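As a rough illustration of the 3-objective formulation (all names and instance fields below are hypothetical; the exact model is defined in the cited GECCO 2023 paper), the chance constraint on stochastic weights is not enforced but exposed as a third objective next to the expected weight and its variance:

```python
def three_objective_fitness(x, mu, var, uncovered):
    """Sketch of a 3-objective model for a chance constrained problem:
    the constraint is relaxed into a third objective, so infeasible
    solutions survive and guide the search along the constraint
    boundary.  `uncovered` is a hypothetical violation measure, e.g.
    the number of non-dominated vertices in the chance constrained
    dominating set problem.  All three components are to be minimized
    by the multi-objective algorithm."""
    expected = sum(m for m, xi in zip(mu, x) if xi)   # expected weight
    variance = sum(v for v, xi in zip(var, x) if xi)  # weight variance
    return (expected, variance, uncovered(x))
```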


Fast Pareto Optimization Using Sliding Window Selection

September 2023 · 44 Reads · 7 Citations

Pareto optimization using evolutionary multi-objective algorithms such as the classical GSEMO algorithm has been widely applied to solve constrained submodular optimization problems. A crucial factor determining the runtime of the evolutionary algorithms used to obtain good approximations is their population size, which grows with the number of trade-offs that the algorithms encounter. In this paper, we introduce a sliding window speed-up technique for recently introduced algorithms. We prove that our technique eliminates the population size as a crucial factor negatively impacting the runtime of the classical GSEMO algorithm and achieves the same theoretical performance guarantees as previous approaches within less computation time. Our experimental investigations for the classical maximum coverage problem confirm that our sliding window technique clearly leads to better results for a wide range of instances and constraint settings.
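The core of the technique, selecting parents according to the progress anticipated at the current point in time, fits in a few lines. The sketch below is a simplification under our own assumptions (window width 1, a generic cost measure, invented names), not the exact SW-GSEMO selection rule:

```python
import random

def sliding_window_parent(population, cost, t, t_max, budget):
    """Pick a parent for a GSEMO-style Pareto optimizer.  Instead of
    choosing uniformly from the whole (possibly large) population, only
    individuals whose cost is near the target c(t) = (t / t_max) * budget
    are eligible; the target slides from 0 to the budget as the run
    progresses, so the population size no longer dominates the runtime."""
    target = (t / t_max) * budget
    window = [ind for ind in population if abs(cost(ind) - target) <= 1]
    # Fall back to uniform selection whenever the window is empty.
    return random.choice(window) if window else random.choice(population)
```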


Simulated Annealing (SA) with starting temperature $T_0$ and cooling factor $\beta \le 1$ for the minimization of $f:\{0,1\}^n \rightarrow \mathbb{R}$
Simulated Annealing is a Polynomial-Time Approximation Scheme for the Minimum Spanning Tree Problem

July 2023 · 36 Reads · 2 Citations · Algorithmica

We prove that Simulated Annealing with an appropriate cooling schedule computes arbitrarily tight constant-factor approximations to the minimum spanning tree problem in polynomial time. This result was conjectured by Wegener (Automata, Languages and Programming, ICALP, Berlin, 2005). More precisely, denoting by $n$, $m$, $w_{\max}$, and $w_{\min}$ the number of vertices and edges as well as the maximum and minimum edge weight of the MST instance, we prove that simulated annealing with initial temperature $T_0 \ge w_{\max}$ and multiplicative cooling schedule with factor $1 - 1/\ell$, where $\ell = \omega(mn\ln(m))$, with probability at least $1 - 1/m$ computes in time $O(\ell(\ln\ln(\ell) + \ln(T_0/w_{\min})))$ a spanning tree with weight at most $1+\kappa$ times the optimum weight, where $1+\kappa = \frac{(1+o(1))\ln(\ell m)}{\ln(\ell) - \ln(mn\ln(m))}$. Consequently, for any $\epsilon > 0$, we can choose $\ell$ in such a way that a $(1+\epsilon)$-approximation is found in time $O((mn\ln(n))^{1+1/\epsilon+o(1)}(\ln\ln n + \ln(T_0/w_{\min})))$ with probability at least $1 - 1/m$. In the special case of so-called $(1+\epsilon)$-separated weights, this algorithm computes an optimal solution (again in time $O((mn\ln(n))^{1+1/\epsilon+o(1)}(\ln\ln n + \ln(T_0/w_{\min})))$), which is a significant speed-up over Wegener's runtime guarantee of $O(m^{8+8/\epsilon})$. Our tighter upper bound also admits the result that in some situations a hybridization of simulated annealing and the $(1+1)$ EA can lead to stronger runtime guarantees than either algorithm alone.
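The algorithm from the caption above (starting temperature $T_0$, multiplicative cooling factor $\beta \le 1$) can be sketched compactly. For the MST result one would encode edge subsets as bit strings of length $m$ and let $f$ be the total weight plus a penalty for disconnected subgraphs; those encoding details are our assumptions, not the paper's exact fitness function:

```python
import math
import random

def simulated_annealing(f, n, T0, beta, max_iters):
    """Simulated Annealing minimizing f: {0,1}^n -> R with starting
    temperature T0 and multiplicative cooling factor beta <= 1 (the MST
    analysis uses beta = 1 - 1/l with l = omega(m * n * ln m))."""
    x = [random.randint(0, 1) for _ in range(n)]
    fx, T = f(x), T0
    for _ in range(max_iters):
        y = x[:]
        y[random.randrange(n)] ^= 1      # flip one uniformly chosen bit
        fy = f(y)
        # Always accept improvements; accept a worsening with probability
        # exp(-(fy - fx) / T), which vanishes as the temperature cools.
        if fy <= fx or random.random() < math.exp(-(fy - fx) / max(T, 1e-300)):
            x, fx = y, fy
        T *= beta                        # multiplicative cooling schedule
    return x, fx
```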




Fast Pareto Optimization Using Sliding Window Selection

May 2023 · 24 Reads

Pareto optimization using evolutionary multi-objective algorithms has been widely applied to solve constrained submodular optimization problems. A crucial factor determining the runtime of the evolutionary algorithms used to obtain good approximations is their population size, which grows with the number of trade-offs that the algorithms encounter. In this paper, we introduce a sliding window speed-up technique for recently introduced algorithms. We prove that our technique eliminates the population size as a crucial factor negatively impacting the runtime and achieves the same theoretical performance guarantees as previous approaches within less computation time. Our experimental investigations for the classical maximum coverage problem confirm that our sliding window technique clearly leads to better results for a wide range of instances and constraint settings.


Citations (73)


... Recently, the sliding window GSEMO (SW-GSEMO) has been introduced in [11]; it significantly outperforms the original GSEMO on large problem instances. The improvement comes from a sliding window selection method that selects parent individuals depending on the anticipated progress over time during the optimization run. ...

Reference:

Archive-based Single-Objective Evolutionary Algorithms for Submodular Optimization
Fast Pareto Optimization Using Sliding Window Selection

... In this paper, we extend the sliding window approach to 3-objective formulations for tackling chance constrained problems. On the theoretical side, we show that our new sliding window approach improves previous runtime bounds obtained in [15] while maintaining the same approximation guarantees. Our experimental investigations for the chance constrained dominating set problem show that our new sliding window approach allows one to solve much larger instances in a much more efficient way than the 3-objective approach presented in [15]. ...

3-Objective Pareto Optimization for Problems with Chance Constraints
  • Citing Conference Paper
  • July 2023

... This upper bound shows that, in principle, combining the global variation operator popular in evolutionary computation with a local-search hyper-heuristic can be an interesting approach. The potential of such combinations was discussed for the Metropolis algorithm in [DERW23], where it was analyzed only via experiments on Cliff functions with problem size n = 100 and cliff height d = 3. Hence, our work is the first to prove such speed-ups in a rigorous manner. ...

How Well Does the Metropolis Algorithm Cope With Local Optima?
  • Citing Conference Paper
  • July 2023

... preferring worse search points (Corus et al. 2021) or adapting mutation rates during the run (Doerr et al. 2019). ...

On Steady-State Evolutionary Algorithms and Selective Pressure: Why Inverse Rank-Based Allocation of Reproductive Trials Is Best
  • Citing Article
  • April 2021

ACM Transactions on Evolutionary Learning and Optimization

... We consider the gradual progress of EAs from their initial position to a satisfactory solution as a renewal process and establish a renewal model, which differs from existing theoretical results on intelligent optimization algorithms [26,27]. Before establishing the model, the following theorem needs to be given. ...

Runtime Analysis of the (1+1) EA on Weighted Sums of Transformed Linear Functions
  • Citing Chapter
  • August 2022

Lecture Notes in Computer Science

... Therefore, even for the best setting of the mutation rate, only the expected number of flipped bits equals the gap size, while the actual number of flipped bits may differ. This has motivated Rajabi and Witt [7] to consider the k-bit flip operator, known from randomized local search (RLS) [4], which flips a uniform random subset of k bits, and to adjust k via stagnation detection. ...

Stagnation Detection with Randomized Local Search
  • Citing Article
  • July 2022

Evolutionary Computation

... In the context of problems with stochastic constraints, it has recently been shown that 3-objective formulations, where the given constraint is relaxed into a third objective, lead to better performance than 2-objective formulations that optimize the expected value and variance of the given stochastic components under the given constraint [14,15]. The experimental investigations for the chance constrained dominating set problem carried out in [15] show that the 3-objective approach is beneficial and outperforms the bi-objective one introduced in [14] for medium-size instances of the problem. ...

Runtime Analysis of Single- and Multi-Objective Evolutionary Algorithms for Chance Constrained Optimization Problems with Normally Distributed Random Variables
  • Citing Conference Paper
  • July 2022

... We note that the runtime guarantees we prove do not beat the best known runtimes. For example, the use of fast mutation [DLMN17] or stagnation detection [RW22] gives speed-ups of order $m^{-\Theta(m)}$ over the (1 + 1) EA; with crossover, speed-ups of $O(n^{-1})$ or more can be obtained [DFK+18, DEJK24]. Nevertheless, we believe that our result is an interesting demonstration of how non-elitism can be successful in randomized search heuristics, which might aid the future development of non-elitist approaches. ...

Self-Adjusting Evolutionary Algorithms for Multimodal Optimization

Algorithmica