Fig 2 - uploaded by Thierry Petit
Scaling of Dynamic Sweep for FlexC.

Source publication
Conference Paper
This paper investigates cumulative scheduling in uncertain environments, using constraint programming. We present a new declarative characterization of robustness, which preserves solution quality. We highlight the significance of our framework on a crane assignment problem with business constraints.

Context in source publication

Context 1
... in K are not null, with an average equal to 4. Similar results are obtained with a fixed k_a. Figure 2 shows that our filtering algorithm scales to problems with up to 12,800 activities for a first solution. The decomposition reaches the one-hour time limit at 1,600 activities and leads to a memory crash at 6,400 activities with Cumulative. ...

Similar publications

Citations

... Such solutions are, however, hard to compute and are often restricted to small instances. The FlexC constraint [6,7] can model cumulative scheduling problems where tasks have two processing times: a normal processing time and a delayed one. The constraint ensures that if any subset of k tasks is delayed, the cumulative resource is not overloaded. ...
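The FlexC semantics recalled above can be checked by brute force on a tiny instance. The sketch below is illustrative only: the task tuples, horizon, and the helper `is_k_robust` are assumptions for this example, not the constraint's actual filtering algorithm, which avoids this exponential enumeration.

```python
from itertools import combinations

def is_k_robust(tasks, k, capacity, horizon):
    """Brute-force check of the FlexC semantics: for every subset of
    at most k tasks taking their delayed duration, the cumulative
    resource must stay within capacity at every time point.
    Each task is (start, normal_dur, delayed_dur, height)."""
    for r in range(k + 1):
        for delayed in combinations(range(len(tasks)), r):
            for t in range(horizon):
                usage = 0
                for i, (s, d, dd, h) in enumerate(tasks):
                    dur = dd if i in delayed else d
                    if s <= t < s + dur:
                        usage += h
                if usage > capacity:
                    return False
    return True

# Two unit-height tasks on a capacity-1 resource (illustrative data):
# robust for k=1 only if a delay of the first cannot overlap the second.
print(is_k_robust([(0, 2, 3, 1), (3, 2, 2, 1)], 1, 1, 6))  # True
print(is_k_robust([(0, 2, 4, 1), (3, 2, 2, 1)], 1, 1, 6))  # False
```

The second instance fails because delaying the first task stretches it to time 4, overlapping the second task at time 3 and overloading the capacity-1 resource.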
Conference Paper
We introduce the Confidence constraint, a chance constraint that ensures, with a given probability, that a set of variables are no smaller than random variables whose probability distributions are given. This constraint is useful in stochastic optimization to ensure that a solution is robust to external random events. It makes it possible to control the trade-off between optimizing the objective function and ensuring the satisfiability of the solution under random parameters. We present a filtering algorithm for this constraint with explanations. We apply the constraint to a case study, an industrial scheduling problem where tasks have random processing times due to possible breakdowns during their execution. We evaluate our solutions with simulations and show that this new constraint yields robust solutions in reasonable computation time.
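A Monte-Carlo simulation gives a simple way to check the Confidence semantics on a candidate solution. This is a hedged sketch, not the paper's filtering algorithm: the helper `confidence_holds`, the samplers, and the threshold are assumptions made for illustration.

```python
import random

def confidence_holds(values, samplers, theta, n_samples=10000, seed=0):
    """Monte-Carlo estimate of the Confidence semantics: the
    probability that every decision value is no smaller than its
    random counterpart must reach theta. `samplers` are assumed
    callables drawing one sample each from a given distribution."""
    rng = random.Random(seed)
    hits = sum(
        all(v >= s(rng) for v, s in zip(values, samplers))
        for _ in range(n_samples)
    )
    return hits / n_samples >= theta

# Decision values vs. uniformly distributed random durations
# (illustrative data, not from the paper).
samplers = [lambda r: r.uniform(2, 6), lambda r: r.uniform(1, 4)]
print(confidence_holds([6, 4], samplers, 0.95))  # True: values dominate
print(confidence_holds([3, 2], samplers, 0.95))  # False
```

The second call fails because the joint probability that both random draws stay below (3, 2) is roughly 0.08, far under the 0.95 threshold.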
... For example, in [12,29] the authors generalise the notions of robust solutions and fault tolerance using a SAT encoding. More recently, robustness has been studied in combinatorial problems, such as jobshop scheduling [14,15,8] and combinatorial auctions [16,17]. There are two main approaches to finding super solutions: reformulation and search-based algorithms. ...
Chapter
When optimising under uncertainty, it is desirable that solutions are robust to unexpected disruptions and changes. A possible formalisation of robustness is given by super solutions. An assignment to a set of decision variables is an (a, b, c) super solution if any change involving at most a variables can be repaired by changing at most b other variables; the repair solution should have a cost of at most c units worse than a non-robust optimum. We propose a method exploiting Logic Based Benders Decomposition to find super solutions to an optimisation problem for generic disruptions. The master deals with the original problem, while subproblems try to find repair solutions for each possible disruption. As a case study, we consider the Kidney Exchange Problem, for which our method scales dramatically better on realistic instances than a reformulation-based approach from the literature.
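The super-solution definition recalled above can be verified exhaustively on a toy CSP. The following is a minimal sketch following the abstract's phrasing for a = 1 (any change to a single variable must be repairable by changing at most b others); the helper name, the toy constraint, and the data are assumptions, and real solvers use reformulation or dedicated search rather than this enumeration.

```python
from itertools import product

def is_1_b_super_solution(sol, domains, consistent, b):
    """Brute-force (1, b)-super-solution check: if any single
    variable is changed to another domain value, a consistent repair
    changing at most b other variables must exist. `consistent` is
    an assumed predicate over full assignments."""
    n = len(sol)
    for i in range(n):
        for v in domains[i]:
            if v == sol[i]:
                continue
            # Search for a repair among full assignments fixing x_i = v.
            found = False
            for cand in product(*domains):
                if cand[i] != v:
                    continue
                changed = sum(1 for j in range(n)
                              if j != i and cand[j] != sol[j])
                if changed <= b and consistent(cand):
                    found = True
                    break
            if not found:
                return False
    return True

# Toy CSP: x0 != x1 over {0, 1, 2} (illustrative, not the Kidney
# Exchange Problem from the chapter).
domains = [(0, 1, 2), (0, 1, 2)]
neq = lambda a: a[0] != a[1]
print(is_1_b_super_solution((0, 1), domains, neq, 1))  # True
print(is_1_b_super_solution((0, 1), domains, neq, 0))  # False
```

With b = 0 the check fails: forcing x0 = 1 leaves (1, 1) as the only candidate, which violates the constraint, so a repair changing one other variable is required.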
... A usual objective is to minimize the makespan, i.e., the latest end among all activities. In this paper, we investigate the filtering algorithm of the FlexC constraint [3]. FlexC expresses cumulative problems that should integrate a given level of robustness. ...
... In this section we recall the conditions that are exploited by the algorithm. Proofs and explanations can be found in [3]. ...
... Proofs and explanations can be found in [3]. Let a ∈ A be an activity and k_a ∈ K. ...
Article
We investigate cumulative scheduling in uncertain environments, using constraint programming. We detail in this paper the dynamic sweep filtering algorithm of the FlexC global constraint.
Article
Constraint programming (CP) aims to solve problems with a combinatorial structure that involve constraints of diverse forms. CP tools are generic and composable, and can easily be hybridized with other technologies. Since its emergence in the seventies, CP has attracted growing interest in the scientific community. One reason for its success is that it is a theoretically declarative paradigm, as Professor Eugène Freuder emphasized: "Constraint Programming represents one of the closest approaches computer science has yet made to the Holy Grail of programming: The user states the problem, the computer solves it". This expressive power makes it possible to model a large number of closely related applications without having to implement their most critical algorithmic components several times. CP is widely used to solve industrial problems, sometimes, paradoxically, at the expense of its declarativity: it can prove necessary, even indispensable, to design the problem model by heavily exploiting problem-specific properties. This state of affairs carries the risk of wrongly discarding scientific ideas that are important but too novel to be usable in an industrial setting in the short term. The trade-off between genericity and efficiency therefore lies at the heart of scientific research in constraint programming. The purpose of this habilitation is to synthesize a set of attempted answers to this question, together with the perspectives that emerge from them. In particular, I present a selection of contributions on optimization problems: handling over-constrained problems and uncertainty, as well as conceptual tools enabling a finer characterization of solutions than a classical notion of functional optimality.
I highlight the practical and theoretical motivations of these contributions, their common thread, and some questions that remain open. In addition, I discuss my work on assisting problem modeling in constraint programming through learning, and on generating filtering conditions from generic representations of global constraints.