Thesis

Cumulative scheduling in constraint programming: energetic characterization of reasoning and robust solutions


Abstract

Constraint programming is an approach regularly used to tackle a wide variety of scheduling problems. Cumulative scheduling problems form a class of problems in which non-preemptive tasks can be executed in parallel. These problems arise in many real-world contexts, such as the allocation of virtual machines, process scheduling in the cloud, staff management, or the operation of a port. Many mechanisms have been adapted and proposed in constraint programming to solve scheduling problems. These various adaptations have led to reasonings that seem, at first sight, significantly distinct. In this thesis we carry out a detailed analysis of these different reasonings, proposing both a unified, purely theoretical notation and dominance rules that yield a significant improvement in the running time of state-of-the-art algorithms, by up to a factor of seven. We also propose a new framework for robust cumulative scheduling, which makes it possible to find solutions that tolerate, at any moment, the delay of one or more tasks without invalidating the generated schedule and while keeping a satisfactory project completion date. Within this framework, we propose an adaptation of a state-of-the-art algorithm, Dynamic Sweep.
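As an informal illustration of the robustness notion targeted by the thesis, the sketch below checks whether a fixed cumulative schedule stays feasible when any single task is delayed by some amount delta. This is a minimal reading of the property, not the FlexC or Dynamic Sweep machinery; the Task representation, capacity and horizon parameters are illustrative assumptions.

```python
# Minimal sketch (not the thesis algorithm): check that a fixed cumulative
# schedule stays feasible when any single task is delayed by up to `delta`.
from collections import namedtuple

Task = namedtuple("Task", "start duration demand")  # hypothetical representation

def profile_ok(tasks, capacity, horizon):
    """Return True if resource usage never exceeds `capacity` before `horizon`."""
    usage = [0] * horizon
    for t in tasks:
        if t.start + t.duration > horizon:
            return False                      # deadline violated
        for time in range(t.start, t.start + t.duration):
            usage[time] += t.demand
            if usage[time] > capacity:
                return False                  # capacity exceeded
    return True

def is_delta_robust(tasks, capacity, horizon, delta):
    """True if delaying any one task by `delta` keeps the schedule feasible."""
    for i, t in enumerate(tasks):
        delayed = list(tasks)
        delayed[i] = Task(t.start + delta, t.duration, t.demand)
        if not profile_ok(delayed, capacity, horizon):
            return False
    return True

# Example: two tasks on a resource of capacity 2, horizon 10, delay of 1.
tasks = [Task(0, 3, 1), Task(2, 4, 1)]
print(is_delta_robust(tasks, capacity=2, horizon=10, delta=1))  # True
```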


... A proof of this lemma, as well as the proof that these intervals are sufficient to detect any inconsistency arising from energetic reasoning, can be found in [Der15]. ...
... This allows us to describe the O(n²) inconsistency detection algorithm (cf. 1) of Derrien [Der15]. ...
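For context, the feasibility test underlying energetic reasoning can be sketched as follows: over an interval [t1, t2), each task must receive at least its minimal intersection of energy, and the total must not exceed the available energy C·(t2 − t1). The sketch enumerates candidate intervals naively from earliest starts and latest ends; it is not the reduced characterization of [Der15], and the (est, lct, p, c) tuple layout is an assumption.

```python
# Sketch of the energetic reasoning feasibility check (naive interval set,
# not the reduced characterization of [Der15]).

def minimal_intersection(est, lct, p, t1, t2):
    """Minimal part of task (est, lct, duration p) that must lie in [t1, t2)."""
    left_shift  = max(0, est + p - t1)    # energy inside if scheduled as early as possible
    right_shift = max(0, t2 - (lct - p))  # energy inside if scheduled as late as possible
    return max(0, min(p, t2 - t1, left_shift, right_shift))

def energetic_check(tasks, capacity):
    """tasks: list of (est, lct, p, c). Return False if an interval is overloaded."""
    starts = sorted({t[0] for t in tasks})
    ends   = sorted({t[1] for t in tasks})
    for t1 in starts:
        for t2 in ends:
            if t2 <= t1:
                continue
            required = sum(c * minimal_intersection(est, lct, p, t1, t2)
                           for (est, lct, p, c) in tasks)
            if required > capacity * (t2 - t1):
                return False          # energetic overload on [t1, t2)
    return True

# Example: two tasks of demand 2 competing for capacity 2 in [0, 3).
print(energetic_check([(0, 3, 2, 2), (0, 3, 2, 2)], capacity=2))  # False
```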
Thesis
Full-text available
Resource-constrained scheduling problems have been widely studied in the literature. However, in most cases, activities are assumed to have a fixed duration and to require a constant amount of the resource throughout their execution. In this thesis, we address a scheduling problem in which tasks have a variable duration and a variable resource-consumption profile. This profile, which may vary over time, is a decision variable of the problem, and the duration of the associated task depends on it. Moreover, considering linear and non-linear efficiency functions to model resource usage makes the problem harder and allows energy-resource transfers to be modeled realistically. For this NP-complete problem, we present several properties from which models and solution methods are derived. These solution methods fall into two parts. The first part views the problem from the Constraint Programming standpoint and details several methods derived from this paradigm, including an extension of energetic reasoning to the studied problem. The second part of the thesis is devoted to Mixed Integer Linear Programming approaches and presents several models, in particular a continuous-time, event-based model, together with theoretical analyses and techniques to strengthen these models. Finally, experiments support the results presented in this manuscript.
Thesis
Vehicle routing problems are hard combinatorial optimization problems with significant economic and environmental stakes within the supply chain. The basic problem is to serve customers with a fleet of vehicles so as to minimize the total distance traveled. In practice, there is a great variety of objectives and additional constraints, related to legislation and to the diversity of application domains. Routing problems are very common in many industries, and the design of generic solution approaches has become an important research question. This thesis deals with the design and development of a new solving engine for the vehicle routing software offered by the company GEOCONCEPT. The solver relies on constraint programming (CP) to improve flexibility (handling of additional constraints), declarativity and maintenance, which are the limitations of GEOCONCEPT's current solvers based on local search. First, a graph model is established for the unified representation of the data and of many business constraints. Solving is carried out with large neighborhood search approaches available in modern CP solvers. Very large instances can thus be handled efficiently while keeping a declarative approach able to express a very broad class of vehicle routing problems. Second, CP models based on redundant representations of the problem are proposed in order to strengthen filtering. We look in detail at filtering mechanisms, that is, the processes that remove infeasible or suboptimal values from the variable domains. These algorithms quickly simplify the problem and provide lower bounds to assess the quality of the solutions obtained. The lower bounds are obtained by solving relaxations of the most famous problem in Operations Research: the traveling salesman problem (TSP). This problem is the core of the global constraint weightedcircuit, which models routing problems in CP. We propose new filtering mechanisms for this constraint based on three relaxations of the TSP, which are compared theoretically and experimentally. The originality of this work is to propose a new filtering algorithm that reasons both on the direct successors of a customer and on its position in the route. Such reasoning is particularly useful in the presence of time-window constraints, which are very common in industrial problems. The new solving engine achieves excellent performance on academic and industrial problems while providing informative lower bounds for real industrial problems.
Chapter
Full-text available
No framework or classification scheme has ever been introduced for stochastic scheduling problems. It is more difficult to develop such a scheme for stochastic scheduling problems than for deterministic scheduling problems. In order to characterize a stochastic scheduling problem, more information is required. For example, the distributions of the processing times have to be specified as well as the distributions of the due dates (which may be different). It has to be specified whether the processing times of the n jobs are independent or correlated (e.g., equal to the same random variable) and also which class of policies is considered. For these reasons no framework has been introduced in this book either.
Conference Paper
Full-text available
This paper is originally motivated by an application where the objective is to generate a video summary, built using intervals extracted from a video source. In this application, the constraints used to select the relevant pieces of intervals are based on Allen’s algebra. The best state-of-the-art results are obtained with a small set of ad hoc solution techniques, each specific to one combination of the 13 Allen’s relations. Such techniques require some expertise in Constraint Programming. This is a critical issue for video specialists. In this paper, we design a generic constraint, dedicated to a class of temporal problems that covers this case study, among others. ExistAllen takes as arguments a vector of tasks, a set of disjoint intervals and any of the 2^13 combinations of Allen’s relations. ExistAllen holds if and only if the tasks are ordered according to their indexes and for any task at least one relation is satisfied, between the task and at least one interval. We design a propagator that achieves bound-consistency in O(n + m), where n is the number of tasks and m the number of intervals. This propagator is suited to any combination of Allen’s relations, without any specific tuning. Therefore, using our framework does not require a strong expertise in Constraint Programming. The experiments, performed on real data, confirm the relevance of our approach.
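As a rough illustration of the semantics described above (a check only, not the bound-consistent propagator of the paper), the sketch below tests whether a task interval satisfies at least one of a chosen subset of Allen's relations with at least one interval of a set; the reduced relation subset and data layout are illustrative assumptions.

```python
# Illustrative check of a few of Allen's relations between a task [s, e]
# and candidate intervals; the real ExistAllen propagator does filtering,
# not just checking, and covers all 13 relations.

ALLEN = {
    "before":   lambda s, e, a, b: e < a,
    "meets":    lambda s, e, a, b: e == a,
    "overlaps": lambda s, e, a, b: s < a < e < b,
    "during":   lambda s, e, a, b: a < s and e < b,
    "starts":   lambda s, e, a, b: s == a and e < b,
    "finishes": lambda s, e, a, b: e == b and s > a,
    "equals":   lambda s, e, a, b: s == a and e == b,
}

def exist_allen(task, intervals, relations):
    """True if `task` satisfies at least one relation with at least one interval."""
    s, e = task
    return any(ALLEN[r](s, e, a, b) for (a, b) in intervals for r in relations)

# Example: the task [2, 4] is 'during' the interval [1, 6].
print(exist_allen((2, 4), [(1, 6), (8, 10)], {"during", "meets"}))  # True
```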
Article
Full-text available
J. Carlier, D. Dubois, J. Favrel (rap.), M. Coursoivier (pdt), G. Bel, R. Companys (exa.)
Article
Full-text available
The cumulative scheduling constraint, which enforces the sharing of a finite resource by several tasks, is widely used in constraint-based scheduling applications. Propagation of the cumulative constraint can be performed by several different filtering algorithms, often used in combination. One of the most important and successful of these filtering algorithms is edge-finding. Recent work by Vilím has resulted in a 𝒪(kn log n) algorithm for cumulative edge-finding (where n is the number of tasks and k is the number of distinct capacity requirements), as well as a new related filter, timetable edge-finding, with a complexity of 𝒪(n²). We present a sound 𝒪(n²) filtering algorithm for standard cumulative edge-finding, orthogonal to the work of Vilím; we also show how this algorithm’s filtering may be improved by incorporating some reasoning from extended edge-finding, with no increase in complexity. The complexity of the new algorithm does not strictly dominate that of previous edge-finders for small k, and it sometimes requires more iterations to reach the same fixpoint; nevertheless, results from Project Scheduling Problem Library benchmarks show that in practice this algorithm consistently outperforms earlier edge-finding filters, and remains competitive with timetable edge-finding, despite the latter algorithm’s generally stronger filtering.
Article
Full-text available
Edge-finding and energetic reasoning are well-known filtering rules used in constraint-based disjunctive and cumulative scheduling during the propagation of the resource constraint. In practice, however, edge-finding is used more often than energetic reasoning (because of its lower running time complexity), since the latter requires O(n²) time intervals to be considered (where n is the number of tasks). In order to reduce the number of time intervals in energetic reasoning, the maximum density and minimum slack notions are used as criteria to select the time intervals. The paper proposes a new filtering algorithm for the cumulative resource constraint, called the energetic extended edge-finder, of complexity O(n³). The new algorithm is a hybridization of extended edge-finding and energetic reasoning: more powerful than extended edge-finding and faster than energetic reasoning. It is proven that the new algorithm subsumes the extended edge-finding algorithm. Results on Resource-Constrained Project Scheduling Problems (RCPSP) from the BL set and the PSPLIB library are reported. These results show that in practice the new algorithm is a good trade-off between filtering power and running time on instances where the number of tasks is less than 30.
Article
Full-text available
We investigate cumulative scheduling in uncertain environments, using constraint programming. We detail in this paper the dynamic sweep filtering algorithm of the FlexC global constraint.
Article
Full-text available
Energetic Reasoning (ER) is a powerful filtering algorithm for the Cumulative constraint. Unfortunately, ER is generally too costly to be used in practice. One reason for its poor performance is that many intervals are considered relevant by the ER checker, although most of them should be ignored. In this paper, we provide a sharp characterization that allows the number of intervals to be reduced by a factor of seven. Our experiments show that associating this checker with a Time-Table filtering algorithm leads to promising results.
Conference Paper
Full-text available
A key feature of constraint programming is the ability to design specific search strategies to solve problems. On the contrary, integer programming solvers have used efficient general-purpose strategies since their earliest implementations. We present a new general-purpose search strategy for constraint programming inspired by integer programming techniques and based on the concept of the impact of a variable. The impact measures the importance of a variable for the reduction of the search space. Impacts are learned from the observation of domain reduction during search, and we show how restarting search can dramatically improve performance. Using impacts for solving multiknapsack, magic square, and Latin square completion problems shows that this new criterion for choosing variables and values can outperform classical general-purpose strategies.
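A simplified reading of the impact measure: the impact of an assignment is the relative reduction of the Cartesian product of domain sizes that it causes after propagation. The sketch below recomputes impacts on demand through a placeholder propagate function; the actual strategy learns averaged impacts during search, so this is only an illustration of the criterion.

```python
# Sketch of impact-based variable selection. `propagate(domains, var, value)`
# is a placeholder returning the domains obtained after assigning var=value
# and propagating, or None on failure.
from math import prod

def impact(domains, var, value, propagate):
    before = prod(len(d) for d in domains.values())
    after_domains = propagate(domains, var, value)
    if after_domains is None:
        return 1.0                      # failure: maximal impact
    after = prod(len(d) for d in after_domains.values())
    return 1.0 - after / before

def choose_variable(domains, propagate):
    """Pick the unbound variable with the largest summed impact of its values."""
    unbound = [v for v, d in domains.items() if len(d) > 1]
    return max(unbound,
               key=lambda v: sum(impact(domains, v, a, propagate) for a in domains[v]))
```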
Conference Paper
Full-text available
In this paper, we present a dynamic and adaptive variable ordering heuristic which guides systematic search toward inconsistent or hard parts of a Constraint Satisfaction Problem (CSP). This generic heuristic is able to exploit information about previous states of the search process, whereas traditional dynamic ones only exploit information about the current state. Intuitively, it avoids some thrashing by first instantiating variables involved in the constraints that have frequently participated in dead-end situations. Such information is recorded by associating a weight with each constraint. This weight is increased whenever the associated constraint is violated during search. The extensive experiments that we have conducted show that our conflict-directed approach is the most efficient current one with respect to significant and large classes of academic, random and real-world instances.
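The weighting scheme described here is commonly implemented as the dom/wdeg heuristic; a minimal sketch, with placeholder constraint and variable accessors, is given below.

```python
# Sketch of the conflict-directed (dom/wdeg-style) heuristic: each constraint
# carries a weight, incremented on every failure it causes; branch on the
# variable minimizing |domain| / (sum of weights of its constraints).

weights = {}          # constraint -> weight, implicitly initialized to 1

def on_failure(constraint):
    weights[constraint] = weights.get(constraint, 1) + 1

def weighted_degree(var, constraints_of):
    return sum(weights.get(c, 1) for c in constraints_of(var))

def choose_variable(unbound_vars, domain_size, constraints_of):
    return min(unbound_vars,
               key=lambda v: domain_size(v) / max(1, weighted_degree(v, constraints_of)))
```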
Conference Paper
Full-text available
Super solutions are solutions in which, if a small number of variables lose their values, we are guaranteed to be able to repair the solution with only a few changes. In this paper, we stress the need to extend the super solution framework along several dimensions to make it more useful practically. We demonstrate the usefulness of those extensions on an example from jobshop scheduling, an optimization problem solved through constraint satisfaction. In such a case there is indeed a trade-off between optimality and robustness; however, robustness may be increased without sacrificing optimality.
Article
Full-text available
Edge-finding algorithms for cumulative scheduling are at the core of commercial constraint-based schedulers. This paper shows that Nuijten's edge finder for cumulative scheduling, and its derivatives, are incomplete and use an invalid dominance rule. The paper then presents a new edge-finding algorithm for cumulative resources which runs in time O(n²k), where n is the number of tasks and k the number of different capacity requirements of the tasks. The new algorithm is organized in two phases and first uses dynamic programming to precompute the innermost maximization in the edge-finder specification. Finally, this paper also proposes the first extended edge-finding algorithm that runs in time O(n²k), improving the running time of available algorithms.
Article
Full-text available
Robust Optimization (RO) is a modeling methodology, combined with computational tools, to process optimization problems in which the data are uncertain and are only known to belong to some uncertainty set. The paper surveys the main results of RO as applied to uncertain linear, conic quadratic and semidefinite programming. For these cases, computationally tractable robust counterparts of uncertain problems are explicitly obtained, or good approximations of these counterparts are proposed, making RO a useful tool for real-world applications. We discuss some of these applications, specifically: antenna design, truss topology design and stability analysis/synthesis in uncertain dynamic systems. We also describe a case study of 90 LPs from the NETLIB collection. The study reveals that the feasibility properties of the usual solutions of real world LPs can be severely affected by small perturbations of the data and that the RO methodology can be successfully used to overcome this phenomenon.
Conference Paper
Full-text available
Robust search procedures are a central component in the design of black-box constraint-programming solvers. This paper proposes activity-based search, the idea of using the activity of variables during propagation to guide the search. Activity-based search was compared experimentally to impact-based search and the WDEG heuristics. Experimental results on a variety of benchmarks show that activity-based search is more robust than other heuristics and may produce significant improvements in performance.
Article
Full-text available
This paper introduces LocalSolver 1.x, a black-box local-search solver for general 0-1 programming. This software allows OR practitioners to focus on the modeling of the problem using a simple formalism, and then to defer its actual resolution to a solver based on efficient and reliable local-search techniques. Started in 2007, the goal of the LocalSolver project is to offer a model-and-run approach to combinatorial optimization problems which are out of reach of existing black-box tree-search solvers (integer or constraint programming). Having outlined the modeling formalism and the main technical features behind LocalSolver, its effectiveness is demonstrated through an extensive computational study. The version 1.1 of LocalSolver can be freely downloaded at http://www.localsolver.com and used for educational, research, or commercial purposes.
Article
Full-text available
A robust approach to solving linear optimization problems with uncertain data was proposed in the early 1970s and has recently been extensively studied and extended. Under this approach, we are willing to accept a suboptimal solution for the nominal values of the data in order to ensure that the solution remains feasible and near optimal when the data change. A concern with such an approach is that it might be too conservative. In this paper, we propose an approach that attempts to make this trade-off more attractive; that is, we investigate ways to decrease what we call the price of robustness. In particular, we flexibly adjust the level of conservatism of the robust solutions in terms of probabilistic bounds of constraint violations. An attractive aspect of our method is that the new robust formulation is also a linear optimization problem. Thus we naturally extend our methods to discrete optimization problems in a tractable way. We report numerical results for a portfolio optimization problem, a knapsack problem, and a problem from the NETLIB library.
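For a single uncertain constraint sum_j a_j x_j <= b, where each coefficient may deviate by at most d_j and at most Γ coefficients deviate at once, the protection term can be evaluated for a fixed solution and integer Γ by summing the Γ largest deviations. The sketch below is a simplified reading of that idea with illustrative names, not the full linear reformulation of the paper.

```python
# Sketch: protection term of a robust constraint sum(a_j * x_j) <= b when at
# most `gamma` coefficients deviate by at most d_j (integer gamma, fixed x).

def protection(x, deviations, gamma):
    contributions = sorted((d * abs(xj) for xj, d in zip(x, deviations)), reverse=True)
    return sum(contributions[:gamma])

def robust_feasible(a, x, b, deviations, gamma):
    nominal = sum(aj * xj for aj, xj in zip(a, x))
    return nominal + protection(x, deviations, gamma) <= b

# Example: nominal 1*3 + 2*2 = 7; worst single deviation adds max(0.5*3, 1*2) = 2,
# so the protected left-hand side is 9 <= 10.
print(robust_feasible(a=[1, 2], x=[3, 2], b=10, deviations=[0.5, 1], gamma=1))  # True
```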
Conference Paper
Full-text available
To improve solution robustness, we introduce the concept of super solutions to constraint programming. An (a, b)-super solution is one in which, if a variables lose their values, the solution can be repaired by assigning new values to these variables and to at most b other variables. Super solutions are a generalization of supermodels in propositional satisfiability. We focus in this paper on (1,0)-super solutions, where if one variable loses its value, we can find another solution by re-assigning this variable with a new value. To find super solutions, we explore methods based both on reformulation and on search. Our reformulation methods transform the constraint satisfaction problem so that the only solutions are super solutions. Our search methods are based on a notion of super consistency. Experiments show that super MAC, a novel search-based method, shows considerable promise. When super solutions do not exist, we show how to find the most robust solution. Finally, we extend our approach from robust solutions of constraint satisfaction problems to constraint optimization problems.
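A direct way to test the (1,0) case on a candidate solution is to try, for every variable, to repair it alone. The sketch below assumes a generic satisfies(assignment) checker and explicit domains, both hypothetical.

```python
# Sketch: check whether `solution` is a (1,0)-super solution, i.e. every
# variable has an alternative value that keeps all constraints satisfied
# without touching any other variable. `satisfies` is a placeholder checker.

def is_10_super_solution(solution, domains, satisfies):
    for var, value in solution.items():
        repairable = any(
            satisfies({**solution, var: alt})
            for alt in domains[var] if alt != value
        )
        if not repairable:
            return False
    return True

# Example: x + y == 3 over domains {0..3}. Changing x alone always breaks the
# constraint, so (x=1, y=2) is not a (1,0)-super solution.
sat = lambda a: a["x"] + a["y"] == 3
print(is_10_super_solution({"x": 1, "y": 2}, {"x": range(4), "y": range(4)}, sat))  # False
```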
Conference Paper
This paper considers the combination of berth and crane allocation problems in container terminals. We propose a novel approach based on constraint programming which is able to model many realistic operational constraints. The costs for berth allocation, crane allocation, time windows, breaks and transition times during gang movements are optimized simultaneously. The model is based on a resource view where gangs are consumed by vessel activities. Side constraints are added independently around this core model. The model is richer than the state of the art in the operations research community. Experiments show that the model produces solutions with a cost gap of 1/10 (7.8%) to 1/5 (18.8%) compared to an ideal operational setting where operational constraints are ignored.
Article
We have described a system for reasoning about temporal intervals that is both expressive and computationally effective. The representation captures the temporal hierarchy implicit in many domains by using a hierarchy of reference intervals, which precisely control the amount of deduction performed automatically by the system. This approach is particularly useful in domains where temporal information is imprecise and relative, and techniques such as dating are not possible.
Chapter
In previous chapters the performance of an approximate algorithm has been analyzed using a worst case point of view. In this chapter this condition is relaxed and the probabilistic analysis of an algorithm is performed by considering either the average case or the behavior in almost all cases.
Article
This paper introduces a family of synchronized sweep-based filtering algorithms for handling scheduling problems involving resource and precedence constraints. The key idea is to filter all constraints of a scheduling problem in a synchronized way in order to scale better. In addition to normal filtering mode, the algorithms can run in greedy mode, in which case they perform a greedy assignment of start and end times. The filtering mode achieves a significant speed-up over the decomposition into independent cumulative and precedence constraints, while the greedy mode can handle up to 1 million tasks with 64 resource constraints and 2 million precedences. These algorithms were implemented in both CHOCO and SICStus.
Article
Following a line of approach recently applied to the 0-1 integer programming problem with some success by Egon Balas, the algorithm of this paper is based upon an underlying tree-search structure upon which a series of tests is superimposed to exclude large portions of the tree of all possible 0-1 solutions from examination. In our method, the specific design of the enumeration and tests, supplemented by the use of a special type of constraint called a "surrogate constraint," results in an algorithm that appears to be quite efficient in relation to other algorithms currently available for solving the 0-1 integer programming problem. Early indications of efficiency must, however, be regarded as suggestive rather than conclusive, due to the limited range and size of problems so far examined. Following the analytical development of the method, three example problems are solved in detail with the Multiphase-Dual Algorithm to illustrate various aspects of its application. An extension of the algorithm to the general integer programming problem in bounded variables is briefly sketched in a concluding section.
Article
We present a set of benchmark instances for the evaluation of solution procedures for single- and multi-mode resource-constrained project scheduling problems. The instances have been systematically generated by the standard project generator ProGen. They are characterized by the input parameters of ProGen. The entire benchmark set, including its detailed characterization and the best solutions known so far, is available on a public ftp site. Hence, researchers can download the benchmark sets they need for the evaluation of their algorithms. Additionally, they can make available new results. Depending on the progress made in the field, the instance library will be continuously enlarged and new results will be made accessible. This should be a valuable and driving source for further improvements in the area of project scheduling.
Article
This chapter discusses how constraint reasoning involves various types of techniques to tackle the inherent intractability of the problem of satisfying a set of constraints. Constraint propagation is one of those types of techniques. It is central to the process of solving a constraint problem and indispensable for constraint reasoning. The chapter explains that constraint propagation embeds any reasoning which consists in explicitly forbidding values or combinations of values for some variables of a problem, because a given subset of its constraints cannot be satisfied otherwise. It is in constraint reasoning that this concept shows its most accomplished form: there is no other field in which the concept of constraint propagation appears in such a variety of forms, and in which its characteristics have been analyzed. The chapter formalizes all constraint propagation approaches within a unifying framework and discusses the main existing types of constraint propagation.
Chapter
Once we have developed an algorithm (q.v.) for solving a computational problem and analyzed its worst-case time requirements as a function of the size of its input (most usefully, in terms of the O-notation; see ALGORITHMS, ANALYSIS OF), it is inevitable to ask the question: "Can we do better?" In a typical problem, we may be able to devise new algorithms for the problem that are more and more efficient. But eventually, this line of research often seems to hit an invisible barrier, a level beyond which improvements are very difficult, seemingly impossible, to come by. After many unsuccessful attempts, algorithm designers inevitably start to wonder if there is something inherent in the problem that makes it impossible to devise algorithms that are faster than the current one. They may try to develop mathematical techniques for proving formally that there can be no algorithm for the given problem which runs faster than the current one. Such a proof would be valuable, as it would suggest that it is futile to keep working on improved algorithms for this problem, that further improvements are certainly impossible. The realm of mathematical models and techniques for establishing such impossibility proofs is called computational complexity.
Conference Paper
Overload checking is an important method for unary as well as for cumulative resource constraints in constraint-based scheduling, as it tests for a sufficient inconsistency property. While an O(n log n) algorithm is known for unary resource constraints, to our knowledge no such algorithms have been established to date for overload checking in cumulative constraints on n tasks. In this paper, an O(n log n) overload checking algorithm is presented, as well as its application to a more specific problem domain: the non-overlapping placement of n rectangles in a two-dimensional area. There, the runtime complexity of overload checking is O(n³ log n).
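The property being tested can be stated naively: for every interval, the tasks that must lie entirely inside it cannot require more energy than the resource offers there. The cubic-time sketch below illustrates the test only; it is not the O(n log n) Theta-tree algorithm of the paper, and the (est, lct, p, c) tuple layout is an assumption.

```python
# Naive cumulative overload check (cubic time, for illustration only; the
# paper's algorithm achieves O(n log n) with a Theta-tree).

def overload_check(tasks, capacity):
    """tasks: list of (est, lct, p, c). Return False on an overloaded interval."""
    for (t1, _, _, _) in tasks:
        for (_, t2, _, _) in tasks:
            if t2 <= t1:
                continue
            inside = [(p, c) for (est, lct, p, c) in tasks if est >= t1 and lct <= t2]
            energy = sum(p * c for p, c in inside)
            if energy > capacity * (t2 - t1):
                return False
    return True

# Three unit-demand tasks of length 2 squeezed into [0, 4] on capacity 1: overload.
print(overload_check([(0, 4, 2, 1), (0, 4, 2, 1), (0, 4, 2, 1)], capacity=1))  # False
```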
Article
We consider NP-complete optimization problems at the level of computing their optimal value, and define a class of functions called OptP to capture this level of structure. We show that TRAVELING SALESPERSON and KNAPSACK are complete for OptP, and that CLIQUE and COLORING are complete for a subclass of OptP. These results show a deeper level of structure in these problems than was previously known. We also show that OptP is closely related to FP^SAT, the class of functions computable in polynomial time with an oracle for NP. This allows us to quantify exactly “how much” NP-completeness is in these problems. In particular, in this measure, we show that TRAVELING SALESPERSON is strictly harder than CLIQUE and that CLIQUE is strictly harder than BIN PACKING. A further result is that an OptP-completeness result implies NP-, Dᵖ-, and Δ₂ᵖ-completeness results, thus tying these four classes closely together.
Conference Paper
In this article, we explore how neighborhoods for the Large Neighborhood Search (LNS) framework can be automatically defined by the volume of propagation of our Constraint Programming (CP) solver. Thus we can build non-trivial neighborhoods which will not be reduced to zero by propagation and whose size will be close to a parameter of the search. Furthermore, by looking at the history of domain reductions, we are able to deduce even better neighborhoods. This idea is validated by numerous experiments with the car sequencing problem. The result is a powerful and completely automatic method that is able to beat our hand-written neighborhoods both in terms of performance and of stability. This is in fact the first time for us that a completely generic code is better than a hand-written one.
Article
RealPaver is an interval software for modeling and solving nonlinear systems. Reliable approximations of continuous or discrete solution sets are computed using Cartesian products of intervals. Systems are given by sets of equations or inequality constraints over integer and real variables. Moreover, they may have different natures, being square or nonsquare, sparse or dense, linear, polynomial, or involving transcendental functions. The modeling language permits stating constraint models and tuning parameters of solving algorithms which efficiently combine interval methods and constraint satisfaction techniques. Several consistency techniques (box, hull, and 3B) are implemented. The distribution includes C sources, executables for different machine architectures, documentation, and benchmarks. The portability is ensured by the GNU C compiler.
Article
Many real-world scheduling problems are subject to change, and scheduling solutions should be robust to those changes. We consider a single-machine scheduling problem where the processing time of each activity is characterized by a normally distributed random variable, with flowtime as the main solution criterion. The objective is to find the β-robust schedule—the schedule that minimizes the risk of the flowtime exceeding a threshold. We show how to represent this problem as a constraint model, explicitly representing the uncertainty and robustness as input parameters and objectives, and enabling the uncertainty to propagate using constraint propagation. Specifically, we develop three models (primal, dual and hybrid), and we show the effect of dominance rules on the search space.
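With independent, normally distributed processing times, the flowtime of a fixed sequence is itself normal, so the exceedance probability can be evaluated in closed form. The sketch below assumes that setting (it is not the constraint model of the paper) and uses the standard normal CDF via erf.

```python
# Sketch: probability that the flowtime of a fixed single-machine sequence
# exceeds a threshold, with independent normal processing times.
from math import erf, sqrt

def flowtime_exceedance(jobs, threshold):
    """jobs: list of (mu, sigma) in processing order. Flowtime = sum of completion times."""
    n = len(jobs)
    # the job at position i (0-based) appears in the completion times of n - i jobs
    mean = sum((n - i) * mu for i, (mu, _) in enumerate(jobs))
    var  = sum(((n - i) * sigma) ** 2 for i, (_, sigma) in enumerate(jobs))
    z = (threshold - mean) / sqrt(var)
    cdf = 0.5 * (1.0 + erf(z / sqrt(2.0)))
    return 1.0 - cdf          # P(flowtime > threshold)

# Two jobs with mean durations 3 and 5: expected flowtime is 2*3 + 1*5 = 11.
print(round(flowtime_exceedance([(3.0, 1.0), (5.0, 0.5)], threshold=13.0), 3))
```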
Article
The problem of representation and handling of constraints is here considered, mainly for picture processing purposes. A systematic specification and utilization of the available constraints could significantly reduce the amount of search in picture recognition. On the other hand, formally stated constraints can be embedded in the syntactic productions of picture languages. Only binary constraints are treated here, but they are represented in full generality as binary relations. Constraints among more than two variables are then represented as networks of simultaneous binary relations. In general, more than one equivalent (i.e., representing the same constraint) network can be found: a minimal equivalent network is shown to exist, and its computation is shown to solve most practical problems about constraint handling. No exact solution for this central problem was found. Anyway, constraints are treated algebraically, and the solution of a system of linear equations in this algebra provides an approximation of the minimal network. This solution is then proved exact in special cases, e.g., for tree-like and series-parallel networks and for classes of relations for which a distributive property holds. This latter condition is satisfied in cases of practical interest.
Article
The purpose of this paper is to propose models for project scheduling when there is considerable uncertainty in the activity durations, to the extent that the decision maker cannot with confidence associate probabilities with the possible outcomes of a decision. Our modeling techniques stem from robust optimization, which is a theoretical framework that enables the decision maker to produce solutions that will have a reasonably good objective value under any likely input data scenario. We develop and implement a scenario-relaxation algorithm and a scenario-relaxation-based heuristic. The first algorithm produces optimal solutions but requires excessive running times even for medium-sized instances; the second algorithm produces high-quality solutions for medium-sized instances and outperforms two benchmark heuristics.
Article
This paper describes the design, implementation, and applications of the constraint logic language cc(FD). cc(FD) is a declarative nondeterministic constraint logic language over finite domains based on the cc framework [33], an extension of the CLP scheme [21]. Its constraint solver includes (non-linear) arithmetic constraints over natural numbers which are approximated using domain and interval consistency. The main novelty of cc(FD) is the inclusion of a number of general-purpose combinators, in particular cardinality, constructive disjunction, and blocking implication, in conjunction with new constraint operations such as constraint entailment and generalization. These combinators significantly improve the operational expressiveness, extensibility, and flexibility of CLP languages and allow issues such as the definition of non-primitive constraints and disjunctions to be tackled at the language level. The imple...
  • De Letort
The perspectives that emerge from this approach are the following. From a practical point of view, it would be interesting to adapt the "sweep + precedences" algorithm of Letort et al. [LCB14].
Constraint-Based Scheduling: Applying Constraint Programming to Scheduling Problems
  • Philippe Baptiste
  • Claude Le Pape
  • Wim Nuijten
Philippe Baptiste, Claude Le Pape, and Wim Nuijten. Constraint-Based Scheduling: Applying Constraint Programming to Scheduling Problems. International Series in Operations Research and Management Science. Kluwer, 2001.
Slack-based techniques for robust schedules
  • Andrew J Davenport
  • Christophe Jefflot
  • J. Christopher Beck
Andrew J. Davenport, Christophe Jefflot, and J. Christopher Beck. Slack-based techniques for robust schedules. In European Conference on Planning, pages 7-18, 2001.
A new characterization of relevant intervals for energetic reasoning
  • Alban Derrien
  • Thierry Petit
Alban Derrien and Thierry Petit. A new characterization of relevant intervals for energetic reasoning. In Principles and Practice of Constraint Programming - 20th International Conference, CP 2014, Lyon, France, September 8-12, 2014, Proceedings, pages 289-297, 2014.
A declarative paradigm for robust cumulative scheduling
  • Alban Derrien
  • Thierry Petit
  • Stéphane Zampelli
Alban Derrien, Thierry Petit, and Stéphane Zampelli. A declarative paradigm for robust cumulative scheduling. In Principles and Practice of Constraint Programming - 20th International Conference, CP 2014, Lyon, France, September 8-12, 2014, Proceedings, pages 298-306, 2014.
Scheduling under time and resource constraints
  • Jacques Erschler
  • Pierre Lopez
  • Catherine Thuriot
Jacques Erschler, Pierre Lopez, and Catherine Thuriot. Scheduling under time and resource constraints. In Proc. of Workshop on Manufacturing Scheduling, 11th IJCAI, 1989.