Chapter

Optimization Methods for the Single‐Machine Problem


Abstract

Introduction
Adjacent Pairwise Interchange Methods
A Dynamic Programming Approach
Dominance Properties
A Branch and Bound Approach
Summary
References
Exercises


... Strategies that define how to comply with such demands are the subject of study in many research areas, such as operational research. This problem has been named the job-shop scheduling problem [1]. In a few words, the aim of the job-shop scheduling problem is to organize a set of tasks over a set of resources in such a way that some criterion measure, such as makespan or cost, is optimized [1]. Many scheduling problems can be viewed as optimization problems subject to constraints, and according to [2] they can also be viewed as planning problems. There are many known techniques for solving the job-shop scheduling or planning problem, such as heuristics, linear programming, and other optimization algorithms [1]. As industrial systems become increasingly complex, translating their behaviour into the mathematical language commonly used by these algorithms is a very hard task and hinders the algorithms' performance. ...
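The criterion measures named in this excerpt are easy to illustrate. Below is a minimal Python sketch (function names are mine, not from the cited work) computing completion times and makespan for a fixed task sequence on a single resource:

```python
def completion_times(processing_times):
    """Completion time of each task when tasks run back to back on one resource."""
    out, t = [], 0
    for p in processing_times:
        t += p
        out.append(t)
    return out

def makespan(processing_times):
    """Makespan: completion time of the last task in the sequence."""
    return completion_times(processing_times)[-1] if processing_times else 0
```

On a single resource the makespan of a permutation is just the sum of processing times; the sequence only matters for other measures such as flow time or tardiness.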
Conference Paper
Finite-state automata and Supervisory Control Theory have been used to model and solve job-shop scheduling and planning problems. However, even if it seems easier to work with DFA and SCT, this solution suffers from "the curse of dimensionality", which can cause state explosion as the system becomes bigger and more complex. This paper presents a set of sufficient conditions that allow working with abstractions of the supervisor, instead of the supervisor itself, as the search universe for solving a planning problem. The abstraction is the natural projection of the supervisor onto the set of controllable events, and it should satisfy the observer property. This abstraction is smaller than the original automaton, which reduces the search universe for the optimization algorithms. We also present a set of conditions on the model of the system and the specifications that result in satisfaction of the observer property.
... We refer to these models simply as lognormal. For both distributions, we study how to select the best tour and due date for the minimization of fundamental safe scheduling objectives [2, 3]. Our lognormal analysis includes models with linear association. (Linear association occurs when positive independent random variables are all multiplied by an additional positive independent random variable [1, 3]. Linearly associated random variables are positively correlated.) ...
... Although travel time data is not included in their analysis, we assume that the lognormal is also a better fit for travel time than the normal. Furthermore, the lognormal distribution can provide a better fit for the convolution of positive random variables than the normal: as n grows large, the distribution of the sum of n independent, positive random variables tends to lognormal [1, 3]; see also [22]. [1] and [3] refer to this as the lognormal central limit theorem. ...
Article
Full-text available
We address the stochastic traveling salesperson problem (TSP) with distances measured by travel time. We study how to select the best tour and due date for the minimization of fundamental safe scheduling objectives. Model 1 requires minimizing the due date subject to a service level constraint. Model 2 addresses a weighted trade-off between the due date and the expected tardiness. Both models require safety time, and therefore the distribution of the tour length is important. In alternate formulations the due date is given, and we maximize the service level for Model 1 or minimize the expected tardiness for Model 2. In an unpublished working paper (available as a web resource), we addressed normal travel times. In this paper we recap some of those results and extend them to lognormal travel times. In general, we show that Alternate 1 is equivalent to Model 1, but Alternate 2 is different from Model 2. For the normal distribution, we solve Model 2 optimally, and Model 1 with service level 50% and higher, by solving a few deterministic TSP-derived models. For other instances, including all lognormal models, we provide effective heuristics and tight performance guarantee certificates. As a by-product of our TSP analysis, we obtain comparable results for the shortest route problem.
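Model 1 above (minimize the due date subject to a service level constraint) can be approximated by Monte Carlo simulation. The sketch below assumes independent lognormal leg times, ignoring the linear association the paper also treats; all names and defaults are illustrative, not from the paper:

```python
import random

def due_date_for_service_level(mus, sigmas, service_level,
                               n_samples=20000, seed=1):
    """Estimate the smallest due date d with P(tour length <= d) >= service_level,
    for a tour whose leg travel times are independent lognormals.
    mus/sigmas are the underlying normal parameters of each leg."""
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.lognormvariate(mu, sg) for mu, sg in zip(mus, sigmas))
        for _ in range(n_samples)
    )
    # Empirical quantile of the simulated tour-length distribution.
    idx = min(n_samples - 1, int(service_level * n_samples))
    return totals[idx]
```

Raising the service level raises the required due date; the gap between the two is the safety time the abstract refers to.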
... Scheduling problems are in general nontrivial, and exhaustive enumeration of the set of scheduling solutions is usually not efficient. Scheduling problems are generally complex, large-scale, constrained, and multiobjective in nature, and classical operational research techniques are often inadequate for solving them effectively [2]. With the advent of computational intelligence, there is renewed interest in solving scheduling problems through Swarm Intelligence (SI) based techniques. ...
... The study of the Single Machine Scheduling Problem (SMSP) is recognized as very important for several technological and economic reasons, probably the most relevant of which is that good solutions to this problem provide relevant support for managing and modeling the behavior of complex systems. In these systems it is important to understand the working of their components, and quite often the SMSP appears as an elementary component in a larger scheduling problem [1] [2]. Sometimes the basic SMSP is solved independently, and the results are then incorporated into the larger and more complex problem. ...
... The selection of a solution component from N(sp) is guided by a stochastic mechanism, which is biased by the pheromone associated with each of the elements of N(sp). The rule for the stochastic choice of solution components varies across the different proposed ACO algorithms but, in all of them, it is inspired by the Goss model (the experimental setup for the double bridge experiment) of the behavior of real ants: assuming that at a given moment in time m1 ants have used the first bridge and m2 the second one, the probability p1 for an ant to choose the first bridge is given by [4]: p1 = (m1 + k)^h / ((m1 + k)^h + (m2 + k)^h) (2), where the parameters k and h are to be fitted to the experimental data. Monte Carlo simulations showed a very good fit for k ≈ 20 and h ≈ 2. ...
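The Goss model probability described above is simple to compute; a sketch with the fitted values k = 20 and h = 2 as defaults:

```python
def goss_p1(m1, m2, k=20, h=2):
    """Goss model: probability that the next ant chooses the first bridge,
    given m1 prior crossings of bridge 1 and m2 of bridge 2."""
    a = (m1 + k) ** h
    b = (m2 + k) ** h
    return a / (a + b)
```

With no prior crossings the choice is a coin flip; as one bridge accumulates crossings, the probability of choosing it grows, which is the positive-feedback mechanism ACO borrows.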
Article
Full-text available
This paper describes some development issues for ACS-based software tools to support the decision-making process and solve the problem of generating a sequence of jobs that minimizes the total weighted tardiness for a set of jobs to be processed on a single machine. The performance of an Ant Colony System (ACS) based algorithm is validated with benchmark problems available in the OR-Library. The obtained results were compared with the optimal (or, in some cases, best available) results and permit conclusions about ACS efficiency and effectiveness. The ACS performance and its statistical significance were evaluated.
... This section will present rules based on a set of factors consisting of the job processing times, the due dates, and combinations of the processing times and due dates. It is to be noted that an extensive body of theory considers the job processing times and due dates for the single machine tardiness sequencing problem, but no unique result has emerged [1]. To conclude the treatment of the single machine tardiness problem with some specialized results concerning optimal sequences, the following results hold. ...
... Step 4. Schedule jobs in non-decreasing order of TP_i (TP_[1] ≤ TP_[2] ≤ … ≤ TP_[n]). Step 5. Choose the final schedule with the smallest total tardiness among the m+1 schedules and stop. ...
... Let j = 1. Step 2. Schedule jobs in non-decreasing order of d_ij (d_[1]j ≤ d_[2]j ≤ … ≤ d_[n]j). Step 3. If j = m, go to Step 4; otherwise let j = j + 1 and go to Step 2. ...
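The steps quoted above reduce to sorting by a priority index, evaluating total tardiness, and keeping the best candidate schedule. A hedged Python sketch (the (processing_time, due_date) job representation is my own choice, not the paper's):

```python
def total_tardiness(jobs):
    """jobs: list of (processing_time, due_date) pairs, processed in list order."""
    t = total = 0
    for p, d in jobs:
        t += p                      # completion time of this job
        total += max(0, t - d)      # tardiness contribution
    return total

def best_schedule(schedules):
    """Step 5 in spirit: keep the candidate with the smallest total tardiness."""
    return min(schedules, key=total_tardiness)
```

For example, sorting by due date (an EDD candidate) and comparing against other candidates mirrors the "choose the best of m+1 schedules" step.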
Article
Full-text available
In this paper, tardiness flow shop models with intermediate due dates were considered. The flow shop models consist of a set of jobs, each having a number of operations, while each operation is performed on a single machine. All the jobs are considered to have the same unidirectional precedence order. In the tardiness flow shop models with intermediate due dates, which we call the generalized tardiness flow shop models, there exists a due date associated with the completion of each operation, and we want to find a schedule that minimizes the total tardiness of the jobs. This is a more general version of the tardiness flow shop in the sense that, by assigning a large value to each of the intermediate due dates, we can obtain the traditional flow shop models. Since the generalized tardiness flow shop models are NP-hard, a set of heuristic sequencing rules for finding the best permutation schedule for such problems is proposed. We conducted an extensive computational experiment using randomly generated test problems to evaluate the efficiency of the proposed rules in obtaining near-optimal solutions. The efficiency of the rules was evaluated, and the rules with better solutions were designated and reported.
... Scheduling problems are generally complex, large-scale, constrained, and multi-objective in nature, and classical operational research techniques are often inadequate for solving them effectively [8]. With the advent of computational intelligence, there is renewed interest in solving scheduling problems through Swarm Intelligence (SI) based techniques. ...
... The study of the Single Machine Scheduling Problem (SMSP) is recognized as very important for several technological and economic reasons, probably the most relevant of which is that good solutions to this problem provide relevant support for managing and modeling the behavior of complex systems. In these systems it is important to understand the working of their components, and quite often the SMSP appears as an elementary component in a larger scheduling problem [7] [8]. Sometimes the basic SMSP is solved independently, and the results are then incorporated into the larger and more complex problem. ...
... where ρ ∈ [0,1] is the pheromone decay coefficient and τ0 is the initial trail intensity (equation 6): τ0 = 1/(n · WT) (6), where WT is the total weighted tardiness for a sequence generated by the EDD rule and n is the number of jobs. The main goal of the local update rule is to make the decision of appending job j at position i less desirable to the other ants, so that the exploration of different sequences is favored [8]. ...
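A sketch of the ACS-style local update and the initial trail intensity in the form the excerpt implies (τ0 built from n and the EDD sequence's total weighted tardiness); the symbol-to-argument mapping is mine:

```python
def initial_trail(n_jobs, wt_edd):
    """Initial trail intensity tau0 = 1 / (n * WT), with WT the total weighted
    tardiness of an EDD sequence (the form implied by the excerpt)."""
    return 1.0 / (n_jobs * wt_edd)

def local_update(tau, rho, tau0):
    """ACS local pheromone update: decay the trail toward tau0, making the
    just-taken decision less attractive to the ants that follow."""
    return (1 - rho) * tau + rho * tau0
```

Repeated application pulls any trail value geometrically back toward τ0, which is what keeps several ants from building the same sequence in one iteration.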
Conference Paper
The paper introduces an approach to solve the problem of generating a sequence of jobs that minimizes the total weighted tardiness for a set of jobs to be processed on a single machine. An Ant Colony System based algorithm is validated with benchmark problems available in the OR-Library. The obtained results were compared with the best available results and were found to be near optimal. The computational results obtained allowed conclusions about the approach's efficiency and effectiveness.
... If delaying this request causes the non-conformance to increase beyond that given in the SLA, then the request is assigned a penalty p_k; otherwise it is assigned a penalty of 0. This penalty assignment (pa_j) is performed at each end server upon arrival of every request, so each queued request has an assigned penalty, which is charged if it does not meet the response time. Our multi-class, multi-server, percentile penalty-based scheduling problem is thus converted to the well-investigated multi-class, single machine scheduling problem, in which the penalty charged depends on the request completion time [7], shown in Equation (B) of Algorithm 3. Determining the schedule that minimizes the total penalty for the single server in Equation (C) of Algorithm 3 is known to be NP-hard [7]; typically, such a problem is solved with heuristics based on neighborhood searches [8]. The neighborhood search method we have chosen is Simulated Annealing (SA) [5]. ...
... At the end, we choose the schedule with the lowest penalty [5]. We investigate two methods of neighborhood search in simulated annealing, namely last insertion and pairwise interchange [8]; we have selected them for the computational overhead they introduce. They differ in the way of obtaining the next (neighbor) schedule. ...
... We use simulation to evaluate the proposed scheduling algorithms. A large number of both system parameters (such as request rates, penalties p k , conformance thresholds, number of data centers, classes, etc.) as well as algorithm features (such as length of conformance interval, number of subintervals, iterations allowed in simulated annealing, tabu search [8] technique, etc.) have an effect on how the algorithms perform and are clearly of interest. We report such results in [9]. ...
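The two neighborhood moves named in these excerpts (last insertion and pairwise interchange) can be sketched as follows; the list-of-requests schedule representation is assumed, not taken from the paper:

```python
import random

def pairwise_interchange(schedule, rng):
    """Neighbor obtained by swapping two randomly chosen positions."""
    s = list(schedule)
    i, j = rng.sample(range(len(s)), 2)
    s[i], s[j] = s[j], s[i]
    return s

def last_insertion(schedule, rng):
    """Neighbor obtained by moving one randomly chosen request to the end."""
    s = list(schedule)
    s.append(s.pop(rng.randrange(len(s))))
    return s
```

Both moves preserve the multiset of requests and touch at most two positions, which is why their evaluation overhead inside simulated annealing stays small.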
Conference Paper
Full-text available
We consider geographically distributed data centers forming a collectively managed cloud computing system hosting multiple applications, each subject to Service Level Agreements (SLA). The Service Level Agreements for each application require the response time of a certain percentile of the input requests to be less than a specified value, with the non-conforming requests being charged a penalty. We present a novel approach of heuristic-based request scheduling at each server, in each of the geographically distributed data centers, to globally minimize the penalty charged to the cloud computing system. We evaluate two variants of our heuristic-based approach, one based on the simulated annealing method of neighborhood searches and another based on gi-FIFO scheduling, which has been analytically proven to be the best schedule for percentile goals in a single machine, multi-class problem. We also compare our approaches with First In First Out (FIFO) and Weighted Round Robin (WRR) scheduling policies.
... More importantly, the number of possible schedules generally grows rapidly as the number of activities and processes increases (SLACK et al., 1999; ARENALES et al., 2007). Production scheduling contributes to the proper ordering of the tasks to be executed; to this end, a fraction of time and the number of allocated machines are considered, i.e., it concerns determining the place and period of execution of each operation, which optimizes the production process (LAWLER, 1989; BAKER; TRIETSCH, 2009). As for the methodology of this research, it is a bibliographic analysis which, according to Vergara (1998), is developed with the aim of providing a general, approximate overview of a given fact. ...
... Many real-world problems are full of uncertainty, which makes it inevitable to devise sophisticated techniques in which deterministic ones serve as a basis. Although task durations are generally uncertain, scholars have frequently suppressed that uncertainty when stating the problem (Baker & Trietsch, 2009). In the context of job shop scheduling, the processing times of work pieces, the frequency of machine faults, and the intervals between orders often have a fluctuating nature, and managers must make decisions without full knowledge of the coefficients. For instance, the exponential distribution may be an appropriate model for arrival processes, processing times, and sometimes waiting times (Baker & Trietsch, 2009). Gua et al. (1984) employed an expected value model (EVM) to deal with the stochastic job shop scheduling problem (SJSSP). ...
Article
Full-text available
When aiming at improving efficiency and reducing cost in manufacturing environments, production scheduling can play an important role. Although a common workshop is full of uncertainties, researchers using mathematical programs have mainly focused on deterministic problems. After briefly reviewing and discussing popular modeling approaches in the field of stochastic programming, this paper proposes a new approach based on utility theory for a certain range of problems and under some practical assumptions. Expected utility programming, the proposed approach, is compared with the other well-known methods, and its meaningfulness and usefulness are illustrated via numerical examples and a real case.
... Once a new population has been generated through the crossover and mutation operators, a Biased Random Sampling (BRS) search procedure [35] is applied in order to search for better solutions in the neighbourhood of the best-performing chromosomes created so far. To this end, a sub-population of size N_best < N_pop, made up of the best individuals obtained after each generation, is selected. ...
... The parameter α is used to control how the probability decreases when moving from one group of Ω to the next. Following Baker and Trietsch [35], a value of α = 0.8 has been selected. Thus, supposing G = 3, the first element of the new sub-chromosome Ω_k will be drawn from Ω according to probabilities 0.410, 0.328, and 0.262 for the first, second, and third group of Ω, respectively. ...
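The probabilities quoted above (0.410, 0.328, 0.262 for α = 0.8 and G = 3) follow from normalizing geometrically decaying weights; a short sketch:

```python
def group_probabilities(alpha, n_groups):
    """Per-group selection probabilities decaying geometrically by alpha:
    weight alpha**g for group g, normalized to sum to 1."""
    weights = [alpha ** g for g in range(n_groups)]
    total = sum(weights)
    return [w / total for w in weights]
```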
Article
Full-text available
Production processes in Cellular Manufacturing Systems (CMS) often involve groups of parts sharing the same technological requirements in terms of tooling and setup. The issue of scheduling such parts through a flow-shop production layout is known as the Flow-Shop Group Scheduling (FSGS) problem or, when setup times are sequence-dependent, the Flow-Shop Sequence-Dependent Group Scheduling (FSDGS) problem. This paper addresses the FSDGS issue, proposing a hybrid metaheuristic procedure integrating features from Genetic Algorithms (GAs) and Biased Random Sampling (BRS) search techniques with the aim of minimizing the total flow time, i.e., the sum of completion times of all jobs. A well-known benchmark of test cases, entailing problems with two, three, and six machines, is employed both for tuning the relevant parameters of the developed procedure and for assessing its performance against two metaheuristic algorithms recently presented in the literature. The obtained results and a properly arranged ANOVA analysis highlight the superiority of the proposed approach in tackling the scheduling problem under investigation.
... Dynamic programming (DP) is a useful general optimization technique for making a sequence of dependent decisions. In general, when the performance measure of a single machine problem has an additive form, as in our subproblems, we can find an optimal solution through a dynamic programming approach [1]. In the first stage, DP starts with a small part of the whole problem and finds the optimal solution for that small part. ...
... Definition 1. A performance measure Z is regular if (a) the scheduling objective is to minimize Z, and (b) Z can increase only if at least one of the completion times in the schedule increases [1]. It can be shown that in an optimal solution of a scheduling problem with a regular measure, as in our problem, there is no machine idle time. ...
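A minimal sketch of the subset dynamic program implied above, for total tardiness (a regular, additive measure): since there is no idle time, every job in a scheduled subset S finishes by the total processing time of S, so only the last job of S matters. The bitmask state encoding is my own:

```python
def min_total_tardiness(jobs):
    """Subset DP on one machine: f(S) = min over last job j in S of
    f(S without j) + tardiness of j completing at sum of processing times in S.
    jobs: list of (processing_time, due_date)."""
    n = len(jobs)
    full = 1 << n
    # Precompute total processing time of each subset.
    psum = [0] * full
    for s in range(1, full):
        j = (s & -s).bit_length() - 1          # index of lowest set bit
        psum[s] = psum[s ^ (1 << j)] + jobs[j][0]
    f = [0] * full
    for s in range(1, full):
        best = float("inf")
        for j in range(n):
            if s & (1 << j):
                p, d = jobs[j]
                best = min(best, f[s ^ (1 << j)] + max(0, psum[s] - d))
        f[s] = best
    return f[full - 1]
```

The O(n·2^n) state space beats the n! of full enumeration, which is the usual selling point of the DP approach for small single-machine instances.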
... ities has been associated with CPM, using deterministic activity duration assumptions. By contrast, PERT took into account stochastic activity durations but with few exceptions crashing was not considered. One possible reason for this state of affairs may be that PERT was never a truly effective platform for assessing stochastic activity durations. Baker & Trietsch (2009) list several PERT deficiencies, of which the stochastic independence assumption is the most pernicious. To address this issue, Trietsch (2005) presents the systemic error model, which employs a multiplicative stochastic error that is constant for each project but varies across projects. This component of the model represents estimation ...
... roperty. Furthermore, although the exact distribution of a sum of lognormal variables is unknown, the lognormal distribution itself provides reasonable approximate convolutions (Robb & Silver, 1993). Those results open the way not only to effective stochastic scheduling but also to modeling and implementing optimal crashing of a stochastic project. Baker & Trietsch (2009) discuss this issue in generic terms, and the purpose of this paper is to explore it in more depth. We do so specifically in terms of a simple but important building block: a serial project. We study optimal or near-optimal crashing of stochastic activities in basic projects with n (not necessarily independent) activities in series, and ...
Article
Full-text available
Crashing stochastic activities implies changing their distributions to reduce the mean. This can involve changing the variance too. Therefore, crashing can change not only the expected duration of a project but also the necessary size of its safety buffer. We consider optimal crashing of serial projects where the objective is to minimize total costs including crashing cost and expected delay penalty. As part of the solution we determine optimal safety buffers. They allow for activities that are statistically dependent because they share an error element (e.g., when all durations have been estimated by one person, when weather or general economic conditions influence many activities, etc). We show that under plausible conditions the problem is convex and thus it can be solved by standard numerical search procedures. The purpose of the paper is to encourage software development that will include valid stochastic analysis for scheduling and crashing using current estimates and historical performance records.
... "Scheduling is the allocation of resources over time to perform a collection of tasks" [8]. Since the early works of Johnson [11] and Smith [12] in the mid-fifties, scheduling has remained at the interface between theory and practice [16]. It has given rise to a number of problem-solving techniques, including dominance rules, which have played an important role. ...
... Few definitions of dominance rules are to be found in the literature. Baker and Trietsch [16] define them in the following way: "Dominance properties provide conditions under which certain potential solutions can be ignored". ...
Article
Full-text available
The aim of this paper is to study the concept of a "dominance rule" in the context of combinatorial optimization. A dominance rule is established in order to reduce the solution space of a problem by adding new constraints to it, either in a procedure that aims to reduce the domains of variables, or directly in building interesting solutions. Dominance rules have been used extensively over the last 50 years. Surprisingly, to our knowledge, no detailed description of them can be found in the literature other than a few short formal descriptions in the context of enumerative methods. We therefore propose an investigation into what dominance rules are. We first provide a definition of a dominance rule with its different nuances. Next, we analyze how dominance rules are generally formulated and what the consequences of such formulations are. Finally, we enumerate the common characteristics of dominance rules encountered in the literature and in the usual process of solving combinatorial optimization problems.
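As a concrete, deliberately simplified example of a dominance rule in the sense defined above: for total tardiness on one machine, an Emmons-style sufficient condition (stated here in simplified form, as an assumption for illustration) says that a job which is both no longer and due no later than another may be fixed before it, so branches scheduling them in the opposite order can be ignored:

```python
def may_precede(job_i, job_j):
    """Simplified Emmons-style dominance check for total tardiness:
    if job i is no longer and due no later than job j, some optimal
    sequence puts i before j, so the reverse ordering can be pruned.
    jobs are (processing_time, due_date) pairs."""
    (pi, di), (pj, dj) = job_i, job_j
    return pi <= pj and di <= dj
```

In an enumerative method, such a check discards partial sequences without evaluating them, which is exactly the "reduce the solution space" role the paper describes.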
... This means that the order of aircraft after merging is not unique. In practice, such flight planning is not reliable for safe en-route traffic control, because it may cause aircraft to lose track (K.R. Baker and D. Trietsch, 2013). It is universally agreed that the sequencing solution should be reasonable and fair, e.g. using FCFS. ...
Article
Full-text available
This paper addresses a network-wide scheduling problem for generating, adjusting, and optimizing flight schedules in en-route airspace. The aim of scheduling is twofold: to support flight plans by providing flight schedules that are reliable under optimal flow restrictions, and to assist en-route traffic control by ensuring safe and ordered traffic trajectories. First, we design an algorithm to generate continuous-time flight schedules based on discrete-time flow assignments of aircraft. This algorithm is further enhanced with consideration of sequence optimization for merging traffic. A flight scheduling system has been developed in a simulation environment.
... Previously, quite a lot of research has used exact methods and metaheuristics to solve this problem. Some of the methods are a Branch and Bound algorithm [3], an O(n^2)-time approximation algorithm [4], a Tabu Search algorithm [5], etc. Nowadays, a large number of studies develop metaheuristic algorithms for solving NP-hard problems. The most popular metaheuristics are those based on biology (biology-based). ...
Article
Full-text available
Climate change, with the rise of greenhouse gas emissions due to the massive use of fossil fuels, could negatively affect the quality of the natural environment across the globe. Renewable energy should therefore be promoted as an alternative source of energy among electronic and electrical (E&E) firms in order to reduce dependency on fossil fuels and move towards energy efficiency. In Malaysia, the government has committed to optimising the use of green technology, which is capable of assisting the E&E industry to produce products with less impact on the environment and to improve the socio-economy of society. The E&E firms have limited resources to manage renewable energy (RE) internally and currently rely on supplier support to assist the implementation of green technology and energy management. Thus, this study aims to explore the challenges of renewable energy management implementation and how firms manage the low carbon issue in the supply chain. The outcome of this paper is expected to be the basis for the concept development of the renewable energy supply chain.
... A large number of simplified academic problems have been distilled over the years (see e.g. [5] and [1]) from the practice of car assembly scheduling, which was, and still is, as a rule covered by clouds of industrial secrecy, and by the obvious fact that the car industry is for profit, not for disseminating and refining knowledge. Hence the scarcity of academic publications dealing with real-world car assembly problems. ...
Article
Full-text available
A new approach to solving realistic car assembly scheduling problems for mixed model assembly line is presented. It is proposed to decompose the problem into two subproblems: 1) a sequencing problem that generates admissible car sequences fulfilling capacity constraints for all car models in the production plan, 2) a scheduling problem that determines an admissible car sequence with shortest makespan. The details of this approach are illustrated by a simple numerical example.
... Therefore, additional criteria such as dominance properties and a lower bound must be established to obtain an optimal solution efficiently. Baker and Trietsch [39] defined dominance properties for combinatorial optimization problems as standards for excluding redundant subsets suspected of not containing the optimal solution; dominance properties can thus distinguish, among all subsets of the solution space, only the valuable subset containing the optimal solution. Jouglet and Carlier [40] carried out a literature review of a variety of dominance properties, categorized them by characteristics, and organized them according to their utility in combinatorial optimization problems. ...
Article
Full-text available
We consider a two-agent single-machine scheduling problem that minimizes the total weighted tardiness of one agent under the restriction that the second agent is prohibited from having tardy jobs. The actual processing times of all jobs are affected by a sum-of-processing-times-based aging effect. After showing the NP-hardness of the problem, we design a branch-and-bound (B&B) algorithm to find an optimal solution by developing dominance properties and a lower bound for the total weighted tardiness to increase search efficiency. Because B&B takes a long time to find an optimal solution, we propose a genetic algorithm as an efficient, near optimal solution approach. Four methods for generating initial populations are considered, and edge recombination crossover is adopted as a genetic operator. Through numerical experiments, we verify the outstanding performance of the proposed genetic algorithm.
... The arrival patterns form the basis for testing the performance of RegSPT against popular sequencing methods. Three popular sequencing approaches from the literature were chosen for the analysis: SPT, MDD, and EDD. Baker and Trietsch (2009) find that SPT performs quite robustly for a variety of objective functions (see Table 1) and, in particular, in the presence of imperfect information, which is an advantage in the ETO environment. In addition, it is easy for practitioners to implement and understand. ...
Article
Full-text available
Engineer-to-order (ETO) firms produce complex, one-of-a-kind products and desire shorter lead times as a key component of cost competitiveness. In ETO firms, the engineering process is the largest controllable consumer of lead time. Given that lead time is a function of completion rate and scheduling policy, one critical process is to accurately sequence jobs in front of the engineering function. However, unlike other manufacturing models, such as make-to-stock or make-to-order, the design of an ETO product is not realized until after the engineering process has been completed. Hence, the only information available does not include data normally required by most sequencing algorithms. Therefore, the problem becomes the determination of an accurate schedule, within a complex transactional process, for jobs that have not even been designed yet. This paper investigates this topic in the context of the engineering process within the ETO model. Based on research conducted in conjunction with multiple firms, common factors are identified which drive complexity, and a new framework and algorithm are presented for using these factors to sequence jobs. Using discrete event simulation, the performance of this new algorithm is found to be a significant improvement over current industry and published methods.
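The SPT rule discussed in the excerpt above is a one-line sort; the sketch below also evaluates mean flow time, which SPT minimizes on a single machine. The (processing_time, job_id) representation is assumed, not from the paper:

```python
def spt_sequence(jobs):
    """Shortest Processing Time first: sort jobs by processing time."""
    return sorted(jobs, key=lambda job: job[0])

def mean_flow_time(jobs):
    """Mean completion time of the given sequence on one machine."""
    t, total = 0, 0
    for p, _ in jobs:
        t += p
        total += t
    return total / len(jobs)
```

Any other permutation of the same jobs yields a mean flow time at least as large, which is the classical optimality property of SPT.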
... The classical theory of scheduling assumes that the processing time of a job is not affected by its position in a schedule (e.g. Morton & Pentico, 1993; Pinedo, 2002; Baker & Trietsch, 2009). However, there are situations where the processing time of a job scheduled at a position can be faster or slower than that at the previous position due to learning and forgetting effects (Wang & Cheng, 2007; Cheng et al., 2010; Lai & Lee, 2013). ...
Article
Full-text available
This research discusses an integer batch scheduling problem for a single machine with position-dependent batch processing times due to the simultaneous effects of learning and forgetting. The decision variables are the number of batches, the batch sizes, and the sequence of the resulting batches. The objective is to minimize total actual flow time, defined as the total interval time between the arrival times of parts in all respective batches and their common due date. Two algorithms are proposed to solve the problem. The first is developed using the Integer Composition method, and it produces an optimal solution. Since the first algorithm solves the problem with a worst-case time complexity of O(n·2^(n-1)), this research proposes a second algorithm: a heuristic based on the Lagrange Relaxation method. Numerical experiments show that the heuristic algorithm gives outstanding results.
... The due-date-oriented rules – whether Earliest Due Date (EDD), Least Slack Time (LST), Critical Ratio (CR) or Operation Due Date (ODD) – all attempt to minimize the variance of lateness in different but similar manners (Baker and Trietsch 2009). The due date plays the dominating role by setting a target that the lots are forced to catch up with. ...
Conference Paper
Full-text available
Although work-in-process (WIP) balance approaches can achieve average cycle time reduction, due to the characteristics of wafer fabrication facilities (wafer fabs), e.g., re-entrant flow, setup time and batch processing, the lack of an effective mechanism to ensure lot movement at the right pace results in degraded cycle time variance, which might be a potential problem when due dates are concerned. This paper attempts to solve this problem. Firstly, four cycle time variance minimization rules which utilize the waiting time, cycle time and due date information of lots are investigated. Then they are incorporated into two WIP balance approaches from the literature to determine whether they can overcome the drawback arising from WIP balance. In the end, the benefit of cycle time variance minimization is illustrated by one example demonstrating an improved ability to meet due dates reliably.
... These methods require that assumptions be made to simplify the problem and obtain optimal solutions. However, it is not always possible to develop effective assumptions, so simulation becomes an effective method for analysing problems and evaluating different plausible outcomes for practical purposes [6-8]. Job shop scheduling problems are described as NP-hard problems. ...
Article
Full-text available
This paper presents the development of a priority-rule-based production scheduling module for the Faborg-Sim simulation tool with ten priority rules. Faborg-Sim consists of three modules, i.e., modelling, simulation, and performance evaluation. In this study, a detailed conceptual framework was defined, and a case study of a machine-parts manufacturing system was modelled and evaluated using Faborg-Sim. To integrate the scheduling module into Faborg-Sim, the simulations were run using six selected priority rules on customer-order information. Simulation models were run separately for each priority rule to obtain the best-performing production schedule. After repeating the simulations, performance measurement parameters were obtained and evaluated on a relative basis.
... Scheduling to meet job due dates has received increasing attention from managers and researchers since the Just-In-Time concept was introduced in manufacturing facilities. While meeting due dates is only a qualitative performance measure, it usually implies that time-dependent penalties are assessed on late jobs but that no benefits are derived from completing jobs early [2]. In this case, the quantitative scheduling objectives associated with tardiness are naturally highlighted in different manufacturing environments. ...
Article
This paper addresses the scheduling problem of minimizing the total weighted tardiness on a single machine with step-deteriorating jobs. Under the deterioration assumption, the job processing times are modeled by step functions of the job starting times and pre-specified job deteriorating dates. The introduction of step-deteriorating jobs makes the single machine total weighted tardiness problem even more intractable. The computational complexity of the problem under consideration had not previously been determined. In this study, it is first proved to be strongly NP-hard. Then a mixed integer programming model is derived for solving problem instances optimally. In order to tackle large-sized problems, seven dispatching heuristic procedures are developed for near-optimal solutions. Meanwhile, the solutions delivered by the proposed heuristics are further improved by a pairwise swap movement. Computational results are presented to reveal the performance of all proposed approaches.
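The step-deterioration model and the pairwise-swap improvement described in this abstract can be illustrated roughly as follows. The field names (p = base time, pen = deterioration penalty, h = deteriorating date, d = due date, w = weight), the adjacent-swap neighbourhood, and the sample data are assumptions for the sketch, not the paper's exact procedures.

```python
# Illustrative sketch: total weighted tardiness with step-deteriorating
# processing times, improved by adjacent pairwise swaps.

def twt(seq):
    """Total weighted tardiness; a job starting at or after its
    deteriorating date h incurs the extra processing time pen."""
    t, cost = 0, 0
    for j in seq:
        p = j["p"] + (j["pen"] if t >= j["h"] else 0)
        t += p
        cost += j["w"] * max(0, t - j["d"])
    return cost

def pairwise_swap_improve(seq):
    """Repeatedly swap adjacent pairs while the objective improves."""
    seq = list(seq)
    improved = True
    while improved:
        improved = False
        for i in range(len(seq) - 1):
            cand = seq[:i] + [seq[i + 1], seq[i]] + seq[i + 2:]
            if twt(cand) < twt(seq):
                seq, improved = cand, True
    return seq
```

Adjacent swaps are only one neighbourhood; the paper's pairwise swap movement may consider non-adjacent pairs as well.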
... We deal with complex optimization problems with many variables and constraints. Scheduling is one of the important issues of the manufacturing process and has a major impact on its productivity (Baker and Trietsch, 2009). However, schedules are usually of limited duration, and scheduling is a continuous process of responding to unexpected events, for example, machine breakdowns, illness of employees, missing material, etc. ...
Article
Full-text available
In the paper, we investigate deterministic approaches for solving scheduling and rescheduling problems. Most of them are based on mixed-integer programming. Descriptions of solvers are also presented here. They allow solving complicated optimization problems with many variables and constraints in acceptable time. The deterministic approaches could be implemented in a production effectiveness navigator under development, one of the aims of which is to react to unexpected events occurring on shop floors. Thus, rescheduling is necessary, and optimal decisions must be made by invoking state-of-the-art optimization methods. Moreover, the management focus should be taken into account when not only the initial schedule but also a new schedule is created.
... Scheduling is an important tool in optimizing manufacturing and service systems and networks. There exist various models and solutions according to the characteristics of individual problems (Baker & Trietsch, 2009; Pinedo, 2008). Innovations of processes by laser and photonic technologies have a significant potential to enhance the performance of systems and networks. ...
Article
The purpose of this article is to review the key emerging innovations in laser and photonics systems as well as their design and integration, focusing on challenges and opportunities for solutions of societal challenges. Developments, their significance, and frontier challenges are explained in advanced manufacturing, biomedicine and healthcare, and communication. Systems, networks, and integration issues and challenges are then discussed, and an integration framework for networking laser‐ and photonic‐based services and products is proposed. The article concludes with implications and an agenda for education, research and development, and policy needs, with a focus on human, society, science, and technology integration. © 2013 Wiley Periodicals, Inc.
... Now, we will provide some useful expressions. Namely, following [1] and [14], the completion time of a job scheduled in the ith position in π on machine M2 equals ...
Article
In this paper, we focus on real-life settings that require the development of new models of flowshop scheduling problems, where job processing times can increase with the number of processed jobs due to the aging effect and decrease by the allocation of additional resource. We analyse the makespan minimization flowshop problem with such model and also with the aging effect only. We prove that the considered problems and their special cases are still polynomially solvable under given conditions, and on their basis, we provide optimal polynomial time solution algorithms.
... However, since most of the above decisions imply the utilization of certain manufacturing resources assumed available during process plan implementation on the shop floor, their quality can only be judged through scheduling. Scheduling is another key function of manufacturing companies that uses fresh data on the availability status of manufacturing resources on the shop floor to allocate idle resources to parts-processing jobs so that the due dates of all jobs are met at the lowest cost [2, 9, 11]. The result is a short-term parts production schedule that states the time at which each manufacturing operation starts and finishes on its assigned machine on the shop floor. ...
Article
Full-text available
This paper presents a manufacturing case study that advocates the use of a just-in-time operations planning and scheduling strategy for make-to-order machining jobs of parts. The study concerns two simple parts that are to be produced through machining operations according to certain specifications of a material requirements plan (MRP). The attributes of the manufacturing steps comprising the two machining jobs are considered available from a previously performed preplanning stage. Three different scenarios on the availability status of machine items on the shop floor are taken into account: 1) numerous machines are continuously available for production during the scheduling horizon; 2) few machines are continuously available for production during the scheduling horizon; 3) few machines are intermittently available for production during the scheduling horizon. Machine availability data for the three scenarios of the case study are alternatively obtained from the shop floor controller. An integrated computer-aided process planning (CAPP) and computer-aided scheduling (CAS) software is employed to develop a cost-effective operations plan and operations schedule for manufacturing the parts under each machine availability scenario. It is observed that planning and scheduling solutions optimized for scenarios with higher availability of machines are infeasible for scenarios with lower availability of machines. However, these solutions are less costly than the solutions obtained for scenarios with lower availability of machines. The conclusion is that just-in-time planning and scheduling is a convenient way to increase the quality of operations plans and operations schedules for make-to-order manufacturing.
... Moreover, since the RAPID algorithm does not support service differentiation, we further modify it (referred to as modified RAPID 2) by multiplying the reward of each service by its utility function. Note that the modified RAPID 2 algorithm, which schedules the available services one by one, can be considered a non-preemptive [12] version of our proposed Smith-ratio-based resource allocation algorithm. The total reward of delivered services versus λ is shown in Fig. ...
Conference Paper
Full-text available
In this paper, we investigate the on-demand data delivery to high-speed trains via trackside infostations. The optimal resource allocation problem is formulated by considering the trajectory of a train, quality of service (QoS) requirements, and network resources. The original problem is transformed into a single-machine preemptive scheduling problem based on a time-capacity mapping. As the service demands are not known a priori, an online resource allocation algorithm is proposed based on the Smith ratio and exponential capacity. The performance of the proposed algorithm is evaluated based on a real high-speed train schedule. Compared with the existing approaches, our proposed algorithm can achieve the best performance in terms of the total reward of delivered services over the trip of a train.
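The Smith ratio mentioned above is the classical weighted-shortest-processing-time rule: on a single machine, sequencing jobs in nonincreasing w/p order minimizes total weighted completion time. A minimal sketch, with illustrative job data:

```python
# Smith's ratio rule (WSPT) on a single machine; data are illustrative.

def smith_order(jobs):
    """Sort by weight-to-processing-time ratio, largest first."""
    return sorted(jobs, key=lambda j: j["w"] / j["p"], reverse=True)

def total_weighted_completion(seq):
    """Sum of w_j * C_j along the sequence."""
    t, cost = 0, 0
    for j in seq:
        t += j["p"]
        cost += j["w"] * t
    return cost

jobs = [{"p": 3, "w": 1}, {"p": 1, "w": 4}, {"p": 2, "w": 2}]
```

The cited paper applies the ratio in an online, capacity-mapped setting; this sketch shows only the underlying static rule.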
... in the production scheduling problem (Baker and Trietsch 2009). In fact, this does not come as a surprise, as there are two significant differences between these two problems. ...
Article
Full-text available
In this article, we present a measure for resilience in the context of enterprise information systems or service systems in a more general sense. Resilience of the system is a property of the system, which focuses on the recovery ability of the system after a partial damage of the system. Enterprise information systems such as enterprise resource planning, supply chain management, customer relationship management, manufacturing execution system, etc. play a critical role in the daily operation of modern enterprises by timely and sustainable delivery of information. Therefore, resilience is especially important for the enterprise information systems. The proposed measure for resilience is based on the recovery ability of the system, which departs from the existing approaches in literature and presents a unique contribution. An example is given to illustrate how the proposed measure works.
... In Figure 3, we show a graphical illustration of the different policies in what we call a stochastic Gantt chart. This concept was first introduced by Baker and Trietsch (2009), who refer to it as a predictive Gantt chart. Like in a regular Gantt chart, the execution intervals of the activities are indicated by a horizontal bar per activity. ...
Article
The resource-constrained project scheduling problem involves the determination of a schedule of the project activities, satisfying the precedence relations and resource constraints while minimizing the project duration. In practice, activity durations may be subject to variability, such that a stochastic approach to the problem is more appropriate. We propose a methodology for the determination of a project execution policy and a vector of predictive activity starting times with the objective of minimizing a cost function that consists of the weighted expected activity starting time deviations and the penalties or bonuses associated with late or early project completion. In a computational experiment, we show that our procedure greatly outperforms existing algorithms described in the literature.
Article
The current study investigated the developmental trajectory of listeners' ability to perceive regional dialect variation in American English using a free classification paradigm. Listeners ranged in age from 4 years old through late adulthood. In two experiments, listeners sorted a set of talkers into groups based on where the talkers were from. Talkers from four regional dialects of American English (Midland, New England, Northern, Southern) were presented. The results showed that some of even the youngest listeners (4- to 5-year-olds) were able to separate New England talkers from talkers from other regions, providing evidence that they perceived dialect variation and exhibited some level of sociolinguistic competence. Moreover, children's dialect perception abilities showed incremental improvement through childhood and adolescence, with major developmental improvements in dialect classification accuracy observed in middle childhood (between the ages of 7 and 8 years), late childhood (between 11 and 12 years), and in the teenage years (between 15 and 16 years), when adult-like abilities were reached. The overall results have implications for the experimental methods suitable to gauge young children's sociolinguistic knowledge and for our understanding of the time course of children's development of sociolinguistic competence, particularly their acquisition of the social significance of linguistic variation.
Chapter
Full-text available
The present research develops a new measure of projects' complexity: it analyzes the structural sophistication of PERT networks through Information Theory and utilizes Shannon's first-order joint binary entropy. Higher entropy means that a project is more complex. A complex project is a project which is neither sure nor impossible. Unsurprisingly, project entropy (or structural complexity) increases with time (i.e., longest path or number of arrows that follow each other), together with the number of connections among the tasks. The main discovery of this research is that later-occurring risks increase project entropy significantly more than equivalent risks occurring earlier.
Article
Full-text available
The scheduling problem in a job shop production environment is characterized by containing n jobs to be processed by m machines, where each job is represented by a specific sequence of operations with an established precedence order. The aim of this work is to perform a comparative analysis of the mathematical formulations for this environment by minimizing the makespan, i.e., the total time to complete all n jobs. Popular models and a proposed model are compared and evaluated through computational tests using instances from the literature. In addition, resolution strategies are proposed. Computational experiments using a well-known commercial software package indicate that the proposed strategies can promote a reduction of the optimality gap.
Chapter
The intended audience is scheduling practitioners and theoreticians as well as beginners in the field of scheduling. The purpose of this paper is to review the area of parallel machine scheduling (PMS) with respect to issues of complexity. A critical review of the methods employed and applications developed in this relatively new area is presented, and notable successes are highlighted. The PMS algorithms are discussed. We give up-to-date information on polynomially solvable problem types based on non-preemptive criteria. It is shown that the parallel machine makespan-minimization problem is NP-hard even for two machines. Moreover, the two-machine problem can be solved by a pseudopolynomial algorithm.
Chapter
Full-text available
In the previous part of the book, we have presented the concept of a scheduling model as a way to formalise the decision-making scheduling problem. This part of the book is devoted to present the methods to provide (good or even optimal) solutions for these scheduling models. In this chapter, we give an overview of scheduling methods, leaving for the next chapters the detailed discussion of specialised methods.
Chapter
The framework for manufacturing scheduling envisioned in Chap. 1 presented three main blocks—models, methods and tools—that, together with the humans involved in the process, compose a scheduling system. This part of the book deals with scheduling models, and in Chaps. 3 and 4 we have reviewed the main elements of these models concerning system layouts and processing constraints, respectively.
Chapter
As has been discussed in a previous chapter (see Sect. 1.5), scheduling refers to a decision-making process in which, in order to solve a real-world problem, a formal model is obtained.
Article
Full-text available
A novel approach for waste load allocation, namely waste load scheduling, following the concept of machine scheduling, has been developed and applied to a case study. The conceptual development and mathematical formulation for the waste load scheduling model have been discussed in the companion paper. The focus of this paper is to identify a suitable solution technique for solving the waste load scheduling model and then apply it to a case study river system in Tamil Nadu, India. The mathematical formulation for the waste load scheduling model was developed by treating it analogously to a machine scheduling problem (as explained in the companion paper). Dynamic programming (DP) has been used by several researchers for solving such machine scheduling and sequencing problems, and thus the same method is adopted here for solving the proposed waste load scheduling problem. To demonstrate the practical application of the proposed method, the model was applied to a real case study river system, namely the Thambraparani river system, which is located in the southern part of Tamil Nadu state in India. The results from the waste load scheduling model proved that the method is very effective in reducing the total waste load discharged into the river and thereby achieving a high level of river water quality. Comparing the results of the waste load scheduling (WLS) model with the SEDP (Seasonal Effluent Discharge Program) and EBP (Effluent By-Pass Piping) models shows that the proposed WLS model satisfies all three water quality management goals (i.e., cost minimization, minimizing overall inequity, and minimizing total waste load discharged into the river) to a greater degree than the SEDP and EBP models.
Conference Paper
Full-text available
Current technological and market challenges increase the need for the development of intelligent systems to support decision making, allowing managers to concentrate on high-level tasks while improving decision response and effectiveness. A Racing-based learning module is proposed to increase the effectiveness and efficiency of a Multi-Agent System used to model the decision-making process in scheduling problems. A computational study is put forward, showing that the proposed Racing learning module is an important enhancement to the developed Multi-Agent Scheduling System, since it can provide more effective and efficient recommendations in most cases.
Chapter
For manufacturers operating batch plants, production scheduling is a critical and challenging problem. A thorough understanding of the problem and the variety of solutions approaches is needed to achieve a successful application. This entry will present a brief overview of batch operations and the state of the art of batch plant scheduling for nonexperts in the field.
Article
Full-text available
In this paper, the flow-shop sequence-dependent group scheduling (FSDGS) problem is addressed with reference to the makespan minimization objective. In order to effectively cope with the issue at hand, a hybrid metaheuristic procedure integrating features from genetic algorithms and random sampling search methods has been developed. The proposed technique makes use of a matrix encoding able to simultaneously manage the sequence of jobs within each group and the sequence of groups to be processed along the flow-shop manufacturing system. A well-known problem benchmark arising from the literature, composed of two-, three- and six-machine instances, has been taken as a reference both for tuning the relevant parameters of the proposed procedure and for assessing its performance against the two most recent algorithms in the body of literature addressing the FSDGS issue. The obtained results, also supported by a properly developed ANOVA analysis, demonstrate the superiority of the proposed hybrid metaheuristic in tackling the FSDGS problem under investigation.
Article
Full-text available
A genetic algorithm for the identical parallel machine scheduling problem with sequence-dependent setup times and makespan (Cmax) minimization is presented. The genetic algorithm is compared with other heuristic methods using a randomly generated set of test problems. A local improvement procedure in the evolutionary process of the genetic algorithm is introduced, which significantly improves its performance.
Article
Single machine scheduling is of essential importance due to its practical applications and the fact that it reduces bottlenecks in production lines and provides results that are useful for more complex scheduling environments. In this article, the single machine scheduling problem of minimizing the number of tardy jobs and maximum earliness with no slack time is examined. The reason for applying these two criteria simultaneously is the tendency to use the JIT system, which finds earlier or later production undesirable. In this study, a genetic algorithm has been proposed to solve the problem. A set of problem instances has been investigated and the results are compared with those of a genetic algorithm used in the research literature. Numerical experiments show the better performance of the proposed algorithm.
Article
Full-text available
We study the resource-constrained project scheduling problem with stochastic activity durations. We introduce a new class of scheduling policies for solving this problem, which make a number of a priori sequencing decisions in a pre-processing phase while the remaining decisions are made dynamically during project execution. The pre-processing decisions entail the addition of extra precedence constraints to the scheduling instance, thereby resolving some potential resource conflicts. We obtain new competitive results for expected-makespan minimization on representative datasets, which are significantly better than those obtained by the existing algorithms when the variability in the activity durations is medium to high.
Article
Consider a set of tasks that are partially ordered by precedence constraints. A subset of tasks is called feasible if, for every task in the subset, all predecessors are also in the subset. The major results are (1) a method for enumerating all feasible subsets and (2) a method for assigning to each feasible subset an easily computed label that can be used as a physical address for storing information about the subset. These two results permit a very compact computer implementation of a dynamic programming algorithm for solving one-machine sequencing problems with precedence constraints. This algorithm appears to be much more efficient than previous ones for certain one-machine sequencing problems.
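A modern analogue of the subset-labeling idea in this abstract uses an integer bitmask as the label (and array index) of each subset. The sketch below is an illustrative dynamic program over subsets for one-machine sequencing with precedence constraints and a total-tardiness objective; the data layout and objective are assumptions, not the paper's exact scheme.

```python
# DP over subsets, labeled by bitmasks: f[S] is the minimum total
# tardiness over precedence-feasible schedules of exactly the jobs in S.

def min_total_tardiness(p, d, prec):
    """p, d: lists of processing times / due dates; prec: list of (a, b)
    pairs meaning job a must precede job b. Returns the optimal value."""
    n = len(p)
    INF = float("inf")
    # total[S] = sum of processing times of the jobs in S
    total = [0] * (1 << n)
    for S in range(1, 1 << n):
        total[S] = total[S & (S - 1)] + p[(S & -S).bit_length() - 1]
    f = [INF] * (1 << n)
    f[0] = 0
    for S in range(1, 1 << n):
        for j in range(n):
            if not S >> j & 1:
                continue
            # j may be scheduled last in S only if none of its
            # successors is still inside S.
            if any(a == j and S >> b & 1 for a, b in prec):
                continue
            prev = f[S ^ (1 << j)]
            if prev < INF:
                f[S] = min(f[S], prev + max(0, total[S] - d[j]))
    return f[(1 << n) - 1]
```

With all 2^n labels stored this is practical only for small n; the cited paper's contribution is precisely a compact labeling that enumerates and addresses only the feasible subsets.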
Article
This note discusses sufficient conditions for a function of permutations (j_1, …, j_n) of the integers 1, 2, …, n to have a minimum at the permutation (j*_1, …, j*_n) and derives a solution algorithm that is equivalent to the known solution algorithm for special cases involving sequential search and testing.
Article
This paper first considers the problem of sequencing n jobs on one machine to minimize total tardiness. It proves theorems that establish the relative order in which pairs of jobs are processed in an optimal schedule; frequently they permit the jobs to be completely ordered, thus solving the problem without any searching. In particular, corollaries establish more general conditions than are currently recognized under which sequencing in order of nondecreasing processing times and sequencing in order of nondecreasing due dates are optimal. In general, even large problems may be at least partially ordered to the point that very few schedules remain to be searched. These results are then partly extended to the more general criterion of minimizing a sum of identical, convex, nondecreasing functions of job tardiness, and an efficient algorithm is proposed.
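A quick numerical illustration of the corollary mentioned above: when processing times and due dates are "agreeable" (ordered the same way), sequencing by nondecreasing processing time (here identical to nondecreasing due date) attains the brute-force optimum for total tardiness. The instance below is illustrative.

```python
# Compare the SPT/EDD order against the brute-force optimum on a tiny
# agreeable instance; data are illustrative.
from itertools import permutations

def tardiness(p, d, order):
    """Total tardiness of the given order."""
    t, tard = 0, 0
    for j in order:
        t += p[j]
        tard += max(0, t - d[j])
    return tard

p, d = [1, 2, 3], [2, 3, 4]                   # agreeable: both nondecreasing
spt = sorted(range(3), key=lambda j: p[j])    # same as the EDD order here
best = min(tardiness(p, d, list(o)) for o in permutations(range(3)))
```

Brute force is only a sanity check here; the point of the cited theorems is to reach such conclusions without any search.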
Article
The branch-and-bound procedure is formulated in rather general terms and necessary conditions for the branching and bounding functions are precisely specified. Results include the standard properties for finite procedures, plus several convergence conditions for infinite procedures. Discrete programming which includes integer programming and combinatorial optimization problems, is discussed and Fibonacci search is presented as an example of a nonfinite branch-and-bound procedure employing an optimal convergence rule.
Article
A new algorithm for a general cost function is presented. The algorithm is tested for the case of a weighted tardiness criterion.
Article
Suppose n jobs are to be processed by a single machine. Associated with each job j are a fixed integer processing time p_j, a due date d_j, and a positive weight w_j. The weighted tardiness of job j in a given sequence is w_j · max(0, C_j − d_j), where C_j is the completion time of job j. Assume that the weighting of jobs is "agreeable", in the sense that p_i < p_j implies w_i ≥ w_j. Under these conditions, it is shown that a sequence minimizing total weighted tardiness can be found by a dynamic programming algorithm with worst-case running time of O(n^4 P) or O(n^5 p_max), where P = Σ p_j and p_max = max {p_j}. The algorithm is "pseudopolynomial", since a truly polynomial-bounded algorithm should be polynomial in Σ log2 p_j.
Article
The paper deals with the single-machine total tardiness problem. It investigates the authors' most recent branch and bound algorithm and discovers the following paradoxes: deleting a lower bound drastically improves the performance of the algorithm, while adding a stronger component, such as a better decomposition rule, negatively affects its performance. Guided by these paradoxes, it develops a very fast branch and bound algorithm that handles instances with up to 500 jobs. It also shows that the powerful recent result of Chang et al. (Operations Research Letters 1995; 17:221–229) can be further improved. Copyright © 2001 John Wiley & Sons, Ltd.
Article
Suppose n jobs are each to be processed by a single machine, subject to arbitrary given precedence constraints. Associated with each job j is a known processing time a_j and a monotone nondecreasing cost function c_j(t), giving the cost that is incurred by the completion of that job at time t. The problem is to find a sequence which will minimize the maximum of the incurred costs. An efficient computational procedure is given for this problem, generalizing and simplifying previous results of the present author and J. M. Moore.
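The procedure described in this abstract is often presented as Lawler's rule: build the sequence from the back, each time choosing, among the jobs whose successors have all been placed, a job whose cost at the current total remaining processing time is smallest. A compact sketch, with illustrative inputs:

```python
# Lawler's rule for minimizing the maximum incurred cost on one machine
# with precedence constraints; inputs are illustrative.

def lawler_min_max_cost(p, cost, prec):
    """p: processing times; cost: list of functions c_j(t); prec: (a, b)
    pairs meaning a must precede b. Returns (sequence, max cost)."""
    n = len(p)
    unscheduled = set(range(n))
    T = sum(p)                       # completion time of the last job
    seq_rev = []
    while unscheduled:
        # eligible to go last: jobs with no unscheduled successor
        eligible = [j for j in unscheduled
                    if not any(a == j and b in unscheduled for a, b in prec)]
        j = min(eligible, key=lambda k: cost[k](T))
        seq_rev.append(j)
        unscheduled.remove(j)
        T -= p[j]
    seq = seq_rev[::-1]
    # evaluate the maximum cost of the resulting sequence
    t, worst = 0, float("-inf")
    for j in seq:
        t += p[j]
        worst = max(worst, cost[j](t))
    return seq, worst
```

With c_j(t) = t − d_j this specializes to minimizing maximum lateness, where the rule reproduces the EDD sequence in the absence of precedence constraints.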