Article · PDF Available

Hybrid improved cuckoo search algorithm and genetic algorithm for solving stochastic inventory model with Markov-modulated demand

Authors:
  • Jamali et al., Bhangar Mahavidyalaya: http://en.wikipedia.org/wiki/Bhangar_Mahavidyalaya

Abstract and Figures

One of the fundamental problems in supply chain management is designing effective inventory control policies for models with stochastic demand, because efficient inventory management can both maintain a high customer service level and reduce unnecessary over- and under-stock expenses, which are significant factors in the profit or loss of an organization. In this study, a new formulation of an inventory system is analyzed under discrete Markov-modulated demand. We employ simulation-based optimization that combines simulated annealing, pattern search, and ranking and selection (SAPS&RS) methods to approximate near-optimal solutions of this problem. After determining the values of demand, we employ a novel approach to minimize the total cost of the SCM (supply chain management) network. In our proposed approach, a hybrid of an improved cuckoo search algorithm (ICS) and a genetic algorithm (GA) serves as the main platform for solving this problem. The computational results demonstrate the effectiveness and applicability of the proposed approach.
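The mechanics of Markov-modulated demand can be illustrated with a small simulation. The sketch below is not the authors' SAPS&RS procedure: the two-state transition matrix, the uniform per-state demand, and the base-stock policy are all illustrative assumptions, used only to show how a modulating Markov chain drives the demand process an inventory policy must absorb.

```python
import random

# Illustrative two-state modulating chain: state 0 = low demand, state 1 = high.
TRANSITION = {0: [0.8, 0.2], 1: [0.3, 0.7]}   # assumed transition probabilities
DEMAND_MEAN = {0: 4, 1: 10}                   # assumed per-state mean demand

def simulate_cost(base_stock, periods=1000, h=1.0, p=5.0, seed=1):
    """Average holding/shortage cost per period of a base-stock policy."""
    rng = random.Random(seed)
    state, cost = 0, 0.0
    for _ in range(periods):
        # advance the modulating Markov chain, then draw demand for that state
        state = 0 if rng.random() < TRANSITION[state][0] else 1
        demand = rng.randint(0, 2 * DEMAND_MEAN[state])  # uniform, mean DEMAND_MEAN
        leftover = base_stock - demand
        cost += h * leftover if leftover > 0 else -p * leftover
    return cost / periods

# crude search over candidate base-stock levels
best_s = min(range(25), key=simulate_cost)
```

Because every candidate base-stock level is evaluated on the same seeded demand path (common random numbers), the simulated cost is convex in the base-stock level and the crude search behaves well.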
... Experimentally, MOCS (Multi-Objective Cuckoo Search) has shown the most efficient Pareto solutions in terms of computational time, mean ideal distance, and spacing, in comparison with the Multi-Objective Imperialist Competitive Algorithm (MOICA) and MOPSO (Multi-Objective Particle Swarm Optimization). Jamali et al. [70] used an inventory priority model for a supply chain network with demand uncertainty and a cost-minimization approach. An improved hybrid cuckoo search algorithm combined with a genetic algorithm was implemented to solve that model. ...
Article
Full-text available
Combinatorial optimization problems are often considered NP-hard problems in the field of decision science and the industrial revolution. As a successful means of tackling complex, high-dimensional problems, metaheuristic algorithms have been implemented in a wide range of combinatorial optimization problems. Metaheuristic algorithms have evolved and been modified with respect to the problem nature since they were first proposed. As there is growing interest in incorporating the methods needed to develop metaheuristics, there is a need to rediscover the recent advancement of metaheuristics in combinatorial optimization. From the authors' point of view, there is still a lack of comprehensive surveys on current research directions. Therefore, a substantial part of this paper is devoted to analyzing and discussing modern metaheuristic algorithms that have gained popular use in the most-cited combinatorial optimization problems, such as vehicle routing problems, traveling salesman problems, and supply chain network design problems. A survey of seven different metaheuristic algorithms (proposed after 2000) for combinatorial optimization problems is carried out in this study, apart from conventional metaheuristics like simulated annealing, particle swarm optimization, and tabu search. These metaheuristics have been filtered by key factors such as easy parameter handling, scope for hybridization, and performance efficiency. A concise description of the framework of each selected algorithm is included. Finally, a technical analysis of recent implementation trends is discussed, along with the impact of algorithm modification on performance, constraint-handling strategies, the handling of multi-objective situations using hybridization, and future research opportunities.
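Of the conventional metaheuristics the survey sets aside, simulated annealing is the easiest to show end to end. The toy TSP below is a hedged sketch: the six city coordinates, the 2-opt neighborhood, and the geometric cooling schedule are illustrative choices, not taken from any surveyed paper.

```python
import math, random

# Illustrative city coordinates for a toy traveling salesman instance.
CITIES = [(0, 0), (0, 3), (4, 3), (4, 0), (2, 5), (2, -2)]

def tour_length(tour):
    """Total length of a closed tour over CITIES."""
    return sum(math.dist(CITIES[tour[i]], CITIES[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def anneal(seed=0, t0=10.0, cooling=0.995, iters=5000):
    rng = random.Random(seed)
    tour = list(range(len(CITIES)))
    best, best_len, t = tour[:], tour_length(tour), t0
    for _ in range(iters):
        # 2-opt style move: reverse a random segment of the tour
        i, j = sorted(rng.sample(range(len(CITIES)), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
        delta = tour_length(cand) - tour_length(tour)
        # Metropolis acceptance: always take improvements, sometimes worsenings
        if delta < 0 or rng.random() < math.exp(-delta / t):
            tour = cand
            if tour_length(tour) < best_len:
                best, best_len = tour[:], tour_length(tour)
        t *= cooling   # geometric cooling schedule
    return best, best_len

tour, length = anneal()
```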
... Gholami et al. (2018) "ABC analysis of clients using axiomatic design and incomplete estimated meaning". Jamali et al. (2018) "Hybrid improved cuckoo search algorithm and genetic algorithm to solve Markov-modulated demand". ...
Chapter
As the emission of carbon dioxide has caused many issues in the global environment, controlling carbon emissions has become a high priority for governments. One of the sectors involved in carbon emission is inventory management: many activities in inventory systems, such as purchasing, warehousing, and transporting items, lead to emitting carbon. Therefore, governments have introduced policies to mitigate the emissions in inventory systems and develop sustainable supply chains. Despite the importance of this issue, no attempts have been made to study and address the vital role of different policies in controlling carbon emissions in a review study. This paper provides a systematic literature review to analyze the impact of carbon emission policies on inventory systems. 75 papers have been extracted from the most relevant academic and research databases, and the results have been analyzed and synthesized. By classifying and introducing the different carbon policies applicable in inventory systems, this paper identifies the policies that aim to restrict emissions. Finally, theoretical and managerial insights and extensive opportunities for future research are outlined.
Chapter
An inventory setup for deteriorating items with a two-level storage system and time-dependent demand with partially backlogged shortages is developed in this research. Stock is transferred from the rented warehouse (RW) to the owned warehouse (OW) in a bulk-release pattern, and the cost of transportation is considered negligible. The deterioration rates in the two warehouses are constant but different, owing to the different preservation strategies. Up to a particular time the holding cost is taken as constant, after which it increases. Particle swarm optimization with a varying population size is used to solve the setup. In the given PSO, a fraction of the better offspring is incorporated into the parent population for the next generation, the size of this offspring subset being proportional to the size of the parent set. A numerical example is presented to validate the model, and a sensitivity analysis is performed separately for every parameter.
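A hedged sketch of the underlying PSO machinery may help. The code below minimizes a toy convex cost rather than the chapter's inventory cost, and it only mimics the elitist idea, re-seeding the worst fraction of particles near the global best each generation; the chapter's exact offspring-retention scheme is not reproduced.

```python
import random

def cost(x):
    """Toy convex objective standing in for an inventory cost; minimum at (3, -1)."""
    return (x[0] - 3) ** 2 + (x[1] + 1) ** 2

def pso(n=20, iters=100, w=0.7, c1=1.5, c2=1.5, elite_frac=0.2, seed=0):
    rng = random.Random(seed)
    pos = [[rng.uniform(-10, 10), rng.uniform(-10, 10)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pos]                   # personal bests
    gbest = min(pbest, key=cost)[:]               # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(2):
                # standard inertia + cognitive + social velocity update
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i][:]
                if cost(pbest[i]) < cost(gbest):
                    gbest = pbest[i][:]
        # elitist twist: the worst particles restart near the global best
        k = max(1, int(elite_frac * n))
        ranked = sorted(range(n), key=lambda i: cost(pbest[i]))
        for i in ranked[-k:]:
            pos[i] = [g + rng.uniform(-0.5, 0.5) for g in gbest]
    return gbest

best = pso()
```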
Chapter
Cancer is one of the leading causes of death. According to the World Health Organization, lung cancer was the most common cause of cancer deaths in 2020, with over 1.8 million deaths. Lung cancer mortality can therefore be reduced with early detection and treatment. Early detection requires screening and accurate detection of the tumor for staging and treatment planning. Thanks to advances in medicine, nuclear medicine has become the forefront of precise lung cancer diagnosis. Currently, PET/CT is the preferred diagnostic modality for lung cancer detection. However, variable results and noise in the imaging modalities, together with the lung's complexity as an organ, have made it challenging to identify lung tumors from clinical images. In addition, factors such as respiration can cause blurry images and introduce other artifacts. Although nuclear medicine is at the forefront of diagnosing, evaluating, and treating various diseases, it is highly dependent on image quality, which has led to many approaches, such as the fusion of modalities to evaluate the disease. Fusion of diagnostic modalities can be accurate only when well-processed images are acquired, which is challenging due to different diagnostic machines and the external and internal factors associated with lung cancer patients. Current works focus on single imaging modalities for lung cancer detection, and no specific techniques have been identified individually for PET and CT images for attaining effective, noise-free hybrid imaging for lung cancer detection. Based on the survey, it has been identified that several image preprocessing filters are used for different noise types. However, for successful preprocessing, it is essential to identify the types of noise present in PET and CT images and the appropriate techniques that perform well for these modalities.
Therefore, the primary aim of the review is to identify efficient preprocessing techniques for noise and artifact removal in PET/CT images that can preserve the critical features of the tumor for accurate lung cancer diagnosis.
Article
In practice, stock market behavior is difficult to predict accurately because of its high volatility. To improve market forecasts, a method inspired by the Elman neural network and quantum mechanics is presented. To render the network sensitive to dynamic information, the internal self-connection signal, which is extremely useful for system modeling, is introduced into the proposed technique. A double-chains quantum genetic algorithm is employed to tune the learning rates. The model is validated by forecasting the closing prices of six stock markets; the simulation results indicate that the proposed algorithm is feasible and effective. Accordingly, generalizing the method is deemed advantageous.
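The defining feature of an Elman network, the context units feeding the previous hidden state back into the hidden layer, can be sketched in a few lines. The forward pass below uses random placeholder weights and a linear readout purely for illustration; it omits training and the quantum genetic tuning entirely.

```python
import math, random

def elman_forward(sequence, n_hidden=4, seed=0):
    """Forward pass of a minimal Elman-style recurrent net (placeholder weights)."""
    rng = random.Random(seed)
    w_in = [rng.uniform(-1, 1) for _ in range(n_hidden)]            # input -> hidden
    U = [[rng.uniform(-1, 1) for _ in range(n_hidden)]
         for _ in range(n_hidden)]                                  # context -> hidden
    v = [rng.uniform(-1, 1) for _ in range(n_hidden)]               # hidden -> output
    h = [0.0] * n_hidden                                            # context units start at zero
    outputs = []
    for x in sequence:
        # the previous hidden state h (the "context") feeds back into the update
        h = [math.tanh(w_in[i] * x + sum(U[i][k] * h[k] for k in range(n_hidden)))
             for i in range(n_hidden)]
        outputs.append(sum(v[i] * h[i] for i in range(n_hidden)))   # linear readout
    return outputs

preds = elman_forward([0.1, 0.2, 0.3])
```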
Article
The construction of metropolises in smart cities is the trend in developed countries. However, it may cause damage to surrounding structures. For this reason, diaphragm walls have been applied to prevent the deformation or collapse of the surrounding structures. Diaphragm walls can deflect due to swelling pressure or other geotechnical conditions during construction. Therefore, the accurate prediction of diaphragm wall deflection (DWD) is challenging in construction aiming to ensure the safety of the surrounding structures. This study therefore proposes two intelligent models for predicting DWD induced by deep braced excavations, based on the finite element method (FEM) and metaheuristic algorithms. Accordingly, a total of 1120 finite element analyses were carried out to investigate the behavior of DWD. Finally, a multilayer perceptron (MLP) neural network was optimized by two metaheuristic algorithms, whale optimization (WO) and Harris hawks optimization (HHO), for predicting DWD; the resulting models are called MLP-WO and MLP-HHO, respectively. The results indicated that the proposed MLP-HHO and MLP-WO provided high accuracy in predicting DWD. A comparison of the proposed models with previous studies was also discussed to highlight the superiority of the MLP-HHO and MLP-WO models.
Chapter
This article develops a deterministic inventory model for the wine industry for deteriorating items with two storage systems and time-dependent demand with partial backlogging. Inventory is transferred to OW according to the RW bulk-discharge pattern, and transport costs are considered negligible. The deterioration rates in the two warehouses are constant but different, owing to the different storage methods. Particle swarm optimization is applied under the LOFO dispatching policy. The holding cost is considered constant up to a certain point in time, after which it increases. Particle swarm optimization with varying population sizes is used to solve the model. In this particle swarm optimization, a fraction of the best children is incorporated into the parent population for the next generation, and the size of that subset is a percentage of the size of the parent set. A numerical example is presented to validate the model, and a sensitivity analysis is performed separately for each parameter.
Chapter
In this article, an inventory model for the sugar industry is developed for increasing demand with a dual-warehouse system. The owned warehouse (OW) has a capacity of W units, while the rented warehouse (RW) has unlimited capacity. The holding cost in RW is assumed to be higher than that in OW. Shortages are permitted, and the sugar deteriorates over time at rates that differ between the warehouses. The effect of increases in the various costs associated with marketing in the sugar system is also considered. Numerical examples are used to study the behavior of the model, and discounting is applied to obtain the present worth of the total cost.
Article
Full-text available
This research work develops a two-warehouse inventory model for non-instantaneous deteriorating items with interval-valued inventory costs and stock-dependent demand under inflationary conditions. The proposed inventory model permits shortages, and the backlogging rate is variable and dependent on the waiting time for the next order, and inventory parameters are interval-valued. The main aim of this research is to obtain the retailer’s optimal replenishment policy that minimizes the present worth of total cost per unit time. The optimization problems of the inventory model have been formulated and solved using two variants of particle swarm optimization (PSO) and interval order relations. The efficiency and effectiveness of the inventory model are validated with numerical examples and a sensitivity analysis. The proposed inventory model can assist a decision maker in making important replenishment decisions.
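Interval order relations of the kind the model relies on can be sketched concretely. The relation below (compare interval centers, break ties by half-width) is one common choice in interval-valued optimization; the paper's exact relation may differ.

```python
def center(iv):
    """Midpoint of an interval (lo, hi)."""
    return (iv[0] + iv[1]) / 2

def half_width(iv):
    """Half-width, a measure of the interval's uncertainty."""
    return (iv[1] - iv[0]) / 2

def interval_less(a, b):
    """True if interval cost a is preferred to b for a minimization problem."""
    if center(a) != center(b):
        return center(a) < center(b)
    return half_width(a) < half_width(b)   # less uncertain interval wins ties

# pick the preferred interval-valued cost among candidates
costs = [(10, 20), (12, 14), (9, 23)]
best = costs[0]
for c in costs[1:]:
    if interval_less(c, best):
        best = c
```

Here the interval (12, 14) wins: its center (13) is lower than the centers of (10, 20) and (9, 23), even though (9, 23) has the smallest lower bound.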
Article
Full-text available
This paper presents a model and an algorithm for an inventory ship routing problem (ISRP). It consists of two main parts: the development of a model for the ship routing problem in a multi-product inventory with a heterogeneous fleet, and the development of an algorithm to solve the problem. ISRP considers several parameters, including deadweight tonnage (DWT), product compatibility, port setup, and compartment washing costs. Given these parameters, the objective is to minimize the total cost, which consists of traveling, port setup, ship charter, and compartment washing costs. From the resulting model, two major steps are used to solve the problem: the first is to select the ships so as to satisfy the constraint that restricts the mooring rule; the second is to find the best route, product allocation, and shipped quantities. ISRP is an NP-hard problem, and finding its solution requires high computation time. A new hybrid metaheuristic, the cross entropy-genetic algorithm (CEGA), was proposed to solve ISRP. The results were then compared with those of a hybrid tabu search to measure CEGA's performance. The results showed that CEGA provided better solutions than the hybrid tabu search.
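The cross-entropy half of CEGA can be illustrated on a toy 0/1 selection problem. Everything below is an assumption for illustration: the item weights, the soft cap of two items, and the smoothing parameter; the actual CEGA couples this sampling scheme with genetic operators on the ISRP encoding.

```python
import random

WEIGHTS = [4, 7, 1, 9, 3]   # illustrative item values

def score(x):
    """Value of the selection, with a heavy penalty beyond two items."""
    total = sum(w for w, xi in zip(WEIGHTS, x) if xi)
    return total - 100 * max(0, sum(x) - 2)

def cross_entropy(n=5, pop=50, elite=10, iters=40, alpha=0.3, seed=0):
    rng = random.Random(seed)
    p = [0.5] * n                                # Bernoulli sampling parameters
    for _ in range(iters):
        samples = [[1 if rng.random() < p[j] else 0 for j in range(n)]
                   for _ in range(pop)]
        samples.sort(key=score, reverse=True)
        top = samples[:elite]                    # elite fraction of the population
        # smoothed update of the sampling distribution toward the elite mean
        p = [(1 - alpha) * p[j] + alpha * sum(s[j] for s in top) / elite
             for j in range(n)]
    return [1 if pj > 0.5 else 0 for pj in p]

best_pick = cross_entropy()
```

The sampling distribution concentrates on the two highest-value items (weights 9 and 7), which is the cross-entropy mechanism in miniature: sample, rank, and pull the distribution toward the elite.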
Article
Full-text available
In this research, an effective approach based on Multivariable Linear Regression (MVLR) and Genetic Algorithm (GA) methods has been applied to study the effect of working conditions on occupational injury, using data on occupational accidents accumulated by ship repair yards. The work aims at developing a computational model that uses soft computing techniques to assess occupational risk in shipyard workplaces from occupational accident data. For each accident, the following parameters have been considered as the model's input features: day and time, the individual's specialty, type of incident, and the dangerous situations and dangerous actions involved. Reported accident data were used as the training data for the MVLR model to map the relationship between working conditions and occupational risk. With the fitness function based on this model, genetic algorithms were used to predict occupational risk, taking into consideration the severity and frequency of the occupational accidents recorded by ship repair yards. The working-parameter values for minimum occupational risk were obtained using GAs. By comparing the predicted values with the reported data, it was demonstrated that the proposed model is a useful and efficient method for predicting the risk of occupational injury.
Article
Reducing system cost and achieving significant profit are key factors for every successful business sector. A consignment contract under a distribution-free approach may be a fruitful combination for achieving a profitable business. This model deals with a single-period newsvendor problem with a consignment policy. The consignment policy is an agreement between two parties, the consignor and the consignee. Under the Stackelberg approach, the firms act as leader and follower. Both parties carry some part of the holding cost instead of one party bearing it alone. A new policy for paying the fixed fee to the consignee is introduced. This paper assumes no specific probability distribution for customer demand, only a known mean and standard deviation. An efficient approach is proposed to reduce the retailer's cost and build a sustainable consignment contract. The solution of this model is obtained using the distribution-free approach. A comparison between the traditional supply chain policy and the consignment policy is established, and the price sensitivity of demand is analyzed. Numerical examples and graphical representations are given for both the traditional and the consignment policy. The results show that the consignment policy dominates the traditional policy, and a significant reduction in the retailer's royalty is found.
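The distribution-free idea the paper builds on goes back to Scarf's min-max newsvendor rule, which needs only the demand mean and standard deviation. The sketch below implements that classic rule with illustrative numbers; the paper's consignment setting adds further terms not shown here.

```python
import math

def scarf_order_quantity(mu, sigma, price, cost):
    """Scarf's (1958) distribution-free order quantity:
    Q* = mu + (sigma/2) * (sqrt((p-c)/c) - sqrt(c/(p-c))),
    which maximizes worst-case expected profit over all demand
    distributions with the given mean and standard deviation."""
    assert price > cost > 0
    ratio = (price - cost) / cost          # underage-to-overage cost ratio
    return mu + (sigma / 2) * (math.sqrt(ratio) - math.sqrt(1 / ratio))

# illustrative numbers: mean demand 100, std 20, sell at 15, buy at 5
q = scarf_order_quantity(mu=100, sigma=20, price=15, cost=5)
```

Note that when the margin is high (ratio > 1) the rule orders above the mean, and when it is low it orders below; with these numbers it orders about 107 units.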
Book
Optimal control methods are used to determine optimal ways to control a dynamic system. The theoretical work in this field serves as a foundation for the book, which the authors have applied to business management problems developed from their research and classroom instruction. Sethi and Thompson have provided management science and economics communities with a thoroughly revised edition of their classic text on Optimal Control Theory. The new edition has been completely refined with careful attention to the text and graphic material presentation. Chapters cover a range of topics including finance, production and inventory problems, marketing problems, machine maintenance and replacement, problems of optimal consumption of natural resources, and applications of control theory to economics. The book contains new results that were not available when the first edition was published, as well as an expansion of the material on stochastic optimal control theory.
Article
Due to globalization and extreme competition in today's economy, knowledge management, the systematic management of an organization's knowledge assets for creating value and shaping strategy, plays an important role in finding new supply chain strategies for interacting with and satisfying customer demand. Consequently, current business models are continuously developing, incorporating new trends, new industrial areas, and even new models. In this paper, the authors propose a bi-level optimization model for producers and collection centers to achieve maximum profits for the channel members. In the bi-level optimization, the collection centers are the leaders, and the producers are the followers of the strategies taken by the collection centers. The profits of the channel members are maximized using the General Algebraic Modeling System software. Moreover, the Shapley value approach from game theory is applied for coalitions of the members in a collaborative system, and it is compared with bi-level optimization based on data collected from the cocoa agro-industry. The best way to distribute the profits among the participants is determined using Microsoft Excel and Java as the programming language. The final results indicate that higher profits are obtained in a collaborative system than in an individual system.
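The Shapley value used for profit distribution can be computed directly from its definition as the average marginal contribution over all join orders. The three-player characteristic function below is an illustrative placeholder, not data from the cocoa agro-industry case.

```python
from itertools import permutations

# Illustrative characteristic function v(S): profit of each coalition S.
v = {frozenset(): 0, frozenset('A'): 10, frozenset('B'): 20, frozenset('C'): 30,
     frozenset('AB'): 40, frozenset('AC'): 50, frozenset('BC'): 60,
     frozenset('ABC'): 90}

def shapley(players, v):
    """Shapley value: average marginal contribution over all join orders."""
    phi = {p: 0.0 for p in players}
    perms = list(permutations(players))
    for order in perms:
        coalition = frozenset()
        for p in order:
            # marginal contribution of p when joining this coalition
            phi[p] += v[coalition | {p}] - v[coalition]
            coalition |= {p}
    return {p: phi[p] / len(perms) for p in phi}

phi = shapley('ABC', v)
```

By construction the values are efficient: they sum to v(ABC), the grand-coalition profit, which is exactly the property that makes them a fair split in the collaborative system.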
Article
Hybrid metaheuristic algorithms (HMHAs) have gained considerable attention for their capability to solve difficult problems in different fields of science. This chapter introduces some applications of HMHAs to inventory theory problems. Three basic inventory problems, the joint replenishment EOQ problem, the newsboy problem, and the stochastic review problem, in certain and uncertain environments (stochastic, rough, and fuzzy) with six different applications, are considered. Several HMHAs, such as the genetic algorithm (GA), simulated annealing (SA), particle swarm optimization (PSO), harmony search (HS), variable neighborhood search (VNS), and bee colony optimization (BCO), are used to solve the inventory problems. The proposed metaheuristic algorithms are also combined with fuzzy simulation, rough simulation, Pareto selection, and goal programming approaches. The computational performance of all of these methods on the three optimization problems is compared.
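The deterministic core of the joint replenishment problem mentioned above is the classic EOQ trade-off between ordering and holding costs, sketched here with illustrative numbers.

```python
import math

def eoq(demand_rate, order_cost, holding_cost):
    """Economic order quantity Q* = sqrt(2DK/h), balancing the per-cycle
    ordering cost K against the per-unit holding cost h at demand rate D."""
    return math.sqrt(2 * demand_rate * order_cost / holding_cost)

# illustrative numbers: 1200 units/year demand, $50 per order, $6/unit/year to hold
q_star = eoq(demand_rate=1200, order_cost=50, holding_cost=6)
```

At Q* the annual ordering cost D*K/Q equals the annual holding cost h*Q/2, which is the balance condition the metaheuristics in the chapter generalize to the multi-item, uncertain settings.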