Article

A systematic literature review into simulation for building operations management theory: reaching beyond positivism?

Taylor & Francis on behalf of the Operational Research Society
Journal of Simulation
... Because in all these cases the simulation is to serve as a mediator for our understanding of reality [59], a central requirement is for it to be credible [60] or trustworthy [44]. This is especially true in intervention-oriented fields [61], like OR, where a post-positivist view requires stakeholders' participation to ensure the simulation impacts practice [62]. ...
Article
Full-text available
Background: Simulation of business processes allows decision-makers to explore the implications and trade-offs of alternative approaches, policies and configurations. Trust in the simulation as a stand-in proxy of the real system depends on the validation of the computer model as well as on that of the data used to run it and judge its behaviour. Though validation frameworks exist, they provide little guidance for validation in the context of data-poor endeavours, such as those where observations sourced from historical records were acquired for purposes other than the simulation itself. As the simulation of complex business systems such as logistic distribution networks can only rely on this type of data, there is a need to address this void, provide guidance for practitioners and foster the conversation among academics. This paper presents a high-level development and validation framework applicable to simulation in data-poor environments for modelling the process of bulk distribution of commodities. Method: Traditionally accepted approaches were synthesised into a flexible three-stage modelling and validation approach to guide the process and improve the transparency of adapting available data sources for the simulation itself. The framework suggests parallel paths for the development of computer and data models which, in the last stage, are merged into a phenomenological model resulting from the combination of both. The framework was applied to a case study involving the distribution of bulk commodities over a country-wide network to show its feasibility. Results: The method was flexible, inclusive of other frameworks, and suggested considerations to be made during the acquisition and preparation of data to be used for the modelling and exploration of uncharted scenarios. Conclusions: This work provides an integrative, transparent, and straightforward method for validating exploratory-type simulation models for endeavours in which observations cannot be acquired through direct experimentation on the target system.
... This study uses a postpositivist research paradigm suitable for systematic literature reviews (SLRs), which synthesize existing theoretical and empirical findings without direct data collection. The postpositivist approach acknowledges the limitations of achieving absolute truth, yet it strives for objectivity through rigorous and structured scrutiny of theories against observed realities (Kabak et al., 2024). This methodological framework underpins the study's purpose to explore and propose future research agendas regarding DT's impact on social sustainability. ...
Article
Full-text available
Purpose This paper aims to systematically review the constructive effects of digital transformation (DT) on social sustainability, examining its impact across democracy and governance, social cohesion, quality of life, equality and diversity. It emphasizes the need for appropriate frameworks that incorporate DT strategies in organizational practices to improve social sustainability. Design/methodology/approach A systematic literature review was carried out through Web of Science and Scopus databases to identify the distinctive papers that explored the impact of DT on social sustainability. It analyzes how various digital technologies, like Internet of Things, cloud computing and mobile computing, can be strategically embedded in organizational practices to optimize social sustainability outcomes. Findings This study reveals that although DT significantly enhances operational capabilities and consumer experiences, its integration into social sustainability practices is often overlooked. It proposes a novel framework that aligns digital capabilities with sustainability goals, aiming to enrich organizational performance and societal welfare. This paper identifies dynamic capabilities as crucial for sustaining competitive advantage in a digitally transformed business landscape. Research limitations/implications The primary limitation is the reliance on secondary data, which may not fully capture the rapid advancements in DT. Future research should focus on empirical studies to validate the proposed framework and explore the dynamic capabilities required for integrating DT in social sustainability practices. Originality/value This study extends the discourse on DT by linking it explicitly with social sustainability, presenting a structured analysis that highlights the need for strategic integration of digital technologies within organizational sustainability practices. It fills a gap in the literature by proposing a comprehensive framework for organizations to follow, thereby contributing to the theoretical and practical understanding of DT’s role in enhancing social sustainability.
... Kabak et al. [42] argue for the potential and role of simulation modeling in operations management theory building from both positivist and post-positivist perspectives. Based on this, we will utilize MATLAB to validate the model's conclusions with real-world cases and perform sensitivity analysis on various parameters of the system in the next step. ...
Article
Full-text available
The rapid development of supply chain finance (SCF) has significantly alleviated the financing difficulties of small and medium-sized enterprises (SMEs). However, it is important to recognize that within the accounts receivable financing segment of the SCF credit market, the credit risk associated with SMEs poses a serious challenge and potential threat to the stability, health, and sustainable development of the SCF system. This paper pays special attention to the stability of the two-party evolutionary game between SMEs and financial institutions (FIs) within the context of the Chinese SCF credit market. To identify a pathway to reduce credit risks for SMEs while simultaneously enhancing system stability, this paper adopts the stochastic evolutionary game (SEG) model and combines it with the fixed-point method to determine the conditions that satisfy the p-th mean square stability of the system. This study has made attempts in various aspects, such as the innovative construction and investigation of a nonlinear SEG model, the endeavor to study the stability of SEG systems using fixed-point methods, and the innovative construction of a more realistic two-player SEG system. The data and simulation results generated from hypothetical scenarios show that the conclusions of the article are credible and feasible. Through the study, we conclude that a higher credit ratio from FIs and a higher penalty intensity from core enterprises (CEs) will accelerate the stability of the system. Based on solid data and modeling analysis, insights into the regulation of FIs are provided.
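As a rough illustration of the evolutionary-game machinery the abstract builds on, the following Python sketch integrates deterministic two-population replicator dynamics for hypothetical SME and FI strategies; the payoff values are placeholders, and the paper's stochastic formulation and stability analysis are not reproduced here.

```python
import numpy as np

# Minimal deterministic replicator dynamics for a two-population game
# (SMEs choosing "honor" vs "default"; FIs choosing "lend" vs "reject").
# Payoff entries are hypothetical placeholders, not the paper's calibration.
a = np.array([[3.0, -1.0],   # SME payoffs: rows = SME strategy, cols = FI strategy
              [4.0, -2.0]])
b = np.array([[2.0, 0.0],    # FI payoffs: rows = SME strategy, cols = FI strategy
              [-3.0, 0.0]])

x, y = 0.5, 0.5              # share of SMEs honoring, share of FIs lending
dt, steps = 0.01, 20000
for _ in range(steps):
    # Expected payoff of each pure strategy against the current opponent mix.
    sme_payoffs = a @ np.array([y, 1 - y])
    fi_payoffs = np.array([x, 1 - x]) @ b
    x += dt * x * (sme_payoffs[0] - (x * sme_payoffs[0] + (1 - x) * sme_payoffs[1]))
    y += dt * y * (fi_payoffs[0] - (y * fi_payoffs[0] + (1 - y) * fi_payoffs[1]))

print(f"final share of honoring SMEs: {x:.3f}, lending FIs: {y:.3f}")
```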
Article
Full-text available
In the literature of operations management, the reliability of multistage manufacturing systems has been always modeled with uncorrelated failure processes where the reliability of each machine is assumed to be independent of any failure in the other machines. However, in real-life, machines may be subject to complex correlated failures such as increased degradation and tool wear caused by defective parts produced in preceding machines. Ignoring the correlation effect when modeling the reliability of multistage systems generally results in inaccurate estimation of the overall system reliability and inefficient operations policy accordingly. In this paper, we deal with the problem of integrated production, quality and maintenance control of production lines where machines are subject to quality and reliability operation-dependent degradation. Also, machines’ reliability is correlated with the level of incoming product quality. For illustration, we study in this paper a two-machine line model. We propose a combined mathematics and simulation-based modeling framework to jointly optimize the production, quality and maintenance control settings. The objective is to minimize the total cost incurred under a constraint on the outgoing quality. Numerical examples are given to show the effectiveness of the resolution approach and to study important aspects in multistage systems such as the allocation of inspection and maintenance efforts, the Quality-Reliability chain and the interdependence between production, quality and maintenance control settings. The results obtained demonstrate that failure correlation has a significant impact on the optimal control settings and that maintenance and quality control activities in preceding stages can play an important role in the reliability improvement of the subsequent machines.
Article
Full-text available
Because of their higher processing priority, hot lots often interrupt the production of regular lots in a Thin-Film-Transistor Liquid-Crystal Display (TFT-LCD) fab. As a result, how to manage the production of hot lots so that their production demand can be met and their negative effects on regular lots minimized is a very important issue. In this paper, we identify three problems – the lot selection of interbay AGVs, the lot selection of intrabay machines, and the photo bay selection of lots – that can affect the production of hot lots and regular lots and propose methods for them. A fuzzy-based dynamic bidding (FBDB) method is proposed for the first two problems. The bidding functions in this FBDB method consider several attributes of the current system to obtain the true values of bids. An earliest possible time (EPT) method that also considers several attributes of the current system is proposed for the third problem. These two methods are compared with methods used by a Taiwanese TFT-LCD fab through computer simulations. The effects of hot lot ratios on the performance of these methods are also analyzed. Six performance measures are adopted to measure the throughput and tardiness performance of all lots, regular lots and hot lots. The simulation results show that applying the FBDB and EPT methods to the three problems studied here can result in very good results in all performance measures.
Article
Full-text available
This article examines five common misunderstandings about case-study research: (1) Theoretical knowledge is more valuable than practical knowledge; (2) One cannot generalize from a single case, therefore the single case study cannot contribute to scientific development; (3) The case study is most useful for generating hypotheses, while other methods are more suitable for hypotheses testing and theory building; (4) The case study contains a bias toward verification; and (5) It is often difficult to summarize specific case studies. The article explains and corrects these misunderstandings one by one and concludes with the Kuhnian insight that a scientific discipline without a large number of thoroughly executed case studies is a discipline without systematic production of exemplars, and that a discipline without exemplars is an ineffective one. Social science may be strengthened by the execution of more good case studies.
Conference Paper
Full-text available
Several articles have addressed the issue of rankings for Operations Management journals. Such rankings are important because promotion and tenure decisions are based on journals in which research is published. Soteriou, Hadjinicola and Patsia (1999) noted that the publication rates of European researchers in highly ranked U.S. journals was low. Barman, Hanna and LaForge (2001) furthermore mentioned the complexity of journal perceptions. We started from the idea that research paradigms may influence journal perceptions and that they might also influence article selection within journals. We set out to conduct an exploratory analysis on this topic by focusing on four highly ranked
Article
Full-text available
This article identifies a methodological void in operations management (OM) research as the lack of empirical theory building. It addresses two questions: What is empirical science? and How can empirical theory building be nurtured in the area? The rationale for theory building in this article is derived from the classical empirical science perspective. Further, an empirically sound model OM theory is identified and evaluated, and selected theory-like statements and informal "theories" embedded in the OM literature are classified into an accepted classification scheme made up of grand theories, middle range theories, and empirical generalizations.
Article
Full-text available
The area of asset maintenance is becoming increasingly important as greater asset availability is demanded. This is evident in increasingly automated and more tightly integrated production systems as well as in service contracts where the provider is contracted to provide high levels of availability. Simulation techniques are able to model complex systems such as those involving maintenance and can be used to aid performance improvement. This paper examines engineering maintenance simulation research and applications in order to identify apparent research gaps. A systematic literature review was conducted in order to identify the gaps in maintenance systems simulation literature. Simulation has been applied to model different maintenance sub-systems (asset utilisation, asset failure, scheduling, staffing, inventory, etc.) but these are typically addressed in isolation and overall maintenance system behaviour is poorly addressed, especially outside of the manufacturing systems discipline. Assessing the effect of Condition Based Maintenance (CBM) on complex maintenance operations using Discrete Event Simulation (DES) is absent. This paper categorises the application of simulation in maintenance into eight categories.
Article
Full-text available
The main contribution of this paper is twofold: it surveys different types of simulation for supply chain management, and it discusses several methodological issues. These different types of simulation are spreadsheet simulation, system dynamics, discrete-event simulation and business games. Which simulation type should be applied depends on the type of managerial question to be answered by the model. The methodological issues concern validation and verification, sensitivity, optimisation, and robustness analyses. The sensitivity analysis yields a shortlist of the truly important factors in large simulation models with (say) a hundred factors. The robustness analysis optimises the important factors controllable by management, while accounting for the noise created by the important non-controllable, environmental factors. The various methodological issues are illustrated by a case study involving the simulation of a supply chain in the mobile communications industry in Sweden. In general, simulation is important because it may support the quantification of the benefits resulting from supply chain management.
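To make the idea of shortlisting the "truly important factors" concrete, here is a minimal one-factor-at-a-time screening sketch in Python; the simulated response and factor ranges are hypothetical, and a designed experiment (as favoured in this literature) would replace the naive loop in practice.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(factors: np.ndarray) -> float:
    """Hypothetical supply-chain simulation response (e.g. total cost);
    stands in for a real discrete-event or spreadsheet model."""
    noise = rng.normal(0.0, 1.0)
    return 5.0 * factors[0] - 2.0 * factors[1] + 0.1 * factors[2:].sum() + noise

n_factors = 10
low, high = np.zeros(n_factors), np.ones(n_factors)
base = 0.5 * (low + high)

# One-factor-at-a-time screening: swing each factor between its extremes
# while holding the others at base values, and rank the observed effects.
effects = []
for k in range(n_factors):
    lo_point, hi_point = base.copy(), base.copy()
    lo_point[k], hi_point[k] = low[k], high[k]
    effects.append(abs(simulate(hi_point) - simulate(lo_point)))

shortlist = np.argsort(effects)[::-1][:3]
print("shortlisted factors:", shortlist.tolist())
```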
Conference Paper
Full-text available
Experienced and wise industrial engineering educators and practitioners have long understood that industrial engineering is a coherent discipline encompassing techniques that work best synergistically, not a motley collection of specialized techniques each isolated in a separate chimney. As an example of the synergies which industrial engineering can bring to process improvement in a production environment, this case study presents the integrated use of process simulation, production scheduling, and detailed analysis of material-handling methods and their improvement. The study undertook the identification and improvement of production and scheduling policies to the benefit of a manufacturing process whose original throughput capacity fell significantly short of high and increasing demand.
Article
Full-text available
To date, the theory of production in operations management has lacked a production strategy for one major segment of the manufacturing industry. For large engineered equipment, a relatively recent but increasingly common production strategy has arisen to better meet today's competitive pressures for faster delivery of more customized products without increasing costs. A hybrid of the make‐to‐order (MTO) and make‐to‐stock (MTS) production strategies, manufacturers launch major product models to a demand forecast (MTS) and then modify the partially completed products as the actual orders arrive (MTO), a production strategy we refer to as make‐to‐forecast (MTF). This paper has two purposes: (1) it describes and conceptualizes the MTF situation in a variety of industries and places the MTF strategy among the other major production strategies in the theory of production and (2) it analyzes decision rules for matching partially completed units to incoming customer orders—one of the unique and perhaps most demanding challenges of the MTF environment. It shows that two order matching rules developed in the paper outperform the ad hoc rules commonly used in practice. We test and confirm the generalizability of the superior performance of these two rules in 13 different industry variations of the MTF production situation. Last, the insights provided by the model are discussed in terms of their practical relevance.
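The paper's two matching rules are not reproduced here; as a hedged illustration of what an MTF order-matching rule can look like, the sketch below greedily assigns each incoming order to the partially completed unit requiring the least rework, with hypothetical rework-hour estimates.

```python
# Illustrative make-to-forecast order matching: assign each incoming order to
# the in-process unit that requires the least modification work. This is a
# generic greedy rule, not one of the rules developed in the paper; the
# rework-hour estimates are hypothetical.
rework_hours = {                 # rework_hours[order][unit]
    "order1": {"unitA": 5.0, "unitB": 2.0, "unitC": 9.0},
    "order2": {"unitA": 3.0, "unitB": 4.0, "unitC": 1.0},
    "order3": {"unitA": 6.0, "unitB": 7.0, "unitC": 8.0},
}

available = set()
for costs in rework_hours.values():
    available.update(costs.keys())

assignment = {}
for order, costs in rework_hours.items():      # orders arrive in sequence
    best_unit = min((u for u in costs if u in available), key=costs.get)
    assignment[order] = best_unit
    available.remove(best_unit)

print(assignment)   # order1 -> unitB, order2 -> unitC, order3 -> unitA
```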
Article
Full-text available
The extraordinary importance of the services sector for the economy both in production and employment cannot be denied. As a result, there have been both demands for an increase in Service Operations Management (SOM) research since the 1980s on the one hand, and, on the other, predictions that such an increase will take place [e.g.: Buffa, E.S., 1980. Research in Operations Management. Journal of Operations Management 1 (1), 1–8; Miller, J.G., Graham, M.B.W., Freeland, J.R., Hottenstein, M., Maister, D.M., Meredith, J., Schmenner, R.W., 1981. Production/Operations Management: agenda for the 80s. Decision Science 12 (4), 547–571; Mabert, V.A., 1982. Service Operations Management: research and application. Journal of Operations Management 2 (4), 203–209; Amoako-Gyampah, K., Meredith, J.R., 1989. The Operations Management research agenda: an update. Journal of Operations Management 8 (3), 250–262; Chase, R.B., 1996. The mall is my factory: reflections of a service junkie. Production and Operations Management 5 (4), 298–308; Pannirselvam, G.P., Ferguson, L.A., Ash, R.C., Siferd, S.P., 1999. Operations Management research: an update for the 1990s. Journal of Operations Management 18 (1), 95–112; Roth, A.V., Menor, L.J., 2003. Insights into service Operations Management: a research agenda. Production and Operations Management 12 (2), 145–164; Slack, N., Lewis, M., Bates, H., 2004. The two worlds of Operations management research and practice. Can they meet, should they meet? International Journal of Operations and Production Management, 24 (4), 372–387]. And yet, the amount of SOM research done in OM research has still only been minimal. This contradiction calls for an in-depth study of the state of affairs of SOM research, and we have conducted just such a study in 10 of the most relevant and representative outlets in the OM field, as well as in pipeline research (Proceedings of the POMS, DSI and EurOMA Conferences). Our results aim to provide answers to the following questions, amongst others: (1) Is greater importance now attached to SOM research within OM research? (2) What are the main topics of research? Are they the same topics that have been proposed in SOM research agendas? (3) What methods are used in SOM research? (4) What are the most commonly studied sectors of economic activities? (5) Are there any differences from OM research in terms of content and methods? Some of our findings show that although a growth in SOM research had been predicted in earlier studies, there is still only a minimal amount done (7.5% of OM research); seven topics command 61.5% of SOM research, which (with some exceptions) is consistent with SOM research agendas; there is more research done on strategic issues than on tactical/operational issues; models and simulations are still more common than empirical research, but trends point to a shift to the latter; the majority of research focuses on a specific sector, and three sectors account for 50% of the total; there is a clear connection between type of journal and type of research, research method and sector of activity.
Article
Full-text available
Bottlenecks within a production line significantly reduce productivity. Quick and correct identification of bottleneck locations can lead to improved operations management: better utilisation of finite manufacturing resources, increased system throughput and a lower total cost of production. Most of the current bottleneck detection schemes focus on the long-term bottleneck detection problem and an analytical or simulation model is usually needed. Due to recent developments, short-term process control and quick decision making on the plant floor have emerged as important qualities for operation management. This research proposes a new data driven method for throughput bottleneck detection in both the short and long term. The method utilises the production line blockage and starvation probabilities and buffer content records to identify the production constraints without building an analytical or simulation model. The method has been verified both analytically and by simulation. An industrial case study has also been used in order to demonstrate the implementation and validate the efficiency of the proposed bottleneck detection method.
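The abstract does not state the exact decision rule, so the sketch below uses a common data-driven heuristic consistent with its description: the machine that is least often blocked or starved is flagged as the likely throughput bottleneck. The probability values are hypothetical.

```python
# Data-driven bottleneck detection from blockage/starvation records.
# Heuristic consistent with the abstract (not necessarily the paper's exact
# rule): the machine least often blocked or starved is the likely bottleneck.
# Probabilities below are hypothetical shop-floor estimates.
blockage = {"M1": 0.25, "M2": 0.05, "M3": 0.18}    # fraction of time blocked
starvation = {"M1": 0.02, "M2": 0.08, "M3": 0.30}  # fraction of time starved

active_share = {m: 1.0 - blockage[m] - starvation[m] for m in blockage}
bottleneck = max(active_share, key=active_share.get)

for m, share in sorted(active_share.items(), key=lambda kv: -kv[1]):
    print(f"{m}: active {share:.2f}")
print("likely throughput bottleneck:", bottleneck)
```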
Article
Full-text available
Members of the editorial board of Action Research responded to the question, `Why action research?' Based on their responses and the authors' own experiences as action researchers, this article examines common themes and commitments among action researchers as well as exploring areas of disagreement and important avenues for future exploration. We also use this opportunity to welcome readers of this new journal and to introduce them to members of the editorial board.
Book
The relationship between theory and practice, research and action, is fundamental to all fields of applied social science. Should research findings and knowledge be useful for science, practice, and policy? If so, how should such research be designed, carried out and disseminated to achieve the twin goals of rigor and relevance? These challenges are particularly relevant in the applied areas of management and organization studies where there is a distinct responsibility for researchers to engage with the ‘real world’. In this carefully crafted and thoughtful book, leading management researcher Andrew Van de Ven both presents the broad intellectual challenge of ‘engaged scholarship’, and also sets out a clear framework and guidelines for carrying out soundly based and useful research for advancing both science and practice. At a time when some may question the value and status of academic knowledge; and others, contrastingly, urge a closer relationship between researchers and research users – be they businesses, governments or other institutions – the challenge of engaged scholarship is as relevant as ever, and there is a real need for the thoughtful and considered approach offered by Van de Ven. The book both provides a manifesto for engaged scholarship in the social sciences, and clear framework for research design and methodology. It will be an invaluable reference point and guide for academics, researchers and graduate students across the social sciences concerned with rigorous and relevant research in the contemporary world.
Article
Health and safety of human operators in production systems is tightly linked to overall work system design. Well-designed production systems must grant adequate allowances to operators, to cope with the variability of predictable situations, whether normal or not. In this article, the time allowances concept is presented from two standpoints: industrial engineering and the occupational risk prevention field. This is followed by the introduction of a new time allowances indicator. Combined with an agent-based simulation model, this indicator can be used to assess time allowances in a given work organization during its design process. The proposed model integrates the worker's fatigue, learning and reliability as relevant human factors. To support the article's proposal, three production system configurations are investigated: a classical production line designed based on the Ford work paradigm, a cellular configuration with autonomous breaks, inspired by the Volvo Uddevalla plant, and finally a hybrid configuration, cellular with planned breaks. The simulations showed that the cellular configuration with autonomous breaks grants operators the time allowances they need to carry out their work activity safely and efficiently.
Book
Offering an up-to-date account of systems theories and its applications, this book provides a different way of resolving problems and addressing challenges in a swift and practical way, without losing overview and grip on the details. From this perspective, it offers a different way of thinking in order to incorporate different perspectives and to consider multiple aspects of any given problem. Drawing examples from a wide range of disciplines, it also presents worked cases to illustrate the principles. The multidisciplinary perspective and the formal approach to modelling of systems and processes of 'Applied Systems Theory' makes it suitable for managers, engineers, students, researchers, academics and professionals from a wide range of disciplines; they can use this 'toolbox' for describing, analysing and designing biological, engineering and organisational systems as well as getting a better understanding of societal problems. This revised, updated and expanded second edition includes coverage of abductive reasoning, the relevance of systems theories for research methods and a new chapter about problem analysis and solving based on systems theories. © Springer International Publishing AG 2017. All rights reserved.
Article
Managing production systems where production rates change over time due to learning and forgetting effects poses a major challenge to researchers and practitioners alike. This task becomes especially difficult if learning and forgetting effects interact across different stages in multi-stage production systems as rigid production management rules are unable to capture the dynamic character of constantly changing production rates. In a comprehensive simulation study, this paper first investigates to which extent typical key performance indicators (KPIs), such as the number of setups, in-process inventory, or cycle time, are affected by learning and forgetting effects in serial multi-stage production systems. The paper then analyses which parameters of such production systems are the main drivers of these KPIs when learning and forgetting occur. Lastly, it evaluates how flexible production control based on Goldratt’s Optimised Production Technology can maximise the benefits learning offers in such systems. The results of the paper indicate that learning and forgetting only have a minor influence on the number of setups in serial multi-stage production systems. The influence of learning and forgetting on in-process inventory and cycle time, in contrast, is significant, but ambiguous in case of in-process inventory. The proposed buffer management rules are shown to effectively counteract this ambiguity.
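As a hedged illustration of how learning and forgetting are often formalised in such studies, the sketch below combines a Wright-type learning curve with exponential decay of experience during interruptions; the exponent, decay constant and lot sizes are illustrative assumptions, not the paper's parameterisation.

```python
import math

# Wright-type learning curve with a simple exponential forgetting model.
# Parameter values are illustrative assumptions, not the paper's calibration.
T1 = 10.0                          # time for the first unit (minutes)
learning_rate = 0.85
b = math.log(learning_rate, 2)     # learning exponent (negative)

def unit_time(cumulative_units: float) -> float:
    """Processing time of the n-th unit under a Wright learning curve."""
    return T1 * cumulative_units ** b

def forget(experience: float, interruption: float, tau: float = 50.0) -> float:
    """Decay accumulated experience exponentially during an interruption."""
    return experience * math.exp(-interruption / tau)

experience = 1.0
for lot in range(1, 4):
    for _ in range(20):                              # produce a lot of 20 units
        experience += 1.0
    print(f"after lot {lot}: unit time {unit_time(experience):.2f} min")
    experience = forget(experience, interruption=30.0)  # idle time between lots
```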
Article
The manufacturing sector as a whole has undergone remarkable changes in terms of scale, complexity and technology over the past decades, and this applies across most modern high-technology manufacturing such as the electronics, semiconductor, aerospace and automotive industries. In order to remain competitive, manufacturers have to produce high-quality products at low cost and at the same time retain sufficient flexibility to meet rapidly changing customer demands. Production planning and control (PPC) is a key role which enables the manufacturer to gain visibility and control over all aspects of manufacturing activities. PPC in itself forms a subject of study, within which simulation techniques have proven themselves to be one of the most practical methodologies available to investigate and evaluate manufacturing issues. In this review paper, we focus on state-of-the-art applications of simulation techniques in PPC to demonstrate their applicability to modern manufacturing issues. The review reports on academic publications on simulation applications in manufacturing from 2002 to 2014, incorporating surveys of peer-reviewed literature. The review covers three types of simulation techniques (system dynamics, discrete event simulation and agent-based simulation) and eight PPC issues (facility resource planning, capacity planning, job planning, process planning, scheduling, inventory management, production and process design, purchase and supply management). The literature survey is analysed on the basis of simulation applications to PPC problems, which can provide a guideline for simulation technique selection and can also help with simulation modelling in PPC problems.
Article
Enterprises are now facing, and are in the transition stage of, a new era in production structure. Owing to the intense progress of today's technology and the global business interaction situation, rigorous concepts and ideas related to manufacturing automation need to be developed to meet rapid changes in environment and needs. In order to develop a system model for the large-scale interconnected systems and general job shop structures performed in the real world, a unified methodology was established in this research that brings techniques together while relating them to standard industrial engineering (IE) techniques for the design of shop-floor manufacturing dispatchers and routers. A matrix framework based on Petri net (PN) theory and concepts was introduced into this research. In the matrix framework, one of the important parts is how to introduce time into the matrix system model: time for parts input, for operations and processing, for resources arriving, and for finished goods or products output. Therefore, different time matrices T, i.e. Tu, Tv, Tr, Ty, are introduced in this study. Those time matrices are the key factors for integrating the manufacturing systems to approach the real-time manufacturing world. Here, the key procedure for developing those time matrices is to develop and integrate the manufacturing matrix framework with the techniques of max-plus and dioid algebra. The result of this research is to introduce and establish time in the matrix manufacturing system to approach the real-world production situation.
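A minimal sketch of the max-plus (dioid) operation on which such time matrices rely may help: addition is replaced by max and multiplication by ordinary addition, so a matrix-vector product propagates earliest event times. The firing times below are hypothetical.

```python
import numpy as np

NEG_INF = -np.inf   # the max-plus "zero" element (no connection)

def maxplus_matmul(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    """Max-plus matrix product: C[i, j] = max_k (A[i, k] + B[k, j])."""
    n, m = A.shape[0], B.shape[1]
    C = np.full((n, m), NEG_INF)
    for i in range(n):
        for j in range(m):
            C[i, j] = np.max(A[i, :] + B[:, j])
    return C

# Hypothetical time matrix: rows = operations, columns = feeding resources;
# entries are processing/transfer times, NEG_INF means no connection.
T = np.array([[3.0, NEG_INF],
              [2.0, 4.0]])
x0 = np.array([[0.0], [1.0]])        # initial resource availability times

x1 = maxplus_matmul(T, x0)           # earliest start times of the next cycle
print(x1.ravel())                    # expected: [3. 5.]
```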
Article
This essay describes differences between papers that contain some theory rather than no theory. There is little agreement about what constitutes strong versus weak theory in the social sciences, but there is more consensus that references, data, variables, diagrams, and hypotheses are not theory. Despite this consensus, however, authors routinely use these five elements in lieu of theory. We explain how each of these five elements can be confused with theory and how to avoid such confusion. By making this consensus explicit, we hope to help authors avoid some of the most common and easily averted problems that lead readers to view papers as having inadequate theory. We then discuss how journals might facilitate the publication of stronger theory. We suggest that if the field is serious about producing stronger theory, journals need to reconsider their empirical requirements. We argue that journals ought to be more receptive to papers that test part rather than all of a theory and use illustrative rather than definitive data.
Article
Bottlenecks inhibit the performance of companies. Up to now, bottleneck management research has concentrated on manufacturing processes, while neglecting product design and engineering processes. This research fills this gap through the development and testing of a new bottleneck management concept for product design and engineering processes. The new concept is developed using a system theory modelling approach and comprises four bottleneck management countermeasures. Two propositions were developed to test the concept through a discrete-event simulation model. The simulation is grounded on empirical data from three design-driven companies and tests the impact of the four bottleneck management countermeasures on the performance of product design and engineering processes. The findings from the simulation confirm the applicability of the newly developed bottleneck concept to improve the performance of product design and engineering processes. In doing so, this research study expands bottleneck management for the first time from manufacturing to product design and engineering processes.
Article
When capacity differences are minimized through an efficient algorithm, and capacity planning is integrated with any production planning system, some elements of the production planning functions are affected. In the reverse direction, some elements of production planning and management techniques also affect the effectiveness of capacity planning. These effects occur because capacity planning processes, production planning processes and production management techniques are not standalone sub-systems; rather, they are totally dependent on each other. This paper aims at determining and formulating the effects of some of the selected elements of capacity and production planning functions on each other. This study is conducted using simulation in the object-oriented SIMPLE++ system.
Article
The computer simulation of manufacturing systems is commonly carried out using discrete event simulation (DES). Indeed, there appears to be a lack of applications of continuous simulation methods, particularly system dynamics (SD), despite evidence that this technique is suitable for industrial modelling. This paper investigates whether this is due to a decline in the general popularity of SD, or whether modelling of manufacturing systems represents a missed opportunity for SD. On this basis, the paper first gives a review of the concept of SD and fully describes the modelling technique. Following on, a survey of the published applications of SD in the 1990s is made by developing and using a structured classification approach. From this review, observations are made about the application of the SD method and opportunities for future research are suggested.
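For readers unfamiliar with the SD formalism being contrasted with DES, here is a minimal stock-and-flow sketch: a single inventory stock whose production inflow adjusts toward a target, integrated with Euler steps. Parameter values are illustrative only.

```python
# Minimal system-dynamics stock-and-flow model (Euler integration):
# an inventory stock whose production rate adjusts toward a target level.
# Parameter values are illustrative, not drawn from the reviewed studies.
dt = 0.25                 # time step (weeks)
horizon = 40.0            # weeks simulated
target_inventory = 100.0
adjustment_time = 4.0     # weeks to close the inventory gap
demand = 20.0             # units shipped per week

inventory = 60.0
t = 0.0
while t < horizon:
    production = demand + (target_inventory - inventory) / adjustment_time
    production = max(production, 0.0)              # no negative production
    inventory += dt * (production - demand)        # stock = integral of net flow
    t += dt

print(f"inventory after {horizon:.0f} weeks: {inventory:.1f} units")
```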
Article
In order to use the philosophy of JIT to improve the production planning method of MRP-II, we propose the experimental software system of the earliness/tardiness produc tion planning problem with due window. By means of the approaches and model reported in this paper, the optimal production planning can be achieved. The recommended model extends the problem of due window from the shop scheduling level into the aggregated planning level of mass manufacturing systems. Simulation results have demonstrated that the experimental software is a useful tool for the production management of repetitive manufacturing enterprises.
Article
Purpose – Progress in theory building in the field of collaborative networks in manufacturing is preponderantly seen in contributions from disciplines outside manufacturing science. Interdisciplinary research is one way of accelerating the development of appropriate theory for this emerging domain where industrial practice has moved beyond the state of the art of scientific knowledge for establishing workable, competitive solutions. The purpose of this paper is to examine to what extent interdisciplinary research has contributed to a better understanding of collaborative (manufacturing) networks. Design/methodology/approach – To find out more about provenances of on‐going studies, to identify clusters of contributions and to provide direction for future work of researchers in this domain, publications of the past 22 years have been evaluated. To retrieve these contributions, a structured literature review has been undertaken by applying keywords to selected databases and using a strictly defined stepwise procedure. In total, 202 publications of all kinds have been evaluated. Findings – From the analysis of the results, it appears that most interdisciplinary contributions to collaborative (manufacturing) networks rely on one original outside discipline for either developing solutions or advancing theoretical insight. Consequently, and after further analysis, it seems that researchers in collaborative networks hardly resort to multi‐disciplinary approaches, unless “natural”; further advances might arrive from stimulating these multi‐disciplinary avenues rather than sticking to more mono‐disciplinary, and less risky, takes on both applications and theoretical insight. A more detailed investigation of the value of contributions reveals that efforts to make interdisciplinary advances are either difficult or limited. Also, the findings indicate that researchers tend to follow a more “technical” approach to decision making by actors in networks rather than searching for a shift in paradigm. Originality/value – While setting out these directions for future research and guiding research, this first‐of‐its‐kind review introduces the collaboration model as a systematic approach to collaborative (manufacturing) networks. This model might serve as a reference model to integrate disciplines for addressing the characteristics of Collaborative Networks. Its use in the review led to the finding that typical traits of networks, such as changeability, supplementary assets and decentralisation of decision making, are under‐researched.
Article
Semiconductor wafer fabrication involves very complex process routing, and reducing flow times is very important. This study reports a search for better dispatch rules for achieving the goal of reducing flow times, while maintaining high machine utilization. We explored a new simulation-based dispatch rule and a queue prediction dispatch rule. Using simulation experiments and an industrial data set, we also compared several other dispatch rules commonly used in semiconductor manufacturing with our proposed dispatch rules. Among these rules, in addition to the simulation-based dispatching rule, the shortest-remaining-processing-time, earliest-due-date and least-slack rules also performed well in terms of reducing flow times. The reasons behind these good rules are discussed in this paper. Based on previous works and this study, accurately predicting and effectively utilizing future flow times can improve the quality of production management decisions.
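One of the rules reported as effective, least slack, is easy to state precisely; the sketch below dispatches the lot with the smallest slack (due date minus remaining processing time) from a hypothetical queue.

```python
from dataclasses import dataclass

@dataclass
class Lot:
    name: str
    due_date: float              # hours from now
    remaining_processing: float  # hours of work still required

def least_slack(queue: list, now: float = 0.0) -> Lot:
    """Least-slack dispatching: pick the lot whose slack
    (due date - now - remaining processing time) is smallest."""
    return min(queue, key=lambda lot: lot.due_date - now - lot.remaining_processing)

# Hypothetical queue in front of a machine.
queue = [Lot("A", due_date=30.0, remaining_processing=12.0),
         Lot("B", due_date=18.0, remaining_processing=10.0),
         Lot("C", due_date=25.0, remaining_processing=20.0)]

print("dispatch next:", least_slack(queue).name)   # C (slack 5) beats B (8) and A (18)
```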
Article
Production management and scheduling problems in flexible manufacturing systems (FMSs) are more complicated than those problems in job shops and transfer lines. Due to the flexibility of FMSs, alternative operation is common. When considering the operational problems of FMSs such as scheduling, simulation methodology seems to be useful to address these issues. Real-time scheduling schemes use simulation techniques and dispatching rules to schedule parts in real time. The advantage of this kind of scheduling with respect to analytical approaches is its capability of considering any unexpected events in the simulation model. This paper presents a fuzzy approach to real-time operation selection. This approach uses membership functions to find the share of each objective in final decision rules. Simulation methods are used to show the effectiveness of the approach.
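As a hedged illustration of how membership functions can weight competing objectives in real-time operation selection, the sketch below scores alternative operations with triangular memberships and a fuzzy AND; the membership shapes and scores are hypothetical, not the paper's.

```python
# Illustrative fuzzy weighting of scheduling objectives via triangular
# membership functions; shapes and objective values are hypothetical.
def triangular(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership: 0 at a and c, 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Candidate alternative operations with normalised objective scores:
# lower flow time and higher machine utilisation are both desirable.
alternatives = {
    "route1": {"flow_time": 0.3, "utilisation": 0.6},
    "route2": {"flow_time": 0.5, "utilisation": 0.9},
}

def desirability(scores: dict) -> float:
    # Membership of "short flow time" peaks at 0, of "high utilisation" at 1.
    short_flow = triangular(scores["flow_time"], -1.0, 0.0, 1.0)
    high_util = triangular(scores["utilisation"], 0.0, 1.0, 2.0)
    return min(short_flow, high_util)          # fuzzy AND of the objectives

best = max(alternatives, key=lambda k: desirability(alternatives[k]))
print("selected operation:", best)
```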
Article
It is shown how production management constraints can be taken into account as early as the design stage of the product. It is then shown that the indication of physical and economic performances is a means to obtain integration and to continually improve the manufacturing processes. To perform continuous and efficient evaluation of physical and economic performances, the concept of activity is introduced, now acknowledged as a basic concept of cost management systems. A modelling approach is then proposed to allow each member of the design team, in a concurrent engineering context, to perform activity based modelling and estimations with simulation. After describing the set of possible situations in which this approach can be used, an application case is presented concerning a French firm, and the advantages and disadvantages of the approach are discussed.
Article
This paper aims to explain the production flow and the distribution logic of bobbins for the rewinding process in a yarn dyeing factory, comparing different production scenarios (manual and automatic) using computer simulation tools. The goal of this project is to build a model in which all the involved processes can be simulated with consideration of all the parameters and constraints. The simulation model is used as a tool for comparing the present manual setup and a future automated setup for the production management of bobbin distribution in the yarn rewinding process in terms of delays and costs. Since the manual operation involves defects, improper time management and errors, and given the growing global competitiveness, companies in Europe need to automate their production lines as much as possible. The expected impacts are to increase productivity and profitability, to make it possible to customize production, to develop production tools and to implement lean manufacturing tools.
Article
Production management aims to maximize profit by increasing salable output while reducing the cost related to inspection, where inspection is defined as the measurement and quality assessment of items produced. This study is based on a semiconductor production line with consecutive deteriorating machines. Each machine is inspected via the items it produces and an inspection result triggers a machine's repair, if needed. Inspection related cost includes the fixed and variable cost of inspection capacity, Yield Loss Cost generated due to unsalable throughput, and delivery delay cost caused by inspection flow-time. The effects of inspection capacity and inspection rate on cost are investigated using analytical and simulation models. Under a given inspection capacity, Yield Loss Cost decreases with growing inspection rate until a minimum is reached, and then starts to increase with further growing rate. This increase is explained by the impact of higher load on the inspection facility, which prolongs the inspection response time. Thus, an optimal inspection rate can be derived for a given inspection capacity. It will be shown that the higher the capacity, the higher the optimal rate, and the lower the yield loss. Determination of optimal inspection capacity considers the capacity cost against the other costs and minimizes the total expected inspection related costs.
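A small numeric sketch can make the described trade-off concrete: yield-loss cost falls with the inspection rate while delay cost rises as the inspection facility saturates, so total cost has an interior minimum. The M/M/1 delay term and all coefficients below are simplifying assumptions, not the paper's model.

```python
import numpy as np

# Trade-off sketch: yield-loss cost falls with the inspection rate, while
# delay cost rises as the inspection facility saturates (M/M/1 waiting time).
# Functional forms and coefficients are simplifying assumptions.
capacity = 10.0                      # inspections the facility can do per hour
yield_loss_coeff = 40.0              # cost of undetected drift per hour
delay_cost_coeff = 5.0               # cost per hour of inspection flow time

rates = np.linspace(0.5, 9.5, 181)   # candidate inspection rates (per hour)
yield_loss = yield_loss_coeff / rates              # fewer gaps between checks
flow_time = 1.0 / (capacity - rates)               # M/M/1 time in system
total = yield_loss + delay_cost_coeff * flow_time

best = rates[np.argmin(total)]
print(f"cost-minimising inspection rate is about {best:.2f} per hour "
      f"(total cost {total.min():.1f})")
```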
Data
Two paradigms characterize much of the research in the Information Systems discipline: behavioral science and design science. The behavioral-science paradigm seeks to develop and verify theories that explain or predict human or organizational behavior. The design-science paradigm seeks to extend the boundaries of human and organizational capabilities by creating new and innovative artifacts. Both paradigms are foundational to the IS discipline, positioned as it is at the confluence of people, organizations, and technology. Our objective is to describe the performance of design-science research in Information Systems via a concise conceptual framework and clear guidelines for understanding, executing, and evaluating the research. In the design-science paradigm, knowledge and understanding of a problem domain and its solution are achieved in the building and application of the designed artifact. Three recent exemplars in the research literature are used to demonstrate the application of these guidelines. We conclude with an analysis of the challenges of performing high-quality design-science research in the context of the broader IS community.
Article
The complex operations and considerable process time variability of printed circuit board (PCB) fabrication create difficulties in finding effective and efficient planning techniques for today's PCB production management. A great deal of money is involved. By modeling and testing a real world PCB fabrication facility, this paper shows that computer simulation can provide a viable planning tool to estimate production capacity and to explore optimum arrangement in batch work size of key bottleneck machines to minimize product throughput time. Many simulation experiments are performed and the results analyzed as a response surface. The general characteristic of product throughput time is found to be that its minimal value exists when batch job numbers of subsequential key machines are matched in batch size or in multiples thereof. A nonlinear empirical equation to estimate product throughput time has been derived from the simulation results.
Conference Paper
Traditional industrial engineering techniques including mathematical models are not sufficient to examine sophisticated manufacturing systems such as semiconductor manufacturing. As such, simulation modeling is used extensively in the design and analysis of semiconductor manufacturing operations. This study explores the use of simulation modeling of single semiconductor toolsets. In the literature a number of modeling approaches for single toolset analysis can be identified. The purpose of this study is to review and evaluate these approaches.
Article
This paper reports the research and development of a suite of generic software programs entitled TEXSIM (TEXtile SIMulator). The software is mainly intended to create simulation models of weaving production systems without any programming; it automatically performs the simulation study and produces results to understand the stochastic behaviour of the system as well as to analyse system performance to solve real-life weaving production management problems. TEXSIM reads the input parameters from the user in an on-line session through its user interface, written in FORTRAN 77, and interactively uses WITNESS, a manufacturing simulation package containing the basic simulation model building blocks; it creates the simulation model in accordance with the user's specifications, conducts the simulation experiments and produces results. The objective is to focus on the practicality and simplicity of simulation model building of a weaving production system with a readily available suite of user-friendly programs, TEXSIM, within a few minutes, without expertise in or background knowledge of simulation techniques, computer simulation programming, or the skill of handling a commercial simulation package. It also highlights the importance of using computer simulation as a modern, powerful and flexible management analysis tool in weaving factories. Textile engineers and technologists, particularly managers who have no background in simulation, can take full advantage of simulation techniques to analyse their present complex weaving production systems, rather than using conventional analytical rule-of-thumb methods, to help management plan, design and operate their systems in an efficient manner and improve manufacturing productivity. TEXSIM also facilitates the scheduling of production within the factory through simulation.
Article
An approach for modeling the distribution of the random variable (U) that is the maximum of several dependent random variables is described in this paper. Applications of the model to various problems associated with the industrial engineering profession are described and one — the accumulation of components in small-lot assembly systems — is studied in detail. Numerical tests investigated the accuracy of the approach and identified several fundamental characteristics of the accumulation process. Apparent advantages of the model are that it requires computer run time that is a fraction of that required by a Monte Carlo sampling (simulation) procedure and that it is readily amenable to parallel processing.
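The paper's analytical approximation is not reproduced here; the sketch below shows the Monte Carlo baseline it is compared against, sampling correlated completion times and taking their maximum (the kit accumulation time). The distribution parameters are hypothetical.

```python
import numpy as np

# Monte Carlo baseline for the distribution of U = max of dependent
# random variables (e.g. correlated component completion times whose
# maximum drives assembly-kit accumulation). Parameters are hypothetical.
rng = np.random.default_rng(42)

mean = np.array([10.0, 12.0, 11.0])            # mean completion times
cov = np.array([[4.0, 2.0, 1.0],               # positive correlation between
                [2.0, 5.0, 1.5],               # component completion times
                [1.0, 1.5, 3.0]])

samples = rng.multivariate_normal(mean, cov, size=100_000)
u = samples.max(axis=1)                        # accumulation time per replication

print(f"E[U] about {u.mean():.2f},  P(U > 15) about {(u > 15).mean():.3f}")
```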
Article
Many leading OR professionals think that our primary mission should be to apply the mathematical and scientific principles of OR to real-world problems rather than to pursue theoretical research. To encourage the application of OR models, in 1976 the editors of Interfaces shifted the emphasis of the journal from meta research to applications and survey papers. I analyzed the citations and selective references over the last two decades or so of Interfaces to obtain a snapshot of where the field has been and where it is heading. I analyzed 1,294 articles, 2,190 authors, and more than 2,500 references.
Article
Questionnaires have been sent to Operations Research Society of America (ORSA) members at five-year intervals over the past 15 years (1973, 1978, 1983, 1988). The most recent set of questionnaires (1988) indicates what operations research (OR) educators and practitioners believe are the quantitative techniques needed for a proper foundation in OR. The results show some change since the first questionnaire (1973). Three quantitative techniques stand out as consistently believed to be the most important: math programming, statistics, and simulation. Other techniques vary in relative importance. Practitioners indicate the use of a more diverse set of techniques than educators.
Article
This case study develops an innovative management and scheduling system for corrective maintenance of machines in a manufacturing facility. The study also involves a comparative evaluation of the proposed and the existing systems under a spectrum of operating conditions. A comprehensive simulation is used to evaluate system performances under a variety of settings which include reliability, service level, and cost consequences. The analysis is based on a full factorial experimental design. In summary, the developed self-regulating management system which involves dynamic work allocation and pre-emption is shown to yield higher machine availability and higher mechanic utilisation even with fewer mechanics. The study also finds that the new system is more streamlined, agile, and robust although it is subject to more-constrained machine reliability and mechanic service time environments. Further, a major reduction of current manpower can still achieve at least 95% machine availability, illustrating the cost effectiveness and efficacy of the developed system. This rule-based corrective maintenance system can be operated in uncertain environments on a real-time basis without additional reformatting costs and provides a competitive measure to deal with managerial issues such as low retention rate for skilled mechanics, highly uneven training levels and pay scales. The financial consequences and gains in strategic advantage with respect to the facility's operational structure are promising after implementation. Moreover, the system developed in this case study represents a meaningful starting point for a more vigorous theoretical research on the bucket brigade system to different functions in industrial and operations management.
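As an illustration of the full factorial experimentation the study relies on, the sketch below enumerates all factor-level combinations and evaluates a stand-in response function; the factors, levels and response are hypothetical placeholders for the actual simulation.

```python
from itertools import product
from statistics import mean

# Full factorial experimental design of the kind used to compare the
# proposed and existing corrective-maintenance systems. Factor levels and
# the response function are hypothetical placeholders for the simulation.
factors = {
    "mechanics": [3, 4, 5],
    "breakdown_rate": [0.05, 0.10],        # failures per machine-hour
    "policy": ["existing", "self_regulating"],
}

def simulated_availability(mechanics, breakdown_rate, policy):
    """Stand-in for one simulation replication returning machine availability."""
    base = 0.90 + 0.01 * mechanics - 1.2 * breakdown_rate
    bonus = 0.03 if policy == "self_regulating" else 0.0
    return min(base + bonus, 0.999)

design = list(product(*factors.values()))
results = {combo: simulated_availability(*combo) for combo in design}

for combo, availability in sorted(results.items(), key=lambda kv: -kv[1])[:3]:
    print(combo, f"availability {availability:.3f}")
print(f"{len(design)} design points, mean availability {mean(results.values()):.3f}")
```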
Article
A manufacturing system consists of a structure and distributed working procedures that include operating parameters. A new approach named the unified structural–procedural approach (USPA) for designing the integrated structure and the distributed but integrated working procedures of a manufacturing system is presented in this paper. The designed structure and working procedures bring about the efficiency desired by the target market for the ordered products even when the desired efficiency is turbulent. Here, the USPA is applied to redesign a real apparel factory. The USPA includes identification of the target market requirements, conception of the target manufacturing system, and design of the system structure and working procedures. Conception of the target manufacturing system is done using pseudo-neural networks that exploit the improvements introduced by the higher-performing firms throughout the industry. Design of the system structure is done using simulation models that bring forth structural improvements whose implementation investments equate to the inefficiency costs saved because the system structure is improved. The distributed working procedures are specified using flowcharts that include integrated values of the procedural parameters. The nature of these parameters is identified from available non-integrated operations management models and their integrated values are obtained by using simulation models that evaluate their joint effect on product efficiency.
Article
In order to react to the continuous and unpredictable changes in product demand, in product variety, and in process technologies, reconfigurable manufacturing systems allow quick adjustment of production capacity and functionality by rearranging or changing their modular components. In this kind of system, operation management issues, such as exception handling policies, become more complex since correct reconfiguration strategies have to be selected. This paper explores the potential of the reconfigurability feature to be a basis for the development of new strategies to handle out-of-the-ordinary events in the production process; in particular, maintaining production flow when machine breakdowns occur. Decisions regarding how to deal with exceptions to the production process are complex and depend on the manufacturing system configuration and on many performance and economic variables. The authors propose agent-based manufacturing control for exception handling because of its ability to be very agile, as well as being reactive and efficient. Manufacturing agents, while working to pursue their specific goals, achieve the global target of the system. Complex decisions can be made due to the synergy arising from the agents' internal reasoning and the negotiation process among these agents. The adopted negotiation mechanism is based on the contract-net protocol, while different strategies have been designed for the internal reasoning. The authors demonstrate that, under certain conditions, an agent's internal strategies based on fuzzy reasoning improve the global performance of the system. The proposed control model has been tested on a discrete event simulation test-bed.
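A minimal sketch of the contract-net negotiation underlying the proposed control may be useful: a manager announces a task displaced by a breakdown, capable machine agents bid, and the best (lowest) bid wins. The bid formula stands in for the agents' fuzzy internal reasoning and is purely illustrative.

```python
from dataclasses import dataclass
from typing import Optional

# Minimal contract-net round: a manager announces a task displaced by a
# machine breakdown, capable machine agents bid, and the best bid wins.
# The bid formula is an illustrative stand-in for the agents' internal
# (in the paper, fuzzy) reasoning.

@dataclass
class MachineAgent:
    name: str
    queue_hours: float        # work already waiting at this machine
    processing_hours: float   # time this machine needs for the task
    capable: bool

    def bid(self) -> Optional[float]:
        if not self.capable:
            return None       # refuse the call for proposals
        return self.queue_hours + self.processing_hours

def contract_net(agents: list) -> str:
    bids = {}
    for agent in agents:
        proposal = agent.bid()
        if proposal is not None:
            bids[agent.name] = proposal
    return min(bids, key=bids.get)            # award to the cheapest proposal

agents = [MachineAgent("M1", queue_hours=6.0, processing_hours=2.0, capable=True),
          MachineAgent("M2", queue_hours=1.0, processing_hours=3.5, capable=True),
          MachineAgent("M3", queue_hours=0.0, processing_hours=2.0, capable=False)]

print("task awarded to:", contract_net(agents))   # expected: M2
```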
Article
The aim of this article is to define a new simulation game in operations management called Logistic Game™. The main objectives are to introduce a new simulation game approach in solving the different correlated subsystems based on ‘visual interactive learning’ and to verify its positive effects on the learning process with respect to the usual simulation games. The game is based on an inside plant virtual supply chain simulation and copes with the educational challenges of teaching Industrial Logistics in a new, effective way. By applying a visual interactive simulation package, the game creates a virtual dynamic scenario directly visible by participants, with an improvement of experimentation and conceptualisation phases, and offers several logistic decisions and their strategic links from a holistic point of view. The challenge goes beyond a pure theoretical setting and students learn strategies and gain experience directly by operating in a virtual supply chain and sharing knowledge. The Logistic Game has been used to train more than 300 students since December 2006 in three different Italian workshops and has been designed to encourage the employment selection process by the companies involved.
Article
Since 1984, the microelectronics industry in the United States which enjoyed phenomenal demand and growth complemented by technical excellence and innovative engineering has had major financial problems. One contributing factor could be that little emphasis has been placed on the aspects of manufacturing cost control and scientific decision making. In 1985, the microelectronics industry was forced into a critical shift from a predominantly pioneering stage with healthy research and development budgets, into a cost-competitive one. Manufacturers realized that they had to pay attention to cost control and productivity improvement. Among other things, this necessitated the development and use of structured scientific decision making methods in support of all facets of plant operations and management. A simulation model was designed and implemented with the purpose of improving the scheduling, control, and production management of the manufacturing line for a medium size silicon wafer manufacturer. The simulation model was designed to be interactive on a microcomputer, easily adaptable to other similar manufacturing environments, and readily usable at various decision levels ranging from manufacturing floor supervisors to upper management. It can easily interface with other decision analysis packages to assist in a variety of decision processes throughout the plant.
Article
This paper reviews and classifies literature on the use of discrete event simulation for manufacturing system design and operation problems. Simulation has been a widely used tool for manufacturing system design and analysis for more than 30 years. During this period, simulation has proven to be an extremely useful analysis tool, and many hundreds of articles, papers, books, and conferences have focused directly on the topic. This paper presents a classification of a subset of these publications and the research and applications that underlie these publications.
Article
Purpose The purpose of this paper is to identify the contemporary research themes published in IJOPM in order to contribute to current debates about the future directions of operations management (OM) research. Design/methodology/approach All 310 articles published in IJOPM from volume 24 issue 9 in 2004 through volume 29, issue 12 in 2009 are analysed using content analysis methods. This period of analysis is chosen because it represents all the articles published in issues for which the authors are able to have full control, during their period of tenure as Editors of the journal. This analysis is supplemented by data on all 1,853 manuscripts submitted to the journal during the same time period and further, by analysis of reviews and feedback sent to all authors after review. Findings The paper reports the main research themes and research methods inherent in the 310 published papers. Statistics on the countries represented by these papers and the size and international composition of author teams are provided, together with the publication success rates of the countries that submit in the highest volumes, and the success rates associated with the size of the author team. Finally, data on the reasons for rejection of manuscripts are presented. Research limitations/implications There is some residual inaccuracy in content analysis methods, whereby, in extracting research themes there is often more than one topic covered. In the same vein, as regards categorisation of the causes of rejection of manuscripts during the review process, there is frequently more than one reason for rejection, so perhaps a weighted scoring system would have been more insightful. In determining the country of origin of papers, while the country of the corresponding author is used, it should be recognised that some studies originate from international collaborations so that this method may give a slightly distorted picture. Finally, in computing publication success rates by comparison of submissions and published papers there is a time delay between the two data sets within any defined period of analysis. Practical implications The analysis adds generally to debates about contemporary research themes; in particular it extends the work of Pilkington and Fitzgerald, which analyses all articles solely in IJOPM between 1994 and 2003. In addition, the findings suggest a need for more frequent exploitation of multiple research methods, for greater rigour in the planning and execution of fieldwork, for greater engagement with the world of OM practice and finally, consideration of how OM research can address wider social and political issues. Originality/value This paper represents an inside view of the publication process from a leading OM journal; this kind of insight is rarely available in the public domain.
Article
The continuing debate on production and operations management (POM) research has led to a new emphasis on empirical methods. Claims that, while surveys and case research are increasingly recommended to POM researchers, action research has been relatively neglected. The distinct characteristic of action research is the intervention by the researcher in the situation under study. The nature of the intervention, and of action research outputs, differs however from consulting or from the applications reported by APICS. Explains these differences and offers a simple model of action research. Action research is particularly valuable for theory building, as has been seen in the fields of organization behaviour (OB) and management information systems (MIS), where qualitative methods have often been employed rather than traditional scientific methods. POM researchers can learn from the experience of other disciplines and use action research to create new theory. Since many POM researchers will be unfamiliar with action research, explores some practical aspects of conducting such investigations with illustrations from the author's own research experience. Concludes by showing that a properly conducted action research project can be as rigorous as other methods.