Conference Paper

Taylor II manufacturing simulation software

F&H Simulations Inc., Orem, UT, USA
DOI: 10.1109/WSC.1994.717245
Conference: Winter Simulation Conference Proceedings, 1994
Source: IEEE Xplore

ABSTRACT Taylor II is a menu-driven simulation package used mainly in manufacturing and logistics. It was developed for the analysis and quantitative evaluation of complex processes, especially those with a dynamic character. Many applications across different industries show an increasing need for simulation tools. The paper demonstrates the process of building, analyzing, and presenting models of real-world systems with Taylor II.

  • ABSTRACT: This paper describes teaching an introductory simulation course using an interactive CD-ROM titled "Simply Simulation". This method uses several multimedia tools and a hypertext-based web format; the simulation literature currently shows no studies on this proposed teaching method. Course structure, requirements, and benefits of Simply Simulation are described in this paper. Simply Simulation gives detailed explanations of simulation concepts and easy-to-follow instructions in five modules. The student uses Taylor II process simulation software to model and analyze progressively more complex real-life situations. Competencies gained are measured via a pretest at the beginning of each module and a quiz at the end. This paper and Simply Simulation contribute to the simulation education literature by exemplifying how to enhance learning effectiveness by using various information technologies and teaching methods.
  • ABSTRACT: Simulations are designed to emulate a system or process within a specified set of assumptions. Such simulations can then be used as experimental platforms for exploring a system's or process's behavior under a variety of circumstances. Experiments are conducted by systematically varying the inputs to the simulation model, collecting the model outputs, analyzing the resulting data, and using the insights gained to formulate new experiments and/or to answer questions about the expected behavior of the system or process under study. As models become increasingly complex, exploration of their behavior must be systematic and focused in order to learn the most from the fewest runs. Careful planning and experimental design are needed to use large, complex models efficiently and effectively to answer key questions. The Experimental Design and Analysis Simulation Interface supports this process for large, complex, deterministic models.
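The experimental-design cycle described in the abstract above (systematically vary an input factor, collect the output, analyze the results) can be illustrated with a toy simulation. A minimal sketch, assuming a single-server queue as the model under study; `simulate`, the queue structure, and the chosen rates are hypothetical stand-ins for illustration, not the Experimental Design and Analysis Simulation Interface itself:

```python
import random
import statistics

def simulate(service_rate, arrival_rate, n_customers=500, seed=0):
    """Toy single-server queue: customers arrive with exponential
    interarrival times and receive exponential service times.
    Returns the mean waiting time (a hypothetical model output)."""
    rng = random.Random(seed)
    clock = 0.0            # arrival time of the current customer
    server_free_at = 0.0   # when the server next becomes idle
    waits = []
    for _ in range(n_customers):
        clock += rng.expovariate(arrival_rate)   # next arrival
        start = max(clock, server_free_at)       # wait if server busy
        waits.append(start - clock)
        server_free_at = start + rng.expovariate(service_rate)
    return statistics.mean(waits)

# Systematically vary one input factor (the service rate) and
# collect the corresponding model outputs for analysis.
results = {rate: simulate(service_rate=rate, arrival_rate=0.8)
           for rate in (1.0, 1.5, 2.0)}
```

With a fixed seed the runs are reproducible, so the collected outputs can be compared directly across factor levels, e.g. confirming that a faster server reduces mean waiting time.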
  • ABSTRACT: Discrete-event simulation is one of the most popular modelling techniques. It has developed significantly since the inception of computer simulation in the 1950s, largely in line with developments in computing. The progress of simulation from its early days is charted, with a particular focus on recent history. Specific developments in the past 15 years include visual interactive modelling, simulation optimization, virtual reality, integration with other software, simulation in the service sector, distributed simulation, and the use of the worldwide web. The future is then speculated upon. Potential changes in model development, model use, the domain of application for simulation, and integration with other simulation approaches are all discussed. The desirability of continuing to follow developments in computing, without significant developments in the wider methodology of simulation, is questioned. Journal of the Operational Research Society (2005) 56, 619–629. doi:10.1057/palgrave.jors.2601864. Published online 22 September 2004.
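The abstract above surveys discrete-event simulation, whose core mechanism is an event queue processed in time order. A minimal sketch of that loop follows; `run_des`, `machine`, and the event-tuple layout are hypothetical names for illustration, not part of Taylor II or any package mentioned here:

```python
import heapq

def run_des(initial_events, horizon):
    """Minimal discrete-event loop: repeatedly pop the earliest event,
    call its handler, and let the handler schedule follow-up events.
    Stops once the next event lies beyond the time horizon."""
    queue = list(initial_events)   # entries: (time, seq, handler)
    heapq.heapify(queue)
    seq = len(queue)               # tie-breaker so handlers are never compared
    times = []
    while queue:
        now, _, handler = heapq.heappop(queue)
        if now > horizon:
            break
        times.append(now)
        for delay, next_handler in handler(now):
            seq += 1
            heapq.heappush(queue, (now + delay, seq, next_handler))
    return times

# A machine that completes a job every 2.0 time units:
def machine(now):
    return [(2.0, machine)]        # schedule the next completion

event_times = run_des([(0.0, 0, machine)], horizon=7.0)
# → [0.0, 2.0, 4.0, 6.0]
```

The heap keeps events ordered by time, so the simulated clock only ever jumps forward from one event to the next, which is what distinguishes discrete-event simulation from fixed-time-step approaches.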

