Analysis of the Influence of Standard Time Variability on the Reliability of the Simulation of Assembly Operations in Manufacturing Systems

Rafaela Heloisa Carvalho Machado, André Luis Helleno, Maria Célia de Oliveira, Mário Sérgio Corrêa dos Santos, and Renan Meireles da Costa Dias, Universidade Metodista de Piracicaba, Santa Bárbara d´Oeste, São Paulo, Brazil

HUMAN FACTORS, Vol. 61, No. 4, June 2019, pp. 627–641
DOI: 10.1177/0018720819829596
Copyright © 2019, Human Factors and Ergonomics Society.

Address correspondence to Rafaela Heloisa Carvalho Machado, Angelina Bernardes, n°90, Santo Antônio, Minas Gerais, Brazil; e-mail: rafaela.h.machado@gmail.com.
Objective: The aim of this article is to analyze the influ-
ence of the variability of the standard time in the simulation of
the assembly operations of manufacturing systems.
Background: Discrete event simulation (DES) has been
used to provide efficient analysis during the design of a pro-
cess or scenario. However, the modeling activities of new
configurations face the problem of data availability and reliabil-
ity when it comes to seeking standard times that are effective
in representing the actual process under analysis, especially
when the process cannot be monitored.
Method: The methods-time measurement (MTM) is used
as a source of standard times for simulation. Assembly activi-
ties were performed at a Learning Factory facility, which pro-
vided the necessary structure for simulating real production
processes. Simulation performances using different variability
of standard times were analyzed to define the impact of data
characteristics.
Results: The MTM standard time presented an error
of approximately 5%. The definition of the data variability of
standard times and the statistical distribution impacts were
shown in the simulation results, with errors above 6% being
observed, interfering with the model reliability.
Conclusion: Based on the study, to increase the adher-
ence of a simulation to represent a real process, it is rec-
ommended to use triangular distributions with central values
greater than those established via the MTM for the represen-
tation of the standard times of new assembly processes or
scenarios using DES.
Application: The study contributions can be applied in
assembly line design, providing a reliable model representing
real processes and scenarios.
Keywords: discrete event simulation, methods-time mea-
surement, work measurement, assembly workstation, opera-
tions research, industrial environment, virtual environment
INTRODUCTION
Globalization and increased competitive-
ness have resulted in changes in manufacturing
systems. Companies have sought to use new
methods aimed at reducing time and costs in
the product and process development stages. In
this context, discrete event simulation (DES)
has been widely used for the development of
new manufacturing systems or in the analysis of
scenarios for process improvement and decision
making (Dode, Greig, Zolfaghari, & Neumann,
2016).
In addition to its application in process devel-
opment, simulation has been used in capacity
analysis (Argoneto & Renna, 2013), the identifi-
cation and reduction of bottlenecks in the pro-
duction flow (Atieh, Kaylani, Almuhtady, & Al-
Tamimi, 2016; Yang, Bukkapatnam, & Barajas,
2013), increasing operational efficiency (Zeng,
Wong, & Leung, 2012), and resource manage-
ment (Lu, Petersen, & Storch, 2007). This wide
use demonstrates the diversity of analysis pro-
vided through DES that facilitates process anal-
ysis, as changing real systems can be expensive
in terms of cost and time (Budgaga et al., 2016).
Considering the manufacturing case, the use
of simulation models is even more recom-
mended because analyzing the real systems
directly involves considerable time and cost,
as these systems can be extremely complex due
to factors such as manufacturing different items
in the same line and batch processing (Fowler & Rose,
2004). Due to these characteristics, DES has
been applied successfully in assembly opera-
tions in the manufacturing area because the flex-
ibility of this type of simulation allows for the
reproduction of production system complexities
(Iannone, Miranda, Prisco, Riemma, & Sarno,
2016; Negahban & Smith, 2014).
DES uses a software platform to represent the
system under analysis. The simulation has a sto-
chastic nature, generating randomness through
the use of statistical distributions to represent
process events as the time to execute an activity
or to change equipment (Tako & Robinson,
2012). These statistical distributions are defined
by statistical tests to decide which configuration
is appropriate for the data representation. As the
software runs the process, the data will vary
according to the defined distribution, in this way
providing the mechanism to represent the natu-
ral process variability and, as a consequence, the
variability of the system as a whole. Therefore,
natural variability (i.e., the number of items
manufactured by a production line) can be repro-
duced to enable a better analysis of the system
reality.
However, despite its application-related
advantages, the effectiveness of a simulation
study is conditioned to the reliability of the
input data of the virtual system, which directly
impacts the efficacy of the model defined to
simulate the real system under analysis. In the
case of the development of new processes or
new scenarios, the difficulty associated with
the availability of information on the process
should be taken into consideration, as process
observation and timing are not possible, given
that the process will be implemented only
after the virtual analysis. In
assembly operations, in which scenario analy-
sis is the most used optimization method (Praj-
apat & Tiwari, 2017), the availability of reli-
able data for the simulation of production alter-
natives becomes a critical factor.
Simulations have been successfully used to
study and analyze systems that human work has
supported (Wickens, Sebok, Li, Sarter, & Gacy,
2015; Steiner, Burgess-Limerick, & Porter,
2014), but the need to increase the reliability of
operator representation in the simulation of
assembly operations is still an important issue as
highlighted by Dode et al. (2016). In that study,
the goal was to insert human factors, such as
fatigue and the operator’s learning curve, into
manufacturing system modeling. Małachowski
and Korytkowski (2016) also presented alterna-
tives to incorporating the operator’s learning
curve into DES. However, the authors did not
address the reliability of the data sources used to
represent the operator, so this remains a relevant
question to be analyzed in cases where the actual
values of the processes are unavailable. Thus,
the need to provide reliable standard times that
are effective in representing the actual processes
under analysis, specifically with regard to their
values and variability, is highlighted. Given this
issue, the aim of this article is to analyze the
influence of the variability of standard times in
the simulation of the assembly operations of
manufacturing systems.
To this end, the methods-time measurement
(MTM) will be used, defined as a system of
standard times established according to statisti-
cal bases (Cakmakci & Karasu, 2007). Cur-
rently, the MTM has been used as a data source
for virtual systems, as in Kuo and Wang (2009,
2012), because it is considered an intuitive and
reliable database for supporting the analysis of
human movements, serving in this research as a
source of standard times in the evaluated pro-
cesses. The MTM contributes to facilitating
human-computer interaction (HCI) with regard
to the simulation software because its language
is simple and intuitive, allowing the analyst to
understand the process and its activities, so that
it is possible to define the simulated model more
closely to reality, thus improving the modeling
process and the data entry in the software.
The study was carried out in the Learning
Factory that Helleno et al. (2013) developed for
the integral teaching of lean manufacturing con-
cepts in the university and industry environ-
ments. A Learning Factory offers an approach to
representing manufacturing works at an opera-
tional level (Barton & Delbridge, 2001), provid-
ing a facility dedicated to simulating real pro-
duction processes and environments according
to academic and educational purposes (Tisch,
Hertle, Abele, Metternich, & Tenberg, 2015).
According to Dinkelmann, Siegert, and Bauern-
hansl (2014) and Tisch et al. (2015), besides
educational objectives, a Learning Factory may
also be used for research purposes. This struc-
ture was chosen because it provided the indus-
trial environment necessary for applying the
study proposal and enabled the management of
the process variables without disrupting the
production.
LITERATURE REVIEW
The models defined through the use of DES
are intended to understand how a system works
over time and how it behaves under different
conditions. DES models a system defined by
entities, such as the parts in a factory.
These entities have attributes to determine how
they will act in the system (Tako & Robinson,
2012), defining which equipment will be used in
the manufacturing process and in which order.
By means of this mechanism, a queue of events
is created, and as they occur, the state of the sys-
tem is affected (Alrabghi & Tiwari, 2016), with
parts being produced, setups, and maintenance
occurring according to the simulation modeling.
The sequence and variation of these events
are defined by the data input into the system.
Hence, the data modeling process is an essential
step of the simulation (Oliveira, Lima, & Mon-
tevechi, 2016). DES can represent the observed
variations in real processes through the use of
probabilistic distributions (Prajapat & Tiwari,
2017). However, the exact data for determining
the statistical behavior of the variables are not
always available, with part of these data being
related to the standard times of the execution of
activities.
The use of standard time is often necessary
due to the use of operators in production lines,
which provides flexibility in the processes to
absorb the changes that consumers demand
without the need for major operational changes.
The use of labor in manufacturing systems has
not yet been surpassed by the available technol-
ogy systems, due to the high cost of such systems
and the loss of flexibility that their implementa-
tion entails (Małachowski & Korytkowski, 2016).
Stopwatch techniques and predetermined
time systems (PTSs) consist of methods to
define the standard time. Stopwatch methods
provide the process variability; however, they
require the systematic observation of the activi-
ties to obtain the set of values necessary to
determine the operating time. Therefore, these
methods represent an inappropriate approach in
cases of new processes and new scenarios in
which the observation may be impossible
because the simulation is going to represent vir-
tually a configuration that does not exist and,
for this reason, cannot be observed. Moreover,
according to Nakayama, Nakayama, and
Nakayama (2002), stopwatch techniques may
introduce subjective impressions into the
parameter definitions that the analyst performs.
PTSs define movements such as grasp and
move, assigning codes and estimated time values
for their execution. When analyzing a process, it
is possible to identify the standard movements
necessary for its realization and, through the
sum of the values assigned to these movements,
to obtain the value of the total time required to
perform an activity. To determine the sequence
of movements, it is necessary to observe the pro-
cess movements; however, because the PTSs
use tabulated movements and values, systematic
and multiple observations of the process are not
necessary for the determination of the execution
time. When the process does not exist (new pro-
cess and scenarios), the sequence of movements
can be determined by the definition of the pro-
cess activities and, when viable, the physical
simulation of movements to understand the
sequence. In that case, a simple and unique sim-
ulation is sufficient for analysis but different
from the stopwatch technique that requires mul-
tiple and detailed reproductions of activities to
determine the process time with statistical reli-
ability. The PTS bases provide the time with sta-
tistical reliability and eliminate the necessity of
systematic process observation and evaluation
of the operator activities and also still allow the
working time to be estimated in the planning
phase of a process (Cakmakci & Karasu, 2007).
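To make the mechanics of a predetermined time system concrete, the sketch below sums a short sequence of motion elements and converts the total from time measurement units (TMU) to seconds, using the MTM convention of 1 TMU = 0.036 s; the element labels and TMU values are hypothetical placeholders for illustration, not entries from the actual MTM-1 tables.

```python
# Minimal sketch of how a PTS builds a standard time: each motion element has a
# tabulated time value (in TMU for the MTM), and the activity time is their sum.
# The element labels and TMU values below are hypothetical, not MTM-1 entries.

TMU_TO_SECONDS = 0.036  # 1 TMU = 0.00001 h = 0.036 s

motion_sequence = [      # (motion element, TMU value) for one assembly step
    ("reach", 12.8),
    ("grasp", 2.0),
    ("move", 15.1),
    ("position", 19.7),
    ("release", 2.0),
]

total_tmu = sum(tmu for _, tmu in motion_sequence)
print(f"{total_tmu:.1f} TMU = {total_tmu * TMU_TO_SECONDS:.2f} s")
```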
Among the PTSs, the MTM stands out as one
of the most used time systems (Alrabghi &
Tiwari, 2016) and is therefore applied as an
international standard for work performance.
Another PTS called the work factor (WF) is also
applied in industry, but its methods assume a
more skilled operator than the MTM does, so the
values determined via the WF are on average
20% smaller than those established via the MTM
(Lehto & Buck, 2007).
The objectives of the MTM application are
linked to the standardization and optimization of
work efficiency (Kuo & Wang, 2009). Besides
that, the times that the method defines are used
to determine the work rates in the industry (Di
Gironimo, Di Martino, Lanzotti, Marzano, &
Russo, 2012), thus providing a standard of com-
parison for the performance of a process. By
using the MTM analysis, proposed changes in
the processes can be immediately quantified
concerning the operating time (Kuhlang, Edt-
mayr, & Sihn, 2013) through the analysis of the
sequence of movements, providing the time val-
ues necessary for the simulation.
The MTM has been applied to the analysis of
assembly line movements as in the studies of
Kothiyal and Kayis (1995), Tseng and Tang
(2006), Kernbaum, Franke, and Seliger (2009),
Kuo and Wang (2009), Desai and Mital (2010a),
and Baraldi and Kaminski (2011). In addition,
the method has been applied in the analysis of
maintenance procedure movements, such as in
Desai and Mital (2010b), Desai and Mital
(2011), and Di Gironimo et al. (2012).
The accuracy of the MTM was assessed in
the studies of Knott and Sury (1986) and Kothi-
yal and Kayis (1995). Knott and Sury compared
the tabulated times with the real times collected
in the study, finding positive and negative differ-
ences among the cases verified. In the study of
Kothiyal and Kayis, the authors reported that the
MTM records the times of the activities in an
underestimated way, defining less time than nec-
essary for the accomplishment of the tasks. The
times observed in the studies of Kuo and Wang
(2009) and Di Gironimo et al. (2012) were also
higher than those determined via the MTM.
In current studies, such as Desai and Mital
(2011), Kern and Refflinghaus (2013), and
Bedny and Harris (2013), the MTM has been
applied with deterministic values for the
execution times of activities, without consider-
ing the variability inherent in human activities.
However, for the DES application, the system
must take into account the variability of the pro-
cesses to obtain greater reliability regarding the
reality. Therefore, it is necessary to consider
variability in the standard time values used in
the modeling.
Robinson (2004) stated that when the proba-
bilistic curve of an event is unknown, uniform
and triangular distributions can be used to repre-
sent the data. According to Banks and Chwif
(2011), the triangular distribution is usually
applied to simulate the process time at the begin-
ning of a simulation project, in the absence of
exact data.
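As an illustration of these two fallback choices, the sketch below draws activity times from a uniform and a triangular approximation built around a point estimate; the 60 s estimate and the ±20% range are arbitrary values chosen only for the example.

```python
# Sampling an activity time when its true distribution is unknown, using the
# uniform and triangular approximations discussed above. The estimate and the
# +/-20% spread are arbitrary illustrative values.
import random
import statistics

estimate = 60.0                      # point estimate of the activity time (s)
low, high = 0.8 * estimate, 1.2 * estimate

uniform_draws = [random.uniform(low, high) for _ in range(10_000)]
# random.triangular takes (low, high, mode); mass concentrates near the mode.
triangular_draws = [random.triangular(low, high, estimate) for _ in range(10_000)]

for name, draws in (("uniform", uniform_draws), ("triangular", triangular_draws)):
    print(name, round(statistics.mean(draws), 1), round(statistics.stdev(draws), 1))
```

With the same range, the triangular form concentrates probability around the mode and therefore produces a visibly smaller spread than the uniform form.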
However, these distributions are considered
approximations regarding the actual distribution
of operating times, which, according to Murrell
(1961) and Dudley (1963), should be determined
with a normal distribution in which, for experi-
enced operators, the highest occurrence fre-
quency is concentrated in values positioned to
the left of the mean. However, according to
Banks and Chwif (2011), manual service times
generally follow the lognormal distribution.
Despite this application, the use of the triangular
distribution may result in errors when represent-
ing a process whose real distribution would be
lognormal, as shown in Figure 1.
However, errors resulting from the approxi-
mations made via the substitution of real values
for the uniform and triangular approximation
curves may not affect the simulation results.
According to Banks and Chwif (2011), the
incorrect definition of the distribution curve of a
variable may not result in significant differences
in the simulated model, as the choice among a
lognormal, uniform, or triangular is possibly
indifferent in the simulation results.
RESEARCH METHODS
Participants
The systematic execution of activities was
conducted with production engineering students
representing the operators’ performance. Of 37
students who collaborated, 73% were male,
100% of the sample was between 19 and 22
years, and none had experience working with
assembly-line activities.
Figure 1. Operation represented as a lognormal and
simulated as triangular (Banks & Chwif, 2011).
This research complied with the American
Psychological Association Code of Ethics and
was approved by the institutional review board at
Universidade Metodista de Piracicaba. Informed
consent was obtained from each participant.
Procedures
The approach defined for the study was
divided into five stages as shown in Figure
2. Stages I, II, and III were performed in the
assembly line structured in the Learning Fac-
tory. Plant simulation software was adopted in
this study as the virtual modeling environment
used in Stage V. The plant simulation system,
which uses a discrete event-based framework
for dynamic process modeling, has been widely
applied in the production environment as pre-
sented in Owida, Byrne, Heavey, Blake, and
El-Kilany (2016).
The goal of Stage I was to define the assem-
bly line configuration, because the structure
used in the work offers flexibility for determin-
ing the division and order of activities to reach
the study goals. For the correct implementation
of MTM-1, according to Di Gironimo et al.
(2012), the process must have a short cycle time
(less than 3 min). Thus, the distribution of
activities between operators must guarantee that
each operator works for less than 3 min on each
part being processed. In addition to this
requirement, the process must be defined with
consideration of the resource constraints.

Figure 2. Research methods.
The division of activities into movements,
performed via the MTM, should be applied after
the analysis of the entire structure of activities,
including the logical organization of cognitive
and motor actions (Bedny, Karwowski, & Vos-
koboynikov, 2015). Therefore, during Stage I,
the best sequence of task execution was ana-
lyzed and defined before the application of the
MTM took place.
Stage II subsequently sought to obtain the
probability distribution of the assembly-line
data for its comparison with the MTM values.
To achieve this goal, two scenarios were defined:
operator in the learning period (Scenario 1) and
trained operator (Scenario 2). Statistical proce-
dures were executed to determine the number of
participants to perform Scenarios 1 and 2, giving
confidence to the result.
The probability distribution of each scenario
was determined for every operator following the
statistical tests usually applied in simulation stud-
ies. Stage III refers to the application of the MTM’s
standard movements in the defined sequence, as
well as the definition of the standard time.
The comparison between the MTM values and
the times obtained from the operators was devel-
oped in Stage IV, together with the comparison
between the values and distributions obtained in
each scenario.
introduced in the MTM values were also defined
at this stage of the research, taking into account
the analysis of the distributions observed in the
process.
The DES model was developed in Stage V to
represent the assembly line and, based on the
human times verified in the scenarios, validate
the proposed distributions for the MTM. During
the modeling process, the conceptual model was
developed according to the instructions that
Robinson (2015) provided, and the verification
and validation process were carried out follow-
ing the guidelines of Sargent (2013). The results
of the simulations regarding the different pro-
posals for the distributions of the MTM and the
values that the method originally defined were
compared to define the appropriate approach for
the use of the MTM in DES.
RESULTS AND DISCUSSIONS
In Stage I, the process structure was identi-
fied via the assembly of a pilot product. During
this pilot, the activities were timed to base the
division of tasks between the workstations. The
process configuration was defined with three
workstations as shown in Figure 3, resulting in
a simplified assembly line. Workstations 1 and 2
were analyzed concerning the process time and
MTM values. The third workstation received
outputs from workstations 1 and 2.
In Stage II, 31 operators were selected to carry
out the activities so as to allow a comparison of
the results. The systematic execution of the pro-
cess was carried out in 2 days, with 1 day for each
workstation to avoid the interference of possible
fatigue at the time of the second workstation exe-
cution if it was performed the same day.
On the first day, a training exercise was exe-
cuted, showing the operators the basic assembly
sequence to be followed. After that, operators
performed the process for the first time to under-
stand the assembly requirements. The execution
of the process followed, collecting the data for
Scenario 1: learning with 150 executions of
Workstations 1 and 2. In this period, the authors
observed that the operators’ skills with and
knowledge of the assembly sequence appeared
to be under development. The next day, the pro-
cess execution was carried out to collect data for
Scenario 2: trained with 150 executions of
Workstations 1 and 2.
The results of this initial experiment were
analyzed to determine the number of partici-
pants necessary to represent the operators’ per-
formance according to Equation 1:

n = (Z_{α/2} · σ / ϵ)²    (1)

where n is the sample size, Z_{α/2} is the critical
value for the confidence level determined by α,
σ is the standard deviation of an initial sample,
and ϵ is the acceptable error.

Figure 3. Assembly line configuration.
The initial standard deviation was determined
from the data of the 31 operators for the two worksta-
tions. The deviation was defined as 7.68 s. With
a 95% confidence interval and 2.5 s of accept-
able error, the sample size was defined as 37 par-
ticipants. To complete the data, the same sys-
tematic execution of the process was conducted
with six more participants.
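As a check, the reported values can be substituted directly into Equation 1; the short calculation below reproduces the sample size.

```python
# Worked application of Equation 1 with the values reported above:
# Z = 1.96 for 95% confidence, sigma = 7.68 s, epsilon = 2.5 s.
import math

z_alpha_2 = 1.96   # critical value for a 95% confidence level
sigma = 7.68       # standard deviation of the initial sample (s)
epsilon = 2.5      # acceptable error (s)

n = (z_alpha_2 * sigma / epsilon) ** 2
print(math.ceil(n))  # 37, matching the sample size defined in the study
```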
The results of each execution were analyzed,
and outliers were identified by examining box
plots. In the box plots, the first and third quartiles
were determined around the median, their
difference defined the interquartile range, and
values plotted beyond the resulting whisker
limits were treated as outliers. In the collected
data, an outlier can originate from a measuring
error, an anomaly in the process, the loss of a
part during the assembly, or some outside
interruption.
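A minimal sketch of this screening step is given below; it uses the conventional 1.5 × IQR whisker rule and illustrative assembly times, since the raw operator data are not reproduced here.

```python
# Box-plot style outlier screening with the conventional 1.5 x IQR whisker rule.
# The sample times are illustrative values, not the study data.
import numpy as np

times = np.array([76.1, 74.8, 78.3, 75.2, 77.0, 112.4, 76.6, 75.9])  # seconds

q1, q3 = np.percentile(times, [25, 75])
iqr = q3 - q1
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr

outliers = times[(times < lower) | (times > upper)]
kept = times[(times >= lower) & (times <= upper)]
print("outliers:", outliers)  # e.g., a part lost during assembly -> 112.4 s
print("kept:", kept)
```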
Anderson–Darling and Kolmogorov–Smirnov
statistical tests were performed. These tests ver-
ify whether a family of distributions can represent
a sample of data by measuring the difference
between each candidate distribution and the
empirical distribution of the sample. These tests can show,
for example, whether the normal distribution
can represent a sample of data. As a result, the
tests show which statistical curves could repre-
sent the human data, together with the respective
parameters for this representation. For the defi-
nition of the most appropriate curves between
the ones considered appropriate by the tests, the
simulation analysts apply graphical compari-
sons between the observed values and the values
of each of the possible representative curves.
Therefore, the chosen curve is defined by a com-
bination of statistical tests and visual selection.
The distributions defined for each of the pro-
posed scenarios are presented in Table 1.
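A minimal sketch of this fit-and-test step is shown below using SciPy; the sample is synthetic, generated only to make the example runnable, whereas in the study the candidate curves were fitted to the measured operator times.

```python
# Fit a lognormal distribution to a sample of assembly times and check its
# adherence with a Kolmogorov-Smirnov test. The data here are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
times = rng.lognormal(mean=np.log(76.0), sigma=0.05, size=150)  # synthetic (s)

# Fit the lognormal (location fixed at 0), then test the fitted curve.
shape, loc, scale = stats.lognorm.fit(times, floc=0)
statistic, p_value = stats.kstest(times, "lognorm", args=(shape, loc, scale))

# A large p-value means the lognormal cannot be rejected as a representation;
# several candidate families are compared this way and then checked graphically.
print(f"K-S statistic = {statistic:.3f}, p = {p_value:.3f}")
```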
According to the results, most cases could be
represented by the lognormal, with the excep-
tion of 18.9% of operators in the learning
scenario and 13.5% of the cases in the trained
scenario. Banks and Chwif’s (2011) assertion
that manual service times can usually be repre-
sented by a lognormal distribution could be
applied to the study conditions because it is the
chosen distribution for most cases. Analyzing
Table 1, it was found that lognormal was the dis-
tribution considered appropriate for most sce-
narios, especially the trained scenario, which
was considered to be the closest to a stable
industrial process. The statements that Murrell
(1961), Dudley (1963), and Knott and Sury
(1986) made in relation to the normality of the
manual process were not verified, because the
normal curve was defined as the most adequate
for the representation of the processes in only
8.1% of the cases in the learning scenario.
However, according to the statistical tests,
more than one curve was usually considered
acceptable for the representation of a given group
of data. Although it was not defined for the rep-
resentation of the values in the simulation, the
normal curve was considered adherent to the data
in 89% of the cases in the learning scenario and
81.1% of the cases in the trained scenario. How-
ever, according to Law (2008), the normal distri-
bution is rarely the correct choice for the repre-
sentation of manufacturing processes.
TABLE 1: Statistical Distributions Definition
Scenario      Statistical Distribution   Percentage of Use (Among the 37 Operators)
1: Learning   Lognormal                  81.1%
              Weibull                    10.8%
              Normal                     8.1%
2: Trained    Lognormal                  86.5%
              Erlang                     8.1%
              Weibull                    5.4%
All operators’ times to assemble one part are
given in Figure 4 as the mean, minimum value,
maximum value, and deviation in each situation.
In analyzing Figure 4, it can be seen that all
the statistics present a decrease in values
between the learning and trained scenarios. The
decrease of the deviation value between the sce-
narios may indicate the stability of the process
with less process time difference. These results
might be a consequence of the learning experi-
ence, because the operators became better
trained during the assembly sequences. Accord-
ing to Małachowski and Korytkowski (2016),
repetition of activities leads an operator to
become more familiar with parts and tools, thus
resulting in less process time.
During the experiments, the authors observed
differences in values between operators’ perfor-
mances, showing that personal characteristics
can influence process time. This confirmed the
researchers’ perception that an operator’s char-
acteristics might influence processing time and,
consequently, the accuracy of the MTM values.
In Stage III, the MTM analysis was done
based on the sequence of activities defined for
each workstation. Through the identification of
the sequence of performed movements, the
MTM values were defined as 73.75 s for Work-
station 1 and 37.45 s for Workstation 2.
In Stage IV, the differences between the mean
values of the sample and the MTM values were
evaluated. Table 2 shows the deviation between
MTM and the values in each scenario.
Regarding the accuracy of the MTM, Table
2 shows that all mean values of the samples for
the Trained scenario, considered similar to a
usual industrial environment, are higher than
the values determined via the MTM. In addi-
tion to the accuracy of the MTM, the errors
originating from the identification of move-
ments, cases, and distances that the researchers
performed can also be considered a source of
differences between the observed values and
the MTM.
Figure 4. Sample statistics.

TABLE 2: Difference Between the Human Data and MTM
              Workstation 1                              Workstation 2
Scenario      Mean (s)   MTM Value (s)   Deviation (s)   Mean (s)   MTM Value (s)   Deviation (s)
Learning      84.35      73.75           −10.60          49.35      37.45           −11.90
Trained       77.32      73.75           −3.57           38.77      37.45           −1.35

Table 3 shows the percentage of difference
among the minimum, mean, and maximum values
of the sample for both workstations in the trained
scenario compared with the value determined via
the MTM for each workstation. In addition, the
mean difference of each of these values relative
to the MTM is presented at the end of the table.
The determination of the parameters of the
curves in relation to the MTM was defined by
means of the results presented in Table 3. All
values were defined using the percentage of dif-
ference between MTM and the sample statistics.
The minimum value was defined based on the
mean of the minimum values (90.32%), using
90% and 95% of the MTM as the minimum val-
ues for the proposed distributions. The maximum
values presented differences of 18.58% and
49.56% relative to the MTM. Workstation 2
presented the larger difference, resulting in an
elevated mean value; we verified that more
attempts were sometimes required to assemble a
specific part at this workstation. Most of the
actual data collected from the participants were
represented by the lognormal distribution, whose
right tail carries a low probability (i.e., its higher
values have a low probability of occurrence).
This shape is not reproduced by the uniform
curve, in which the probability is the same
throughout the whole interval, and even with the
triangular curve the probability decreases more
sharply toward the right of the graph than with
the lognormal. Therefore, the maximum values
defined were lower than those observed (i.e.,
between 120% and 125% of the MTM for the
triangular distributions and between 115% and
120% for the uniform distributions).
The central value for one of the triangular dis-
tributions was defined as a value higher than the
MTM, because the mean values observed in the
process were higher than the standard time that
the MTM defined. The mean value was estab-
lished based on the value of 104.18%. Because
the difference between the sample and the MTM
was less than 5%, the values used were estab-
lished within 100% and 105% of the MTM.
Considering these values, two configurations
of uniform and triangular curves were tested,
and their configuration parameters are shown in
Table 4.
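Expressed as samplers, the Table 4 parameterizations scale directly with the MTM standard time; the sketch below uses the 73.75 s Workstation 1 time from Stage III as an example input.

```python
# The four proposed parameterizations from Table 4, written as samplers scaled
# by an MTM standard time (random.triangular takes low, high, mode).
import random

def uniform_1(mtm):    return random.uniform(0.90 * mtm, 1.15 * mtm)
def uniform_2(mtm):    return random.uniform(0.95 * mtm, 1.20 * mtm)
def triangular_1(mtm): return random.triangular(0.90 * mtm, 1.20 * mtm, 1.00 * mtm)
def triangular_2(mtm): return random.triangular(0.95 * mtm, 1.25 * mtm, 1.05 * mtm)

mtm_ws1 = 73.75  # MTM standard time for Workstation 1 (s)
for sampler in (uniform_1, uniform_2, triangular_1, triangular_2):
    print(sampler.__name__, round(sampler(mtm_ws1), 2))
```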
A comparison between the proposed uniform
and triangular curves and the two operators’ log-
normal curves, as an example of the adherence
of the proposed curves, is shown in Figure 5.
Graphically, differences between operators can
be observed, showing that the proposed curves
can be more adequate for a set of data than other
types of curves. The continuous curve represents
the proposed distributions.
TABLE 3: Percentage Difference Between Sample Statistics for the Trained Scenario and MTM
Work Sequence    Minimum Value (s)   % of MTM   Mean (s)   % of MTM   Maximum Value (s)   % of MTM
Workstation 1    68.74               93.21%     77.32      104.84%    87.45               118.58%
Workstation 2    32.74               87.42%     38.77      103.52%    56.01               149.56%
Mean                                 90.32%                104.18%                        134.07%

TABLE 4: Values According to the MTM for the Proposed Distributions
Proposed Distribution   Minimum      Mode         Maximum
Uniform 1               90% * MTM                 115% * MTM
Uniform 2               95% * MTM                 120% * MTM
Triangular 1            90% * MTM    100% * MTM   120% * MTM
Triangular 2            95% * MTM    105% * MTM   125% * MTM

In relation to the probability distributions
proposed for the MTM, it can be seen through
the evaluation of Figures 5 and 6 that the
graphical arrangement of the triangular curve
is more similar to the lognormal curve than the
uniform curve. However, the uniform curve
covers a larger area in cases where the devia-
tion of the process is wider. Because the graph-
ical analysis was performed for only two opera-
tors, conclusions about general adherence cannot
be drawn, although the similarity with the log-
normal curve holds for most of the cases in
which that distribution was defined for the rep-
resentation of the human data.
In Stage V, the conceptual model considered
few simplifications as represented in Figure 7.
Inventories were not included, in order to increase
the dependency between the workstations and,
therefore, provide a better visualization of the
impact of using different data in the simulations.
Production takes place at Workstations 1 and 2,
which only begin the production of a new unit if
Workstation 3 is free to process the items
released by the previous workstations.
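For illustration only, the sketch below reproduces the structure of this conceptual model as a small discrete event simulation in SimPy rather than in the Plant Simulation environment used in the study; the Workstation 3 processing time is an assumed placeholder, since the article reports MTM times only for Workstations 1 and 2, and the capacity-one hand-off stores only approximate the no-inventory coupling between the workstations.

```python
# Illustrative SimPy sketch of the conceptual model: Workstations 1 and 2
# assemble in parallel and hand their items to Workstation 3 with no
# intermediate inventories (approximated by capacity-1 hand-off stores).
# This does not reproduce the authors' Plant Simulation model; WS3_TIME is
# an assumed placeholder value.
import random
import simpy

MTM_WS1, MTM_WS2 = 73.75, 37.45   # MTM standard times from Stage III (s)
WS3_TIME = 20.0                   # assumed Workstation 3 time (not reported)

def triangular_2(mtm):
    # "Triangular 2" from Table 4: minimum 95%, mode 105%, maximum 125% of MTM.
    return random.triangular(0.95 * mtm, 1.25 * mtm, 1.05 * mtm)

def feeder(env, mtm, handoff):
    while True:
        yield env.timeout(triangular_2(mtm))   # assemble one unit
        yield handoff.put(object())            # blocks while the hand-off is full

def assembler(env, handoff_1, handoff_2, counter):
    while True:
        yield handoff_1.get()                  # one item from Workstation 1
        yield handoff_2.get()                  # one item from Workstation 2
        yield env.timeout(WS3_TIME)
        counter["pieces"] += 1

env = simpy.Environment()
h1, h2 = simpy.Store(env, capacity=1), simpy.Store(env, capacity=1)
counter = {"pieces": 0}
env.process(feeder(env, MTM_WS1, h1))
env.process(feeder(env, MTM_WS2, h2))
env.process(assembler(env, h1, h2, counter))
env.run(until=5 * 8 * 3600)                    # one 40-hour week, in seconds
print(counter["pieces"], "pieces produced")
```

Replacing triangular_2 with any of the other Table 4 samplers, or with the deterministic MTM values, mirrors the comparison carried out with the authors' model.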
The data of each operator were used in differ-
ent simulations, and the mean result among the
37 operators was determined. In this way, it
would be possible to evaluate which proposed
distribution (Uniform 1, Uniform 2, Triangular
1, or Triangular 2; presented in Table 5) is closer
to the real process, which was represented as the
mean among operators. In addition, a simulation
using the MTM deterministic values was elabo-
rated. Table 5 presents the results obtained for a
week of operation.

TABLE 5: Simulation Results With Different Probability Distributions
Parameter                       Mean of Operators   MTM      Uniform 1   Uniform 2   Triangular 1   Triangular 2
Working time, Workstation 1     93.28%              90.01%   91.17%      88.41%      92.03%         94.73%
Working time, Workstation 2     51.59%              45.77%   41.11%      41.22%      46.18%         49.82%
Production (pieces)             1919                1930     1925        1792        1929           1919
Mean time, Workstation 1 (s)    76.13               73.64    75.41       78.54       74.62          77.16
Mean time, Workstation 2 (s)    41.85               37.45    38.55       36.73       37.21          39.53
According to the results presented in Table 5,
the simulations using the uniform curves pre-
sented smaller working time values than the
simulation using the deterministic MTM values
when compared with the average values of the
operators. It can also be noticed that the values
produced by the triangular curves are closer to
the average of the operators, with the results of
Triangular 2 being closest to the desired values.
This curve had its central parameter set at a
value above that determined with the MTM
(105% of the MTM).
In analyzing the quantity of pieces, the values
presented a small variation between the simula-
tions, ranging from 1,792 to 1,930 pieces.

Figure 5. Comparison between uniform and lognormal distributions.

This
analysis could invalidate the affirmation that
Banks and Chwif (2011) made, that the defini-
tion of the probability curve for the simulation
data representation may not present significant
differences in the model results.
To deepen the analysis of this result, the vari-
ability of the model regarding the quantity of
pieces produced was verified through multiple
simulation rounds. Table 6 shows the mean and
standard deviation of the number of pieces pro-
duced.

TABLE 6: Confidence Interval of Simulations
Simulation          Mean      Deviation   Variation Regarding the Mean of Operators
Mean of operators   1919.13   0.82
MTM                 1930                  0.57%
Uniform 1           1925.22   0.42        0.32%
Uniform 2           1792.43   12.95       −6.60%
Triangular 1        1929.98   0.63        0.57%
Triangular 2        1919.25   0.76        0.01%
The results showed that the simulation with
Triangular 2 resulted in a closer variability com-
pared with the average between operators.
Uniform 2 presented the farthest results com-
pared with the mean values of the operators.

Figure 6. Comparison between triangular and lognormal distributions.

Figure 7. Conceptual model.
Table 6 also shows the variations in percent-
age in the average results in relation to the aver-
age of operators. It can be verified that in the
Uniform 2 case, the variation difference is higher
than 6%, showing that its use in the simulation is
not feasible for representing the processes under
analysis. The Triangular 2 curve presented a dif-
ference percentage close to zero, showing its
validity in representing the actual data of the
process with a low error rate. These differences
highlight the impact of the choice of the distri-
bution curve and its values on the simulation
reliability.
Thus, through the evidence presented, it can be
stated that the triangular curve with a central
value greater than that established via the MTM is
considered more adequate for the representation
of the MTM values in DES considering the case
under analysis. In practical industry cases, the
percentage placed above the values established
via the MTM should be measured to adapt the
method to the workplace reality.
CONCLUSION
The objective of this article was to analyze
the influence of the variability of standard times
in the simulation of the assembly operations
of manufacturing systems. By means of the
research results, it can be verified that the aver-
age of the working times observed in the analysis
presented higher values than those determined
via the MTM tables. Thus, the need to adapt
the method to workplace reality to approximate
the simulation analysis can be emphasized. The
variation observed between the human data and
MTM may originate from fatigue, recovery, or
tolerance values; nevertheless, the study scope
does not involve the origin of the variation,
focusing instead on the scale of it.
It was also verified that the inadequate choice
of the distribution curve of the values used in the
simulation may incur errors in the data, with a
difference of up to 6% being observed in the
study between the curves and values analyzed.
In addition, the application of a triangular
curve—with a central value greater than the
deterministic value of the MTM—presented
results closer to the real data observed than the
simulation using the deterministic values of the
MTM. Therefore, it is considered an adequate
variability curve for the DES analysis and brings
greater reliability to the simulated models.
In this way, it is recommended to use triangu-
lar curves with central values greater than those
established via the MTM for the representation
of standard times during the scenario analysis or
in the planning stages of assembly processes
using DES. In addition to its application in
assembly processes, the proposed approach can
be extended to maintenance operations that,
according to Desai and Mital (2010b), mostly
include disassembly and assembly activities,
which can be represented by the MTM.
KEY POINTS
• This approach to using standard times in simula-
tions can provide more reliability in the design of
new processes or scenarios.
• The use of the MTM as a standard time resource
is analyzed, and the accuracy of the method is
presented.
• The impact of different data and probabilistic dis-
tributions on simulation results is shown, with
errors above 6% observed in the simulation.
REFERENCES
Alrabghi, A., & Tiwari, A. (2016). A novel approach for modelling
complex maintenance systems using discrete event simulation.
Reliability Engineering & System Safety, 154, 160–170.
Argoneto, P., & Renna, P. (2013). Capacity sharing in a network of
enterprises using the Gale–Shapley model. The International
Journal of Advanced Manufacturing Technology, 69(5–8),
1907–1916.
Atieh, A. M., Kaylani, H., Almuhtady, A., & Al-Tamimi, O. (2016).
A value stream mapping and simulation hybrid approach:
Application to glass industry. The International Journal of
Advanced Manufacturing Technology, 84(5–8), 1573–1586.
Banks, J., & Chwif, L. (2011). Warnings about simulation. Journal
of Simulation, 5(4), 279–291.
Baraldi, E. C., & Kaminski, P. C. (2011). Ergonomic planned supply
in an automotive assembly line. Human Factors and Ergonom-
ics in Manufacturing & Service Industries, 21(1), 104–119.
Barton, H., & Delbridge, R. (2001). Development in the learning
factory: Training human capital. Journal of European Indus-
trial Training, 25(9), 465–472.
Bedny, G. Z., & Harris, S. R. (2013). Safety and reliability analysis
methods based on systemic-structural activity theory. Proceed-
ings of the Institution of Mechanical Engineers, Part O. Jour-
nal of Risk and Reliability, 227(5), 549–556.
Bedny, G. Z., Karwowski, W., & Voskoboynikov, F. (2015). Appli-
cation of standardized motions in temporal analysis of work
activity. Human Factors and Ergonomics in Manufacturing &
Service Industries, 25(4), 469–483.
Budgaga, W., Malensek, M., Pallickara, S., Harvey, N., Breidt,
F. J., & Pallickara, S. (2016). Predictive analytics using sta-
tistical, learning, and ensemble methods to support real-time
exploration of discrete event simulations. Future Generation
Computer Systems, 56, 360–374.
Cakmakci, M., & Karasu, M. K. (2007). Set-up time reduction pro-
cess and integrated predetermined time system MTM-UAS:
A study of application in a large size company of automobile
industry. The International Journal of Advanced Manufactur-
ing Technology, 33(3–4), 334–344.
Desai, A. A., & Mital, A. (2010a). Facilitating design for assembly
through the adoption of a proactive design methodology. Inter-
national Journal of Industrial Engineering: Theory, Applica-
tions and Practice, 17(2).
Desai, A. A., & Mital, A. (2010b). Improving maintainability of
products through the adoption of a comprehensive DfX Meth-
odology. International Journal of Industrial Engineering:
Theory, Applications and Practice, 17(2).
Desai, A., & Mital, A. (2011). Simplifying the product mainte-
nance process by building ease of maintenance into the design.
International Journal of Industrial and Systems Engineering,
9(4), 434–454.
Di Gironimo, G., Di Martino, C., Lanzotti, A., Marzano, A., &
Russo, G. (2012). Improving MTM-UAS to predetermine
automotive maintenance times. International Journal on Inter-
active Design and Manufacturing, 6(4), 265–273.
Dinkelmann, M., Siegert, J., & Bauernhansl, T. (2014). Change
management through learning factories. In M. Zaeh (Ed.),
Enabling Manufacturing Competitiveness and Economic Sus-
tainability (pp. 395–399). Berlin: Springer International Pub-
lishing.
Dode, P., Greig, M., Zolfaghari, S., & Neumann, W. P. (2016).
Integrating human factors into discrete event simulation: A
proactive approach to simultaneously design for system per-
formance and employees’ well being. International Journal of
Production Research, 54(10), 3105–3117.
Dudley, N. A. (1963). Work-time distributions. The International
Journal of Production Research, 2(2), 137–144.
Fowler, J. W., & Rose, O. (2004). Grand challenges in modeling
and simulation of complex manufacturing systems. Simulation,
80(9), 469–476.
Helleno, A. L., Simon, A. T., Papa, M. C. O., Ceglio, W. E.,
Rossa Neto, A. S., & Mourad, R. B. A. (2013). Integration
university-industry: Laboratory model for learning lean
manufacturing concepts in the academic and industrial envi-
ronments. International Journal of Engineering Education,
29(6), 1387–1399.
Iannone, R., Miranda, S., Prisco, L., Riemma, S., & Sarno, D.
(2016). Proposal for a flexible discrete event simulation model
for assessing the daily operation decisions in a Ro–Ro termi-
nal. Simulation Modelling Practice and Theory, 61, 28–46.
Kern, C., & Refflinghaus, R. (2013). Cross-disciplinary method for
predicting and reducing human error probabilities in manual
assembly operations. Total Quality Management & Business
Excellence, 24(7–8), 847–858.
Kernbaum, S., Franke, C., & Seliger, G. (2009). Flat screen moni-
tor disassembly and testing for remanufacturing. International
Journal of Sustainable Manufacturing, 1(3), 347–360.
Knott, K., & Sury, R. J. (1986). An investigation into the minimum
cycle time restrictions of MTM-2 and MTM-3. IIE Transac-
tions, 18(4), 380–391.
Kothiyal, K. P., & Kayis, B. (1995). Workplace design for manual
assembly tasks: Effect of spatial arrangement on work-cycle
time. International Journal of Occupational Safety and Ergo-
nomics, 1(2), 136–143.
Kuhlang, P., Edtmayr, T., & Sihn, W. (2011). Methodical approach
to increase productivity and reduce lead time in assembly and
production-logistic processes. CIRP Journal of Manufacturing
Science and Technology, 4(1), 24–32.
Kuo, C. F., & Wang, M. J. (2009). Motion generation from MTM
semantics. Computers in Industry, 60(5), 339–348.
Kuo, C. F., & Wang, M. J. J. (2012). Motion generation and virtual
simulation in a digital environment. International Journal of
Production Research, 50(22), 6519–6529.
Law, A. M. (2008). How to build valid and credible simulation
models. In Proceedings of the 40th Conference on Winter Sim-
ulation, 39–47. doi:10.1109/WSC.2001.977242
Lehto, M. R., & Buck, J. R. (2007). Introduction to human factors
and ergonomics for engineers, Hillsdale, NJ: Erlbaum.
Lu, R. F., Petersen, T. D., & Storch, R. L. (2007). Modeling cus-
tomized product configuration in large assembly manufactur-
ing with supply-chain considerations. International Journal of
Flexible Manufacturing Systems, 19(4), 685–712.
Małachowski, B., & Korytkowski, P. (2016). Competence-based
performance model of multi-skilled workers. Computers &
Industrial Engineering, 91, 165–177.
Murrell, K. F. H. (1961). Operator variability and its industrial con-
sequences. The International Journal of Production Research,
1(3), 39–55.
Nakayama, S. I., Nakayama, K. I., & Nakayama, H. (2002). A
study on setting standard time using work achievement quo-
tient. International Journal of Production Research, 40(15),
3945–3953.
Negahban, A., & Smith, J. S. (2014). Simulation for manufacturing
system design and operation: Literature review and analysis.
Journal of Manufacturing Systems, 33(2), 241–261.
Oliveira, J. B., Lima, R. S., & Montevechi, J. A. B. (2016). Per-
spectives and relationships in supply chain simulation: A sys-
tematic literature review. Simulation Modelling Practice and
Theory, 62, 166–191.
Owida, A., Byrne, P. J., Heavey, C., Blake, P., & El-Kilany, K. S.
(2016). A simulation based continuous improvement approach
for manufacturing based field repair service contracting. Inter-
national Journal of Production Research, 54(21), 6458–6477.
Prajapat, N., & Tiwari, A. (2017). A review of assembly optimisa-
tion applications using discrete event simulation. International
Journal of Computer Integrated Manufacturing, 30(2–3),
215–228.
Robinson, S. (2004). Simulation: The practice of model develop-
ment and use. Hoboken, NJ: Wiley.
Robinson, S. (2015, December). A tutorial on conceptual modeling
for simulation. In Proceedings of the 2015 Winter Simulation
Conference (pp. 1820–1834). doi:10.1109/WSC.2015.7408298
Sargent, R. G. (2013). Verification and validation of simulation
models. Journal of Simulation, 7(1), 12–24.
Steiner, L., Burgess-Limerick, R., & Porter, W. (2014). Directional
control-response compatibility relationships assessed by physi-
cal simulation of an underground bolting machine. Human
Factors, 56(2), 384–391.
Tako, A. A., & Robinson, S. (2012). The application of discrete
event simulation and system dynamics in the logistics and sup-
ply chain context. Decision Support Systems, 52(4), 802–815.
Tisch, M., Hertle, C., Abele, E., Metternich, J., & Tenberg, R.
(2015). Learning factory design: A competency-oriented
approach integrating three design levels. International Journal
of Computer Integrated Manufacturing, 29(12), 1355–1375.
doi:10.1080/0951192X.2015.1033017
Tseng, H. E., & Tang, C. E. (2006). A sequential consideration
for assembly sequence planning and assembly line balancing
using the connector concept. International Journal of Produc-
tion Research, 44(1), 97–116.
Wickens, C. D., Sebok, A., Li, H., Sarter, N., & Gacy, A. M.
(2015). Using modeling and simulation to predict operator per-
formance and automation-induced complacency with robotic
automation: A case study and empirical validation. Human
Factors, 57(6), 959–975.
Yang, H., Bukkapatnam, S. T., & Barajas, L. G. (2013). Continuous
flow modelling of multistage assembly line system dynamics.
International Journal of Computer Integrated Manufacturing,
26(5), 401–411.
Zeng, X., Wong, W. K., & Leung, S. Y. S. (2012). An operator allo-
cation optimization model for balancing control of the hybrid
assembly lines using Pareto utility discrete differential evolution
algorithm. Computers & Operations Research, 39(5), 1145–1159.
Rafaela Heloisa Carvalho Machado holds a Master’s
degree (2017) in production engineering from the
Universidade Metodista de Piracicaba - UNIMEP.
She has experience in the areas of simulation, pro-
duction management and human factors.
André Luís Helleno holds a Master’s degree (2004)
and a PhD (2008) in production engineering from
the Universidade Metodista de Piracicaba - UNIMEP.
He completed a doctoral internship (2005) and postdoc-
toral degree (2011) at Technische Universität Berlin,
Germany.
Maria Celia de Oliveira holds a Bachelor’s Degree
in Mathematics from the Universidade Metodista de
Piracicaba - UNIMEP (2004). She holds a PhD in
production engineering from the Universidade
Metodista de Piracicaba - UNIMEP (2012) and com-
pleted an internship at the Technische Universität
Berlin, Germany (2010).
Mário Sérgio Corrêa dos Santos holds a Master’s
Degree (2004) in mechanical engineering from the
Fundação Educacional Inaciana Padre Sabóia de
Medeiros - FEI (2014) and is a PhD student at the
Universidade Metodista de Piracicaba - UNIMEP.
Renan Meireles da Costa Dias is a graduate student
at Universidade Metodista de Piracicaba - UNIMEP.
He has experience in the areas of simulation and
methods-time measurement and is part of the pro-
cess engineering research team.
Date received: October 10, 2017
Date accepted: January 13, 2019
... Machado et al. [17] and Harari et al. [18] provide the standardization of the work method as a solution to their problems. On the other hand, Desai [19] proposes a design methodology in his solution, with which he manages to reduce the assembly time from 102 seconds to 11.9 seconds. ...
... Widely used in industry [4,5,6], however, there are already studies with its application in service providers, allowing an effective solution for product/service costs, with a focus on reducing time and eliminating waste [7]. ...
Article
Full-text available
The Methods-Time Measurement (MTM) tool has been gaining ground for offering cost-effective solutions for products/services, focusing on time reduction and wastage elimination. Aiming at the optimization of operations, this research aimed to evaluate the operational results from the MTM tool in a medium-sized beauty salon in the Tianguá-CE city. To carry out the study, the services provided by the salon were evaluated to verify the one that was most relevant, highlighting the services of manicure and pedicure. In these services, a diagnosis of the current scenario was carried out, and MTM were applied to identify opportunities for their optimization. After obtaining the results and analyzing them, it was possible to reduce unnecessary activities by 54.61% in the pedicure service and 62.57% in the manicure service, resulting in more agile, efficient, and low-cost services. In addition to reducing unnecessary activities, this improvement was possible through the standardization of the other activities that make up the service and the reorganization of the layout. Therefore, it is concluded that the application of the MTM provided gains in productivity and in the quality of the service provided, as there was a reduction in waste and greater agility in the service.
... Simulation in the context of manufacturing systems refers to the application of software to develop computerized models of manufacturing systems, in order to analyze these systems and obtain important information such as impact of a local or specific action on the entire system (Machado et al., 2019). Computerized simulation is considered the second most popular field in management decision sciences among industrial engineers and manufacturing managers (Polenghi et al., 2018). ...
Chapter
Artificial Intelligence (AI) is indeed a technique that is increasingly evolving throughout the worldwide and banking industry has become one of its earliest users. From manufacturing to service industries, AI have been a part of the company, in this modern era, where everything is handled by means of computers or human computer interface (HCI’s). AI is not a new innovation, but it has grown exponentially in recent years, catering a lot for sustainable growth. US and China are important countries that contribute to different applications using AI. As per Forrester report the customised customer platform offers quantitative benefits in the form of reduced costs, highly efficient human resources and enhanced customer engagement outcomes. AI is growing, however there are barriers to support and maintain since it can deal with possible biases or accountability of senior executives and government legislations. Big data is a channel to AI’s service and Virtual reality cannot operate without data. In the Present research the development and success of AI have been analysed focussing on the banking industry. The study also finds different ways to minimise costs and provide reliable data based on Site intelligence. AI’s a machine blessing, but also a threat. This report discusses the vulnerabilities and diverse prospects for growth in this particular service sector.
... Simulation in the context of manufacturing systems refers to the application of software to develop computerized models of manufacturing systems, in order to analyze these systems and obtain important information such as impact of a local or specific action on the entire system (Machado et al., 2019). Computerized simulation is considered the second most popular field in management decision sciences among industrial engineers and manufacturing managers (Polenghi et al., 2018). ...
Chapter
The fourth industrial revolution is at the beginning, and it alters the fundamental way we work and live. The technologies like Artificial Intelligence, Internet of Things, Robotics, Nanotechnology, 3D printing, Data Science are strengthening one another. The fourth industrial revolution has made an impact in social and economic domains in the form of loss of many current jobs or shifting of nature of work, and in the delivery of public and private services. This paper explores the transformation of the labor market which demands innovative professional skills due to industrial revolutions 4.0. The topics like the origin of IR 4.0, technical revolution, the impact of I.R 4.0 technology, employment crisis due to IR 4.0, the transformation of the job market, and the necessity of emerging skills in the industry were discussed in this paper. An analysis of the impact of digitization in the labor market is done here.
Book
The book explains strategic issues, trends, challenges, and future scenarios of the global economy in the light of the Fourth Industrial Revolution. It consists of insightful scientific essays authored by scholars and practitioners from the business, technology, and economics areas. The book contributes to business education by means of research and critical and theoretical reviews of issues related to the Fourth Industrial Revolution.
Chapter
Full-text available
Industry 4.0 (I4.0), fueled by technological advancements in the cyber-physical space, has brought about phenomenal changes in the way goods and services can be manufactured. However, despite the widespread use of the term in the popular vernacular, little is known about what exactly I4.0 is, the potential contribution it is expected to make, and its possible fallouts for society. The advent of the I4.0 age has brought not only the promise of an era of immense productivity but also many challenges that lie in the path of adopting such an advanced manufacturing ecosystem. This study presents I4.0 in the context of manufacturing by focusing on three areas: first, the nine core foundational technologies driving I4.0 in the manufacturing environment; second, the challenges in the adoption of I4.0; and finally, the role of industry-academia partnerships in paving the path for I4.0 adoption, presented as a potential focus for future research.
Article
The COVID-19 pandemic has driven a sharp increase in global demand for electronic devices and parts. Electronic component manufacturers, especially the transformer manufacturing industry (transformers being devices that supply power to many electronic devices), face problems producing at a rate that keeps up with this quickly increasing demand. This research aims to increase the productivity of small transformers through a lean approach. The paper describes processes relevant to improving production, reducing waste, and identifying unnecessary process steps. The method begins with two actions: first, studying the current situation of transformer manufacturing at a case-study company; second, studying the customer order-to-delivery process using Value Stream Mapping (VSM) and analyzing the entire transformer manufacturing process to identify the standard time of each unit of work. The main measurement technique is the timing of work motions with MTM-2, the second-level Methods-Time Measurement system. A cause-and-effect diagram was used to derive improvement guidelines on two fronts: first, lean manufacturing concepts played the principal role; second, the ECRS technique (Eliminate, Combine, Rearrange, and Simplify) was applied to reduce waste and to streamline the transformer manufacturing process. The results increased output from 45 to 75 pieces per hour (an increase of roughly 67%). In addition, productivity rose from 3.46 to 6.82 pieces per worker-hour (an increase of 97.11%), and production time was reduced from 1,109 seconds to 229 seconds (a reduction of roughly 79%).
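As a quick arithmetic check of the before/after figures quoted above, the short Python snippet below recomputes the relative changes; it is a generic worked example, not code from the cited study.

    # Recompute relative changes from the before/after figures quoted in
    # the abstract (generic arithmetic check, not the authors' code).
    def pct_change(before: float, after: float) -> float:
        """Relative change from 'before' to 'after', in percent."""
        return (after - before) / before * 100.0

    print(f"Output: 45 -> 75 pcs/h            = {pct_change(45, 75):+.1f}%")      # about +66.7%
    print(f"Productivity: 3.46 -> 6.82 pcs/wh = {pct_change(3.46, 6.82):+.1f}%")   # about +97.1%
    print(f"Cycle time: 1109 -> 229 s         = {pct_change(1109, 229):+.1f}%")    # about -79.4%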
Book
Emphasizing customer-oriented design and operation, Introduction to Human Factors and Ergonomics for Engineers explores the behavioral, physical, and mathematical foundations of the discipline and how to apply them to improve the human, societal, and economic well-being of systems and organizations. The book discusses the design of products, such as tools, machines, or systems, as well as the tasks or jobs people perform and the environments in which people live. The authors explore methods of achieving these objectives, approaching the topic from an engineering perspective as well as a psychological standpoint. The 22 chapters of the book, coupled with its extensive appendices, provide valuable tools for students and practicing engineers in the human-centered design and operation of equipment, workplaces, and organizations in order to optimize performance, satisfaction, and effectiveness. Covering physical and cognitive ergonomics, the book is an excellent source of information on the safe, effective, enjoyable, and productive design of products and services that require interaction between humans and the environment.
Book
https://www.macmillanihe.com/page/detail/?SF1=barcode&ST1=9781137328021
Article
This paper develops and tests a novel extension to traditional supplier selection practice, with a particular focus on the concluding stages of a manufacturing-based field service. Action-based research was used to design and develop a discrete event simulation decision-support framework for a large multinational manufacturing organisation with a significant after-sales service supply chain. The framework is designed to identify and validate the value attributable to collaborative supplier contracting with built-in costed performance-improvement targets. Use of the framework in the case organisation was found to produce greater cost savings than traditional practice, facilitating extended supply chain contracts. The results provide evidence of the high level of savings achievable while also improving customer delivery through targeted service improvements over the contract life cycle. This framework advances beyond the prevalent practice of cost-focused, short-term, adversarial supply contracting and is innovative in terms of its continuous-improvement, simulation-based design.
Article
Existing approaches for modelling maintenance rely on oversimplified assumptions which prevent them from reflecting the complexity found in industrial systems. In this paper, we propose a novel approach that enables the modelling of non-identical multi-unit systems without restrictive assumptions on the number of units or their maintenance characteristics. Modelling complex interactions between maintenance strategies and their effects on assets in the system is achieved by accessing event queues in Discrete Event Simulation (DES). The approach utilises the wide success DES has achieved in manufacturing by allowing integration with models that are closely related to maintenance such as production and spare parts systems. Additional advantages of using DES include rapid modelling and visual interactive simulation. The proposed approach is demonstrated in a simulation based optimisation study of a published case. The current research is one of the first to optimise maintenance strategies simultaneously with their parameters while considering production dynamics and spare parts management. The findings of this research provide insights for non-conflicting objectives in maintenance systems. In addition, the proposed approach can be used to facilitate the simulation and optimisation of industrial maintenance systems.
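The event-queue access mentioned above is the core mechanism of any DES engine. The following minimal Python sketch shows a generic priority-queue event loop interleaving failure and repair-completion events for a few non-identical units; it is a simplified illustration of the general technique, not the authors' model, and all rates and durations are assumed values.

    import heapq
    import random

    # Generic DES event loop: a time-ordered priority queue of
    # (time, event_type, unit_id) tuples. Illustrative only.
    random.seed(1)

    def schedule(queue, time, event, unit):
        heapq.heappush(queue, (time, event, unit))

    events = []
    for unit in range(3):                      # three non-identical units
        schedule(events, random.expovariate(1 / (100 + 50 * unit)), "failure", unit)

    clock, horizon, downtime = 0.0, 1000.0, 0.0
    while events and clock < horizon:
        clock, event, unit = heapq.heappop(events)
        if event == "failure":
            repair = random.uniform(5, 15)     # assumed repair duration
            downtime += repair
            schedule(events, clock + repair, "repaired", unit)
        elif event == "repaired":
            # next failure after an exponentially distributed up-time
            schedule(events, clock + random.expovariate(1 / (100 + 50 * unit)), "failure", unit)

    print(f"Simulated {clock:.0f} time units, total downtime ~ {downtime:.0f}")

Exposing the queue in this way is what allows maintenance strategies, production logic, and spare-parts events to be interleaved in a single model, as the abstract above describes.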
Chapter
In today’s ever faster changing economy, it is crucial to adapt a company’s production at shorter intervals. Many change processes fail, often due to resistance from the workforce. Approaches in the field of change management address this issue but often do not consider the special circumstances of production. Learning factories are designed to meet these requirements but usually focus on the aspect of qualification. This paper describes how change processes in production can be supported by learning factories by offering a test bed for new ideas, qualification, and communication through participation.
Article
The aim of this research is to: (1) Develop an approach to integrating both human fatigue-recovery patterns and human learning into Discrete Event Simulation models of a production system to predict productivity and quality; (2) Validate the predicted fatigue against operators’ perceived fatigue; and (3) Demonstrate how this Human Factors-enabled simulation approach can be applied in a case study comparing two manufacturing line designs in the context of electronics assembly. The new approach can predict the accumulation of operator fatigue, fatigue-related quality effects and productivity changes based on system design configurations. In the demonstration comparison, fatigue dosage was 7–33% lower in the proposed system where HF was taken into consideration at the engineering design (ED) stage. In the existing system, the fatigue dose measure correlated with quality deficits with 26% of the variance accounted for – a large portion given the multi-causal nature of production deficits. ED models that do not include human aspects may provide unreliable results in terms of productivity and quality estimates. This research shows that it is possible to design production systems that are more productive while being less hazardous for the system operator.
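One common way to represent the fatigue-recovery patterns referred to above is an exponential accumulation/recovery form. The sketch below, with assumed rate parameters (not those of the cited study), shows how such a model could be stepped through a shift before being attached to a DES model of the line.

    import math

    # Exponential fatigue/recovery form (illustrative parameters only):
    # fatigue builds toward 1.0 while working and decays toward 0.0 while resting.
    LAMBDA = 0.01   # assumed fatigue accumulation rate per minute of work
    MU = 0.04       # assumed recovery rate per minute of rest

    def work(f, minutes):
        """Fatigue level after 'minutes' of work, starting from level f (0..1)."""
        return 1.0 - (1.0 - f) * math.exp(-LAMBDA * minutes)

    def rest(f, minutes):
        """Fatigue level after 'minutes' of rest, starting from level f (0..1)."""
        return f * math.exp(-MU * minutes)

    # Example shift: three 110-minute work blocks separated by 10-minute breaks.
    f = 0.0
    for _ in range(3):
        f = work(f, 110)
        f = rest(f, 10)
    print(f"End-of-shift fatigue level ~ {f:.2f}")

In a fatigue-enabled DES model, a level like this would typically be updated at each processing event and used to inflate task times or defect probabilities for the affected operator.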
Article
The main purpose of this article is to develop a meta-analysis about the relationships and potential perspectives of modeling and simulation in supply chains. The research methodology used in this paper was a systematic literature review, exploring the state of the art in Supply Chain Simulation. The methodological procedures were based on a systematic literature review and statistical analysis of a sample of papers. The results indicated that modeling and simulation in supply chains can be better integrated. The models could be more sophisticated to capture the dynamics and behavior of these networks. The combination of optimization methods with agent-based simulation is an observed trend. Hybrid simulations involving normative models and empirical applications can be useful to represent the reality of supply chains, generating alternative solutions that improve supply chain performance. The relevance of this article is to analyze the interfaces related to this field of research, in order to establish a theoretical framework that improves the process of modeling, simulation and decision-making in supply chains.
Article
This article reviews literature on the application of Discrete Event Simulation (DES) and optimisation methods for assembly systems. Data from papers is collated and classified based on application domain, optimisation objective functions, model formulations and optimisation methods. This classification enables the identification of key trends within the research. The most common objective functions applied within studies are time and throughput. In addition, what-if scenario analysis is identified as the most common optimisation method. An increase in the use of hybrid methods for simulation modelling and growing application of Artificial Intelligence methods for multi-objective optimisation of DES models has been noted. These growing trends provide a variety of interesting areas for progress in future research.
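To illustrate the what-if style of analysis identified above as the most common optimisation method, the deliberately simplified Python sketch below compares the throughput of a two-station serial assembly line under alternative cycle-time scenarios. The scenario names and times are invented for illustration, and a full DES model would replace the bottleneck formula used here.

    # What-if comparison of a two-station serial line: throughput is governed
    # by the slowest (bottleneck) station. Cycle times are illustrative only.
    scenarios = {
        "baseline":        {"station_1": 42.0, "station_2": 55.0},  # seconds/part
        "extra_fixture":   {"station_1": 42.0, "station_2": 46.0},
        "rebalanced_line": {"station_1": 48.0, "station_2": 48.0},
    }

    for name, cycle_times in scenarios.items():
        bottleneck = max(cycle_times.values())      # slowest station
        throughput = 3600.0 / bottleneck            # parts per hour
        print(f"{name:15s} bottleneck {bottleneck:4.0f} s -> {throughput:5.1f} parts/h")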