Modeling and Simulation in Engineering
Education: A Learning Progression
Alejandra J. Magana1
Abstract: This study used a Delphi technique to identify relevant modeling and simulation practices required in present-day workplace
engineering. Participants consisted of 37 experts divided into two panels: 18 experts in academia on one panel, and 19 experts from industry
on another panel. Panel members participated in three rounds of data collection in which they offered their opinions about the relevance of
specific modeling and simulation skills, and opinions about when these practices should be introduced into the undergraduate and graduate
engineering curricula. The guiding research question was: What are the required modeling and simulation practices to be integrated as part of
the engineering curricula at the undergraduate and graduate levels? Findings from this study were used to inform a preliminary learning
progression for modeling and simulation in undergraduate and graduate engineering education. Outcomes from this study represent a linchpin
for future research about learning and achievement of modeling and simulation practices, which can delineate the nature of productive
instructional pathways toward modeling and simulation proficiency. DOI: 10.1061/(ASCE)EI.1943-5541.0000338. © 2017 American
Society of Civil Engineers.
Introduction
Modern engineering workplaces commonly use modeling and
simulation practices, coupled with computational tools, to aid
the analysis and design of systems (Emmott 2008; McKenna
and Carberry 2012). As a result, modeling and simulation skills
have been integrated across many science and engineering disci-
plines as analytic tools that support the study of complex phenom-
ena and as predictive tools that can anticipate the suitability of new
designs. The discipline of civil engineering is no exception
(Lenox et al. 1997). Graduates in civil engineering are now ex-
pected to (1) identify the techniques, skills, and modern engineer-
ing tools that are necessary for engineering practice; (2) explain
how these techniques, skills, and modern engineering tools are used
in engineering practice; and (3) apply relevant techniques, skills,
and modern engineering tools to solve problems (ABET 2013;
ASCE 2008). At the same time, professionals in science and
engineering have emphasized the need for a new and modern
approach to educating and training the next generation of engi-
neering professionals to effectively complement experimental
and theoretical approaches to discovery and innovation processes
(e.g., NRC 2003, 2008; Thornton et al. 2009). When used in educa-
tional contexts, modeling and simulation skills and tools can further
support the integration of both divergent and convergent thinking
(Dym et al. 2005).
In spite of the emerging importance of the role of modeling and
simulation in science and engineering, faculty and educators are not
keeping pace with the need for graduates with this complex skill set
(NRC 2011; Guzdial 2011; WTEC 2009; Emmott 2008), particu-
larly at the undergraduate level. Similarly, graduate students aiming
to pursue advanced degrees in this area (e.g., M.S. or Ph.D.) arrive
without the proper prior preparation (Magana and Mathur 2012).
Consequently, a shortage of scientists and engineers who are ade-
quately prepared to take advantage of, or contribute to, such highly
interdisciplinary, highly computational scientific challenges is
evident (Shiflet 2002; WTEC 2009; Emmott 2008). Proficiency
with domain-specific software, numerical and scientific computing
programming languages, and computational tools has become an
essential form of literacy for contributing in the engineering
problem-solving process (Magana and Coutinho 2017).
The engineering education community has, along with policy-
makers, started to recognize the importance of these skills and has
recommended their incorporation into the undergraduate engineer-
ing curriculum. Specifically, the Transforming Undergraduate Ed-
ucation in Engineering report (ASEE 2013) indicated that industry
professionals highly value students' ability to use computational
tools to support problem solving. ABET (2013) also stated "the abil-
ity to use the techniques, skills, and modern engineering tools nec-
essary for engineering practice" as a student outcome. Additionally,
the Washington Accord (International Engineering Alliance 2011)
identified as a graduate attribute students' ability to apply knowl-
edge of mathematics, science, and engineering fundamentals to
the conceptualization of engineering models. It also identified as
a desirable attribute students' ability to create, select, and apply
appropriate techniques, resources, and modern engineering tools,
including prediction and modeling, to complex engineering activ-
ities, with an understanding of the limitations.
To successfully integrate these practices into undergraduate
engineering education, a first step is to identify the
requisite modeling and simulation skills, as well as students' levels
of proficiency and learning pathways in obtaining these skills suc-
cessfully. Educational research at the intersection of the learning
sciences and engineering education is needed to identify a suit-
able curriculum along with effective pedagogical methods and
learning strategies that can equip future workforce professionals
with practice-ready computational thinking (Vergara et al. 2009).
This interdisciplinary approach can increase the chances of taking
advantage of the promise of computation, modeling, and simula-
tion in engineering education sooner, better, and with greater
confidence.
1Associate Professor, Computer and Information Technology and
Engineering Education, Purdue Univ., 401 N. Grant St., West Lafayette,
IN 47906. E-mail: admagana@purdue.edu
Note. This manuscript was submitted on September 15, 2016; approved
on February 16, 2017; published online on May 11, 2017. Discussion per-
iod open until October 11, 2017; separate discussions must be submitted for
individual papers. This paper is part of the Journal of Professional Issues
in Engineering Education and Practice, © ASCE, ISSN 1052-3928.
A first step in the curriculum design process is to identify the
required enduring understandings (Kadiyala and Crynes 2000;
Chen et al. 2000) that can then guide the design of learning out-
comes (Wiggins and McTighe 1997). Enduring understandings
refer to statements summarizing important core ideas or skills of
a discipline that have a perpetual value beyond the classroom
(Wiggins and McTighe 2005). Learning outcomes can then inform
the design of pedagogical methods and assessment techniques that
will provide engineering educators and stakeholders with a way to
measure the progress and implementation of accredited programs
(Felder and Brent 2003). This research study focuses on the initial
step of identifying the required enduring understandings and skills
of modeling and simulation. To this end, the guiding research ques-
tion is: What are the required modeling and simulation practices to
be integrated as part of the engineering curricula at the undergradu-
ate and graduate levels?
The following sections present the practical and theoretical
grounding of this study. The paper starts with a general definition
of modeling and simulation and the identification of related proc-
esses. Some preliminary work in engineering education that has
aimed to characterize these skills is presented. Then the study is
grounded in the concept of learning progressions, and a case is
made regarding the need for a learning progression of modeling
and simulation in engineering education. The paper proceeds with
a description of how a Delphi method was implemented to generate
a preliminary version of a proposed learning progression, and pro-
vides thorough descriptions of how each data collection round was
implemented, as well as the outcomes of each stage. The paper con-
cludes with the presentation of the proposed learning progression
along with a discussion of the implications for teaching, learning,
and future research directions.
Modeling and Simulation Practices
Modeling and simulation refer to a combination of processes in
which a system's behavior is demonstrated or predicted by a reduc-
tive computational representation. These processes are highly inter-
related and at times are used interchangeably. For instance, Shiflet
and Shiflet (2014, p. 7) referred to modeling as "the application of
methods to analyze complex, real-world problems in order to make
predictions about what might happen with various actions." Maria
(1997), on the other hand, distinguished them by defining modeling
as the production of a model to represent the inner workings of a
system, and simulation as the operation of a model that can be
reconfigured and explored. The Department of Defense (2008)
has also made a similar distinction. Modeling was defined as "an
attempt to simulate an abstract model of a particular system,
such as natural systems, human systems and new engineering
technology" (DoD 2008, p. 8), while simulation was defined as
"a technique used for testing, analysis or training, where the model
represents a real-world system or concept" (DoD 2008, p. 9).
What professionals in this area have agreed upon is that these
two processes are combined and often undertaken in an iterative
cycle where conclusions derived from each simulation experiment
can feed back into the system under study until it results in the
desired altered system (Maria 1997).
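To make this distinction concrete, the following sketch (illustrative only; it is not from the study, and the cooling example and parameter values are hypothetical) separates the two processes in Maria's (1997) sense: the model is a mathematical description of the system's inner workings, and the simulation is the operation of that model, which can be reconfigured and rerun as part of the iterative cycle.

```python
import numpy as np

# Modeling: a mathematical representation of the system's inner workings.
# Here, Newton's law of cooling, dT/dt = -k * (T - T_ambient).
def cooling_model(T, k, T_ambient):
    return -k * (T - T_ambient)

# Simulation: operating the model over time under a chosen configuration.
def simulate(T0, k, T_ambient, dt=0.1, t_end=60.0):
    times = np.arange(0.0, t_end, dt)
    T = np.empty_like(times)
    T[0] = T0
    for i in range(1, len(times)):
        # Explicit Euler step of the model.
        T[i] = T[i - 1] + dt * cooling_model(T[i - 1], k, T_ambient)
    return times, T

# The iterative cycle: each simulation experiment feeds back into the
# configuration (here, trying different hypothetical cooling coefficients k).
for k in (0.05, 0.1, 0.2):
    times, T = simulate(T0=90.0, k=k, T_ambient=20.0)
    print(f"k={k}: temperature after {times[-1]:.0f} s is {T[-1]:.1f} C")
```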
Shiflet and Shiflet (2014) and earlier Maria (1997) described
the steps involved in the modeling and simulation process. Maria
(1997) summarized the modeling and simulation process in five
major activities: "model development, experiment design, output
analysis, conclusion formulation and making decisions to alter the
system under study." Shiflet and Shiflet described it in six main
steps: analyze the problem, formulate a model, solve the model,
verify and interpret the model's solution, report the model, and
maintain the model. During the analysis stage, individuals must
first understand the problem, define an objective, and determine
the classification of the problem. During the model formulation
stage, an abstraction and simplification of the system is formed,
often informed by existing data or theory. In this stage a model
is generated along with the corresponding variables, units, and
relationships between such variables. All these components will
help determine the equations of the model. In the solve the model
step, different tools, methods, and techniques must be considered
before implementing a proper solution. During the verification
and interpretation stage the model must be examined to verify
that its solution makes sense, and validated to ensure it solves
the original problem. Trade-offs between simplification and re-
finement must also be made at this point. Reporting on the model
is also an important stage where proper documentation is devel-
oped. This documentation needs to include details describing the
simplifying assumptions, a rationale for employing them, the
techniques used to solve the problem, source code, interpreta-
tions, implications, recommendations, and so forth. In the main-
tenance stage, corrections, improvements, and enhancements are
performed.
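As an illustration only (not part of the original paper), the sketch below walks a trivial problem through several of Shiflet and Shiflet's steps: formulate a model (exponential decay, a hypothetical choice), solve it numerically, and verify the solution against the known analytic answer. Reporting and maintenance would live in documentation and version control rather than in the script itself.

```python
import numpy as np

# Formulate a model: radioactive decay, dN/dt = -lambda * N, with the
# variables, units (atoms, 1/s), and their relationship made explicit.
decay_rate = 0.3   # lambda, 1/s (simplifying assumption: constant rate)
N0 = 1000.0        # initial number of atoms

# Solve the model: explicit Euler integration.
dt, t_end = 0.01, 10.0
times = np.arange(0.0, t_end + dt, dt)
N = N0 * np.ones_like(times)
for i in range(1, len(times)):
    N[i] = N[i - 1] - dt * decay_rate * N[i - 1]

# Verify and interpret: the numerical solution should track the analytic
# solution N0 * exp(-lambda * t); a large error would signal a bug or an
# inappropriate step size (a simplification/refinement trade-off).
analytic = N0 * np.exp(-decay_rate * times)
max_rel_error = np.max(np.abs(N - analytic) / analytic)
print(f"max relative error vs. analytic solution: {max_rel_error:.4f}")
assert max_rel_error < 0.02, "refine dt or switch solution methods"
```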
Although Shiflet and Shiflet described the steps involved in
modeling and simulation in a sequential form, they clearly empha-
sized that this process is actually very agile; steps can be performed
out of order or even simultaneously, iteratively going back to any
step to make revisions. Steps can also be distributed when per-
formed by different team members. This suggests that individuals
can possibly work on a single component or submodel in parallel
with other team members. Individuals can also work on verifying
existing models or enhancing or extending previous ones.
Modeling and Simulation in Engineering Education
Engineering educators have started to identify the breadth and
depth of modeling and simulation skills needed at the undergradu-
ate and graduate level. To this end, two main approaches have
been followed: One approach has been to identify industry needs
by conducting survey studies. For example, Vergara et al. (2009)
conducted a survey with diverse industry engineering sectors. The
study revealed that employers consider of high importance stu-
dents' abilities to "understand engineering principles and computa-
tional principles that allow them to use computational tools to solve
engineering problems by moving between physical systems and
abstractions in software" (Vergara et al. 2009). Specific abilities
identified included
1. Students' ability to characterize and solve problems at the op-
erational and conceptual levels, translating between the physical
and virtual world;
2. Students' ability to manage (e.g., collect, store, secure) data,
draw meaning from information, and communicate that infor-
mation to others in a meaningful way;
3. Students' ability to learn multiple software and computational
systems; and
4. Students' ability to use information technology (e.g., collabora-
tive tools, instant messaging) to increase business productivity.
A second approach has been to identify instructors' intended
learning outcomes for the integration of these skills into the
undergraduate and graduate engineering curricula. For instance,
Magana et al. (2012) conducted open-ended interviews with
14 faculty members who integrated computation and expert mod-
eling and simulation tools into their undergraduate and graduate
courses. Their analysis revealed eight different goals that the in-
structors wanted to accomplish when incorporating simulation
tools as learning activities into courses they were teaching at
the time. These eight categories are summarized into two major
learning goals: (1) using simulations to identify and describe the
governing fundamental physical principles or behaviors of devi-
ces, materials, and other artifacts; and (2) building simulations to
apply modeling and computational techniques to approach engi-
neering design tasks. Specific skills instructors aimed to develop
in their students are
1. To recognize and be aware of the potential role of computational
simulations in a particular field of science and engineering;
2. To measure materials or devices by collecting data as in a
laboratory experiment;
3. To explain the cause-effect relationship of a given underly-
ing model;
4. To test the accuracy of a given model and/or its implementation;
5. To validate the results of the product of a design task;
6. To implement computational techniques for the creation of
computer models;
7. To predict the results of an experiment in a design task; and
8. To discriminate between models to represent a given physical
phenomenon.
The skills identified suggest that students may be required to
know when, why, and how computational methods work and how
to apply or configure existing numerical methods or methodolo-
gies to successfully solve problems or design solutions (Hu 2007).
Consequently, additional knowledge is required to identify what
other specific skills are needed in each stage of the modeling and
simulation process.
Learning Progressions in Science
Learning progressions (LPs) describe "empirically grounded and
testable hypotheses about how students' understanding of, and
ability to use, core scientific concepts and explanations and related
scientific practices grow and become more sophisticated over
time, with appropriate instruction" (Corcoran et al. 2009). In other
words, a LP is a theoretically defined road map that shows how
students' understanding of particular ideas or practices may build
over time with appropriate instruction. Identification of learning
progressions is an important endeavor because LPs (1) provide
educators with rich information about the status of students' knowl-
edge, (2) offer a roadmap indicating how close students are to
targeted levels of attainment and understanding, (3) suggest which
conceptual gaps need to be addressed in instruction in order to
move students along in their thinking, (4) are applicable across
multiple disciplines to support the development of skills and learn-
ing across multiple years of learning, and (5) are useful frameworks
for guiding the coherent development of curriculum, assessment,
and instruction (Duncan and Hmelo-Silver 2009;Smith et al. 2006;
Corcoran et al. 2009).
Two main steps in crafting a learning progression involve
unpacking the learning goals (detailing the implicit understand-
ings they entail) and organizing them into a logical framework
(Krajcik et al. 2008). These logical frameworks have been de-
scribed as having five characteristics: they (1) are focused on a
few foundational and generative disciplinary ideas and practices,
(2) are bounded by upper and lower anchors describing what
students are expected to know and be able to do by the end of
the progression, (3) provide a set of assumptions about the prior
knowledge and skills of learners as they enter the progression,
(4) describe varying levels of achievement (i.e., learning perfor-
mances) as the intermediate steps between the two anchors, and
(5) are mediated by targeted instruction and curriculum (Duncan
and Hmelo-Silver 2009).
Over the last decade several researchers have developed LPs in
science education. A relevant LP for this study is the one developed
by Schwarz et al. (2009) for scientific modeling. The main goal of
their LP is to engage students in model-based reasoning, which
consists of exposing them to the process of creating, testing, revi-
sing, and using externalized scientific models that may reflect their
own mental models (Schwarz and White 2005). Schwarz et al.
(2009) have defined scientific modeling as consisting of two
dimensions: (1) practices such as constructing, using, evaluating,
and revising scientific models, and (2) the metaknowledge of mod-
els such as the understanding of the nature and purpose of models,
which guides and motivates the practice. This learning progression
combine[s] "metaknowledge and elements of practice" (Schwarz
et al. 2009, p. 632). Fig. 1 depicts how Schwarz et al. described
modeling practice as the interaction between metamodeling knowl-
edge and elements of practice. Sensemaking and communicating
understanding emerge from the use of these two elements.
The practice dimension has been embodied into a set of four
learning goals: (1) construct models consistent with prior evi-
dence and theories to illustrate or explain phenomena; (2) use mod-
els to illustrate, explain, and predict phenomena; (3) compare and
evaluate the ability of different models to accurately represent and
account for patterns in phenomena, as well as to predict new phe-
nomena; and (4) revise models to increase their explanatory and
predictive power, taking into account additional evidence or aspects
of a phenomenon. The metaknowledge dimension includes an
understanding that (1) models change to capture improved under-
standing built on new findings, and (2) models are generative tools
for predicting and explaining.
The use of LPs is proposed as a framework that can allow their
use across multiple engineering disciplines and support the devel-
opment of modeling and simulation practices across multiple years
of learning (Duncan and Hmelo-Silver 2009; Smith et al. 2006).
Schwarz et al.'s (2009) LP for scientific modeling, specifically
the first dimension describing scientific practices, was used as a
baseline for this work. Because developing and fully validating an
LP is a lengthy process, the scope of this study is limited to proposing a
preliminary LP that will be iteratively refined and validated through
empirical and theoretical work.
Fig. 1. Interaction between metamodeling knowledge and elements
of practice (adapted from Schwarz et al. 2009)
Methods
The aim of this study is to identify enduring understandings and
skills that can inform the development of a preliminary learning
progression of modeling and simulation practices in undergraduate
and graduate engineering education. To the author's knowledge,
there is limited preliminary work in the area of engineering edu-
cation on this topic, and therefore an appropriate source for such
knowledge is engineering experts with extensive previous experi-
ence in the use of modeling and simulation for research and inno-
vation. To consider divergent opinions, a Delphi survey (Dalkey
et al. 1969) of professionals in industry and academia was selected
as the data collection method. The Delphi method or modified
versions of it have been widely used in engineering education as
a mechanism to identify concepts and importance of difficult con-
cepts in engineering (e.g., Streveler et al. 2003; Prince et al. 2012).
It has also been used as a mechanism to inform the design of
engineering and technology curricula (Rossouw et al. 2011). For
instance, Balogh and Criswell (2013), using a modified version of
the Delphi method, created a framework of knowledge "to identify
and quantify the needs of the profession by an assessment of
the expected level of achievement in each of a number of structural
engineering topics using Bloom's taxonomy." Similarly, Cortes
et al. (2011) identified a framework for including occupational
risk-prevention education in Spain's undergraduate engineering
programs. Li and Fu (2012), in addition to identifying a context-
specific engineering ethics curriculum using the Delphi method,
also identified appropriate delivery strategies and instructional
methods.
The Delphi method (Dalkey et al. 1969) consists of a series of
data collection rounds during which participants are presented with
a series of statements. Participants are asked to make judgments
on the statements or provide comments on the items presented.
Participant responses are anonymous to other partic-
ipants but not to the researcher. Results from each round are ana-
lyzed, and this analysis informs the design of the following round.
The main outcome of this method is a consensus on the topic. This
consensus is reached by participants' views converging through a
process of feedback and decision making (Duffield 1993). Feed-
back between rounds can be supplied with statistical mean results
either alone or together with a summary of comments provided
by panel members. This process allows the investigator to reach
consensus in a nonadversarial manner, and provides participants
with an opportunity to consider elements they may have missed in
previous rounds (Hasson et al. 2000).
Although the primary aim of a Delphi study is to gain perspec-
tive about the most important issues, researchers have also devel-
oped variations of the method tailoring it to specific problem types
(Okoli and Pawlowski 2004). For example, Kendall et al.'s (1992)
study emphasized differences of opinions in order to develop alter-
native perspectives. Another type of application for a Delphi study
relates to the development of a concept or framework. This type of
design usually involves a two-step process. The first step consists
of the identification of and elaboration on a set of concepts, which
is then followed by a classification (Okoli and Pawlowski 2004).
This is precisely the intended goal of the use of the Delphi method
for this particular study. Outcomes of the Delphi method were then
used as inputs to generate a preliminary learning progression of
modeling and simulation practices in engineering education. Fig. 2
presents an overview of the procedures for data collection.
Composition of the Panel and Recruitment
Participants of a Delphi study are required to have knowledge
and interest in the topic being investigated. Balance must then
be sought by selecting experts who will be relatively impartial
(Hasson et al. 2000). To form the participating panel, guidelines
from Okoli and Pawlowski (2004) were followed. First, two panels
were formed, one panel being industry practitioners and the other
panel being faculty in academia. The goal for forming two panels
was to compare the perspectives of these two different stakeholder
groups. To identify experts, a knowledge resource nomination
worksheet was prepared to help categorize and avoid overlooking
any important class of experts. The next step consisted of popu-
lating the worksheet with names. Different lenses were used for
considering experts, including years of experience, academic
Fig. 2. Summary of procedures for data collection
background, and discipline in science or engineering. A baseline
procedure was established first by going through the investigator's
personal list of contacts. A second step was to identify authors from
related national reports. Nominations were then taken from other
participants in the study. These nominations were reviewed through
online searches for their biographies, and, based on their back-
grounds and experience, a decision was made about whether or
not to consider them as participants of the study. A preliminary list
of 76 possible participants was formed. After a ranking process and
comparison against the selection criteria, 55 participants were contacted for invi-
tation to the study. From those, 37 agreed and completed the first
round and 32 of the participants completed the second round, with a
third of them providing minor comments and revisions during the
last round.
Participants included 37 experts; 18 were from academia (10
full professors, five associate professors, and three assistant profes-
sors) and 19 were from industry (seven managers, seven engineers,
three researchers, and two marketing professionals). Their academic preparation
consisted of doctoral degrees (n = 26), master's degrees (n = 9),
and bachelor's degrees (n = 2). This group of individuals repre-
sented a spectrum of diverse engineering and science disciplines
including nine participants from mechanical engineering, six from
electrical engineering, five from aerospace, aeronautics, or astro-
nautics engineering or astronomy and astrophysics, four from civil
engineering, three from materials engineering, two from applied
mathematics, two from computer science, two from chemical en-
gineering, one from robotics, one learning scientist, and two more
who did not identify their discipline. There were 23 males and
14 females.
The experts' years of experience with modeling and simulation
ranged from more than 20 years (n = 8), to between 10 and
20 years (n = 25), to less than 10 years (n = 4). Many of these
individuals were first exposed to modeling and simulation practices
during their bachelor's degrees (n = 14), primarily as part of their
capstone design courses toward the end of their undergraduate stud-
ies. Fifteen were first exposed to modeling and simulation skills as
part of their graduate studies (eight during their M.S. studies and
seven during their Ph.D. studies). Six reported being first exposed
to modeling and simulation skills through their job and two had
their first exposure in high school.
Data Collection and Data Analysis Methods
Traditional data collection methods for Delphi studies are qualita-
tive questionnaires. Alternatively, qualitative data can be collected
through other sources, such as focus groups, interviews, or docu-
ment analysis, and used to create a quantitative first round of the
Delphi (Hasson et al. 2000). This second approach requires an
additional pilot test with a small group before conducting the first
implementation. This Delphi study consisted of three rounds of
data collection. Between each round, a summary of opinions was
provided to each participant. Providing a summary of opinions be-
tween rounds allowed consensus to be reached by two or at most
three rounds (Duffield 1993).
The first round consisted of identifying the level of importance
or need of a set of 30 modeling and simulation skills. This first set
of skills was derived from eight national reports identified
after conducting a document analysis of a total of 24
documents. Table 1 lists the eight reports that generated the prelimi-
nary list of modeling and simulation practices. These practices were
supplemented with information identified through the literature
review in engineering education. The preliminary survey was va-
lidated through three different mechanisms. First, content validity
was based on the literature review. The list was also reviewed for
face validity including accuracy, completeness, and possible re-
dundancy by two doctoral students in computational science and
engineering, and three experts (two full professors and one industry
practitioner) in computational science and engineering. To pilot
the list, face-to-face cognitive interviews were conducted with three
additional experts. Cognitive interviews have been proposed as an
appropriate mechanism to improve the validity and reliability of
surveys (Desimone and Le Floch 2004). The goal of this type of
interview is to gain insights about how respondents interpret survey
questions with the goal of improving the quality of questionable
ones (Desimone and Le Floch 2004). During the interviews, ex-
perts were asked to go through survey items and state aloud their
thinking about what they thought the question was asking as well as
describing their thinking regarding how they would answer each of
them and why (Van Someren et al. 1994). This process identified
possible conceptual or grammatical difficulties within each ques-
tion and provided other miscellaneous feedback. The official data
collection started afterward.
Results
Round 1: Validation of the Modeling and Simulation
Skills and Rating Their Importance
This first round consisted of two main sections. The first section
collected background information in which participants stated their
occupation, their academic background, their prior experience with
modeling and simulation, number of years of experience applying
these skills, and a description of when in their academic or profes-
sional careers they were first introduced to these practices. The
second section consisted of a Likert-scale survey in which partic-
ipants, according to their opinion and experience, ranked the
level of importance of 30 different modeling and simulation skills
needed in current engineering workplaces. Descriptive statistics
were first used to report measures of central tendency for the Likert-
scale survey for each panel. Inferential statistics (a chi-square test) were
then used to identify significant differences between the distribu-
tions of responses of the two panels.
Appendix I summarizes the results from each of the two panels
(i.e., industry and academia) and for each of the identified modeling
and simulation skills. The table also depicts means, standard devi-
ations, and modes as calculated for each of the panels. To identify if
the two groups were significantly different in their ratings, the last
column reports the p-value. Due to the series of comparisons per-
formed between industry and academia groups for each question,
the alpha level, originally established at 0.05, was divided by the
number of total questions (i.e., the Bonferroni adjustment) to offset
the chances of a Type I error.
Table 1. National Reports That Generated the Preliminary List of
Modeling and Simulation Practices (adapted from Magana and
Coutinho 2017)

Organization: Report
Association for Computing Machinery and IEEE: ACM-IEEE (2013)
Department of Defense: DoD (2006)
Department of Defense: DoD (2008)
Department of Energy: DOE (2010)
National Science Foundation: NSF (2004)
National Science Foundation: NSF (2006)
National Science Foundation: NSF (2011)
N/A: Shiflet and Shiflet (2014)
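The per-item panel comparison described above can be reproduced with standard tools. The sketch below uses hypothetical counts (not the study's data) to apply scipy's chi-square test of independence to one item's Likert-response distribution across the two panels, with the Bonferroni-adjusted alpha of 0.05/30.

```python
from scipy.stats import chi2_contingency

# Hypothetical 2 x 5 contingency table for one survey item:
# rows = panels (industry, academia), columns = Likert ratings 1..5.
item_counts = [
    [1, 1, 3, 7, 7],   # industry panel (n = 19), made-up counts
    [1, 2, 4, 7, 4],   # academia panel (n = 18), made-up counts
]

alpha = 0.05 / 30  # Bonferroni adjustment across the 30 items

chi2, p, dof, expected = chi2_contingency(item_counts)
verdict = "different" if p < alpha else "not significantly different"
print(f"chi2 = {chi2:.2f}, p = {p:.3f}: panel distributions are {verdict}")

# With samples this small, expected cell counts are low, which is one
# reason the study double-checked results with a Wilcoxon-Mann-Whitney test.
```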
Overall results suggest that participants from all disciplines
ranked all the modeling and simulation skills as needed to highly
needed in workplace engineering. This initial level of agree-
ment also suggests that these practices are equally relevant to all
disciplines. Results also suggest that there are no significant differ-
ences between industry and academia experts' opinions regarding
the importance of each of these skills. The only skill where agree-
ment was not reached was "scrutinize and upgrade any portion of
a model or simulation in a controlled manner." Industry experts,
on average, found this skill to be highly important for workplace
engineering, while experts in academia only found it to be impor-
tant. Considering the sample size, the results from the chi-square
test were double-checked against results from the Wilcoxon-Mann-
Whitney test, and the same significant differences were identified.
In a different analysis (Magana and Coutinho 2017), individual
t-tests were performed item by item. That additional analysis
identified a second statement where significant differences were
identified, "involve simulations and experiments (or field data)
interactively to improve the fidelity of the simulation tool, its ac-
curacy, and reliability," where experts from industry also appear to
value this applied practice more than the experts from academia.
Participants were also prompted to identify additional required
skills not provided in the original list. Twenty participants provided
statements suggesting additional skills. The suggested skills were
compared and contrasted with each of the existing skills from
the list, and from the total only three were considered as not explic-
itly included in the original set. The three suggested skills are
(1) development of intuition and being critical of results (e.g., by
identifying assumptions and limitations), suggested by six partic-
ipants; (2) ability to learn a variety of software packages (once the
fundamental aspects of computational methods are established),
suggested by five participants; and (3) ability to perform basic
economic evaluation of value for simulations, suggested by two
participants. A participant who suggested the skill for developing
intuition and being critical of results wrote: "Computational tools
are widely available for solving multi-physics problems. I feel that
graduates today need to be able to use these tools and skeptically/
analytically evaluate (validate/verify) the results. At this time,
the average graduate does not need to know how to program the
algorithms to solve the problems but rather to truly use their
engineering 'intuition' and skills to understand if the solution is
correct and to choose the correct methods and inputs/meshing/
boundary conditions."
These statements were used after the final round of the Delphi
study to refine and enhance the final version of the preliminary
learning progression.
Round 2: Categorization and Ranking of the Modeling
and Simulation Skills
The second round of data collection started by contacting partic-
ipants from each panel and sharing with them a summary of the
results gathered from the first round (Hsu and Sandford 2007;
Hasson et al. 2000). No significant differences were found between
opinions gathered from both panels in the first round (see last
column of Appendix I); consequently, the summary of the results
included overall means and standard deviations for each statement
derived from both panels.
Following guidelines from the Delphi method (Hsu and
Sandford 2007), the second round consisted of a two-step process.
In the first step, participants were prompted to first categorize each
of the modeling and simulation skills into one of three bins or cat-
egories, identifying the curricular level in which each skill should
be incorporated (i.e., freshmen and sophomore, junior and senior,
master and doctoral level). In the second step, for each of the cat-
egories (bins), participants were prompted to rank each statement
by decreasing level of importance going from mastery, to profi-
ciency, to familiarity (e.g., the most important skills should be
ranked at the top). The definitions provided for each level were
as follows: Mastery was defined as student ability to consider a
concept from multiple viewpoints and/or justify the selection of
a particular approach to solve a problem. This level of mastery
implies more than using a concept; it involves the ability to select
an appropriate approach from understood alternatives. Proficiency
was defined as a student's ability to use or apply a concept in a
concrete way. Using a concept may include, for example, appropri-
ately using a specific concept in a program, use of a particular
proof or technique, or performing a particular analysis. Familiarity
was defined as a student's ability to understand what a concept is
or what it means. This level of mastery concerns a basic aware-
ness of a concept as opposed to expecting real facility with its
application.
Data for the second round were analyzed separately for each of
the panels. The first step consisted of identifying the counts for
each statement in each of the three categories: (1) freshmen and sophomore
level, (2) junior and senior level, and (3) M.S. and Ph.D. level.
Once the counts in each category were identified, the next step con-
sisted of comparing the distributions for both panels and for each of
the questions. This analysis was performed using a chi-square test.
To offset the chances of Type I error due to the multiple com-
parisons, a Bonferroni adjustment was performed and the alpha
value was set to α = 0.00167. After comparing the distributions
from each question it was determined that all of them had similar
distributions; therefore, the two groups were merged to proceed
with the categorization by academic level and ranking by level of
mastery.
Each of the statements describing a specific modeling and sim-
ulation skill was first placed in one of three categories: (1) freshmen
and sophomore level, (2) junior and senior level, or (3) M.S. and
Ph.D. level. This categorization was done by identifying the highest
counts for each group. Once the categories were formed, the skills
were ranked within each cluster in order to organize them by level
of proficiency. The top item for each cluster was based on the high-
est ranking provided by the overall participant count, and so forth
with each succeeding item. Appendix II depicts the categorization
and ranking for each of the statements, along with the results of the
chi-square test reporting similar distributions of categories between
both panels.
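The binning-and-ranking logic just described can be sketched as follows (the skill names, votes, and ranks are hypothetical; the study's actual counts appear in Appendix II): each skill is placed in the level where it received the most votes, and skills within a level are ordered by their aggregate rank.

```python
from collections import Counter

LEVELS = ["freshmen/sophomore", "junior/senior", "M.S./Ph.D."]

# Hypothetical merged-panel votes: skill -> one level vote per expert.
votes = {
    "visualize data":      ["freshmen/sophomore"] * 20 + ["junior/senior"] * 12,
    "verify a simulation": ["junior/senior"] * 10 + ["M.S./Ph.D."] * 22,
}
# Hypothetical importance ranks (lower = more important) for within-level order.
ranks = {"visualize data": 1, "verify a simulation": 3}

# Step 1: place each skill in the level with the highest vote count.
bins = {level: [] for level in LEVELS}
for skill, skill_votes in votes.items():
    modal_level, _ = Counter(skill_votes).most_common(1)[0]
    bins[modal_level].append(skill)

# Step 2: within each level, order skills by decreasing importance.
for level in LEVELS:
    bins[level].sort(key=lambda s: ranks[s])
    print(level, "->", bins[level])
```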
Round 3: Reach Consensus on the Categorized and
Ranked Modeling and Simulation Skills
The goal of the third round was to revise the collective responses
and reach consensus and stability among the respondents. Partic-
ipants were provided with the list of skills organized by possible
grade level (i.e., freshmen and sophomore, junior and senior,
M.S. and Ph.D.). At this point, Schwarz et al.'s (2009) practices for
scientific modeling were used as a way to further categorize each
of the skills within the three groups formed in the previous step.
The practices are (1) construct models, (2) use models, (3) compare
and evaluate models, and (4) revise models. This categorization
was conducted separately (internally) by two educational
researchers, two computational scientists, and one computer engi-
neer. Categorizations were compared and contrasted, and consen-
sus was reached for the cases in which disagreements were
identified. When more than one skill was grouped within each of
the specific practices for scientific modeling, the rank information
was used to order the skills as part of the category (Appendix III).
At this point panel participants were asked to indicate whether they
agreed with the proposed organization and categorization of skills,
whether they proposed any changes, and if so, to provide a rationale.
Participants' responses were then analyzed qualitatively and sug-
gestions were incorporated as part of the final version of the pro-
posed progression.
One-third of the participants (seven from industry and six from
academia) provided feedback and revisions regarding the categori-
zation and ranking, as well as feedback on the clarity of the de-
scriptions of each of the modeling and simulation skills. The first
step in analyzing the provided feedback consisted of identifying
those suggestions that focused on reorganization of statements in
different categories or levels. Each suggestion was evaluated indi-
vidually, and when it was well justified, it was taken into further
consideration. For example, an expert from industry suggested
that the following statement be moved from the junior and senior
level to the M.S. and Ph.D. level: "Identify existing algorithms
or computational methods to describe physical models and engi-
neered systems as computational representations of simulations."
The following rationale was provided: "… while I could see
doing this for a specific problem or two as a senior, I tend to see
this general capability requiring deeper penetration than a typical
undergrad has capacity for." The skill in question was then com-
pared against findings from Magana et al. (2012), which identified
that the skill requiring the highest-order level of thinking was stu-
dents' ability to discriminate between models used to represent a
given physical phenomenon. Therefore, a decision was made to
relocate such a statement.
Once recategorization of statements was taken into considera-
tion, the next step focused on revisions suggested to reduce ambi-
guity in the statements. For example, two statements were further
refined based on the following comment from another industry
expert: "I feel that quantifying reliability ties into quantifying the
uncertainty and questions about the validity of the model for the
given case, which gets advanced. I feel I've witnessed more junior
engineers wrongly trusting results because they didn't understand
how to properly quantify the reliability. I would expect this is more
an 'acknowledge' skill at the Junior/Senior level and a 'quantify'
skill at the Master's/Ph.D. level."
Based on this comment, the wording of the corresponding
statements was revised to distinguish between being aware or
familiar with reliability considerations and actually being able to
quantify or determine them. A similar revision was performed at
the freshmen and sophomore level to distinguish between basic
versus complex models. For instance, one participant from aca-
demia commented: "Although I agree that students at this level
should be able to construct mathematical models for abstract con-
cepts, that would only work if they have the technical knowledge
required. If we are looking at basic principles like speed, accel-
eration, voltage, then I agree. If we are expecting more advanced
mathematical models of engineering systems, then that would be a
junior/senior level skill."
This comment emphasizes that simple models (as opposed to
complex models) should be introduced at the freshmen and sophomore
levels. In addition, two participants, both of them from academia,
offered general suggestions for revising the presentation of the
statements because the originals were "hard to read." The state-
ments were then simplified to eliminate "big words" or information
that was too specific (e.g., visualization of tensor fields). Similarly,
when a skill was related to an evaluation or comparison process, the
statement was refined to specify the elements being compared, or
the criteria being evaluated against. The following is a comment
from an expert from academia: "I find the following statement very
vague: 'Identify tradeoffs of modeling and simulation including
performance, accuracy, validity, and complexity (M = 4.62, SD =
0.59, Rank = 2).' Tradeoffs are in comparison to something. I am
not sure what the students are comparing to. If the question means
to determine when a model might be more appropriate compared to
doing an experiment or when to use models, then this is appropri-
ate. If it is between two software packages, or modeling compared
to some other way of finding the information then this seems maybe
to be at a higher level."
Other suggestions addressed ways to simplify the modeling and
simulation practices contained in the first level or make them more
accessible to students. For example, one participant from academia
suggested that students can be first introduced to the process by
creating existing models using standard application programming
interfaces, as opposed to starting with high-level programming
languages. The rest of the comments were in line with "I agree
with your finding and I don't have any suggested changes.
Nice work!"
Preliminary Learning Progression
At this point, a preliminary learning progression was formed by
transforming each of the modeling and simulation skills into per-
formances. Appendixes I–III identify specific details of the trans-
formation throughout the three rounds from a set of modeling
and simulation skills to a set of performances. A skill refers to an
ability that a person may possess, while a performance refers to the
observable use or application of that skill. The transformation con-
sidered each statement and rephrased it as the action or process of
applying the skill. A first step consisted of visually aligning each
skill with its corresponding performance in a table side by side.
By doing so, it was identified that some of the skills were highly
related, and based on the feedback from the third round, it was
deemed necessary to simplify some of the statements. For example,
an expert in modeling and simulation who is also an educational
researcher mentioned, "Overall I find the criteria hard to read and
would have difficulty using them to design courses and curricula
at different levels." The simplification of statements also led to the
combination of two or more statements into one. For example, the
skills "visualize the data (e.g., scalars, vector, and tensor fields)
collected experimentally from multiple sources" and "use stan-
dard application programming interfaces (APIs) and tools to cre-
ate visual displays of data, including graphs, charts, tables, and
histograms" were considered as very similar. Both of them refer
to students' ability to visualize data, and therefore these two
were combined into the single statement: "Students construct vi-
sual displays of data, including graphs, charts, tables, and his-
tograms using standard domain-specific software, application
programming interfaces, or built-in libraries of scientific computing
software."
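As a concrete instance of that combined statement (illustrative only; the data and the strain-reading scenario below are made up), a student might use a plotting library's standard API to construct several visual displays from one dataset:

```python
import matplotlib.pyplot as plt
import numpy as np

# Made-up measurements from two sources, e.g., repeated strain readings.
source_a = np.random.default_rng(0).normal(loc=5.0, scale=0.4, size=50)
source_b = np.random.default_rng(1).normal(loc=5.3, scale=0.6, size=50)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))

# Histogram: distribution of each source's readings.
ax1.hist([source_a, source_b], bins=10, label=["source A", "source B"])
ax1.set_xlabel("reading")
ax1.set_ylabel("count")
ax1.legend()

# Line chart: readings in collection order, to expose trends or drift.
ax2.plot(source_a, label="source A")
ax2.plot(source_b, label="source B")
ax2.set_xlabel("sample index")
ax2.set_ylabel("reading")
ax2.legend()

fig.tight_layout()
plt.show()
```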
The preliminary learning progression was iteratively revised by
the researcher and an expert with extensive experience in modeling
and simulation, as well as extensive experience in engineering
education research. Due to these minor changes, a final round of
face-to-face cognitive interviews (Desimone and Le Floch 2004)
was performed again with eight additional experts. Four experts
were advanced doctoral students in engineering education with
educational or teaching backgrounds involving modeling and sim-
ulation practices. Four more experts were participants from the
Delphi study, two from industry and two from academia.
During each cognitive interview, participants were presented
with a version of the table aligning skills and performances, along
with a rationale of how or why some of the statements were
changed or merged (see Appendix IV for an example). Participants
were asked to compare and contrast the statements one by one, side by
side, and respond to the following questions: Is the statement of the
second column accurately capturing the information from the first
column? Would you please suggest any rewording changes in order
to better represent the original statement? Are there any other
statements that need to be merged into one because they are very
similar? After participants reviewed the statements one by one,
they were asked two more final questions: Considering newly re-
vised statements (i.e., performances) as a whole, can you identify
repeated information that needs to be removed or merged with
another one? For every suggestion provided by a participant, they
were prompted to state their rationale. This rationale was captured
in the last column, as shown in Appendix IV. The final version of
the preliminary learning progression after feedback from the seven
cognitive interviews is depicted in Table 2.
Table 2. Preliminary Learning Progression of Modeling and Simulation in Engineering Education

Level 1
Construct models:
- Students construct visual representations of data, such as graphs, charts, tables, and histograms, using standard domain-specific software, application programming interfaces, or built-in libraries within scientific computing software
- Given a simple model, students identify the corresponding mathematical model and use computer-programming methods or APIs to implement an appropriate algorithm representing abstractions of reality via mathematical formulas, constructions, equations, inequalities, constraints, and so forth
Use models:
- Students use existing computational models or simulations to comprehend, characterize, and draw conclusions from visual representations of data by evaluating appropriate boundary conditions, noticing patterns, identifying relationships, assessing situations, and so forth
Evaluate models:
- Students compare the results of models and simulations to laboratory experiments, theory, measurements, test cases, and so forth, to determine their alignment, overlap, or goodness of fit, among other metrics
Revise models:
- Students extend or adapt simple models from one situation to another, either by configuring the model through a graphical user interface or by modifying or extending existing code

Level 2
Construct models:
- Students connect simulation and visualization by first visualizing data using numerical outputs from a simulation and then interacting with the visualization to engage in critical thinking about the simulated model
- Students implement simple computational models by creating discretized mathematical descriptions of an event or phenomenon using high-level programming languages or scientific computing software
Use models:
- Students use simulations at different scales to deploy the correct solution method, inputs, and other parameters to explore theories and identify relationships between modeled phenomena
- Students use computational models or simulations to design, modify, or optimize materials, processes, products, or systems
- Students use computational models or simulations to design experiments to test theories, prototypes, products, materials, and so forth
- Students use computational models or simulations to infer and predict physical phenomena or the behaviors of engineered systems
Evaluate models:
- Students evaluate the benefits and disadvantages of competing computational models or simulations by determining and weighing factors such as assumptions, limitations, precision, accuracy, reliability, validity, and complexity
- Students acknowledge and estimate uncertainty as part of the interpretation of simulation predictions
Revise models:
- Students use external data, theories, or additional simulation tools to calibrate, verify, or improve the accuracy of computational models or simulations

Level 3
Construct models:
- Students construct new computational models or simulations by developing algorithms and methods that simulate physical phenomena and engineered systems
Use models:
- Students interface computational models or simulations directly with measurement devices such as sensors, imaging systems, real-time control systems, and so forth
Evaluate models:
- Students discern between different algorithms or computational methods to describe physical models or engineered systems as computational representations
- Students determine and quantify the reliability of computer simulations and their predictions
- Students determine variability in data due to immeasurable or unknown factors via uncertainty-quantification methods or techniques
- Students evaluate algorithms by determining uncertainties and defining error, stability, machine precision concepts, and the inexactness of computational approximations (e.g., convergence, including truncation and round-off)
- Students verify a simulation model based on software engineering protocols, bug detection and control, and scientific programming methods
- Students validate a simulation model based on prescribed acceptance criteria such as observations, experiments, experience, and judgment
Revise models:
- Students identify the mechanisms for exchanging information to bridge models across scales and maintain computational tractability
- Students iteratively and systematically evaluate and improve the fidelity, accuracy, reliability, performance, and cost (monetary and computational) of their computational models or simulations

Discussion and Implications for Teaching and
Learning
The proposed preliminary learning progression extends prelimi-
nary work by (1) proposing the use of learning progressions as
frameworks for designing curriculum and instruction in higher
education, (2) outlining specific modeling and simulation perfor-
mances for higher education that expand those proposed for the
kindergarten to Grade 12 level (e.g., Schwarz et al. 2012), and
(3) providing a more detailed investigation of modeling and sim-
ulation practices previously identified in engineering education
(e.g., Vergara et al. 2009; Magana et al. 2012). The proposed pre-
liminary learning progression is also an initial step toward the
thoughtful integration of these practices into engineering higher
education.
The learning progression approach is appropriate and relevant because it can guide the development of curricula, pedagogies, and assessments. As mentioned previously, traditional instructional design practices propose identifying enduring understandings as an initial step for designing learning activities (Wiggins and McTighe 2005). These enduring understandings are then usually transformed into specific learning objectives that can guide the design of learning experiences. This paper proposes that in addition to identifying enduring understandings, educators and instructional designers should also identify enduring disciplinary practices, such as the ones proposed in this preliminary learning progression. These enduring disciplinary practices can then be combined with enduring understandings to synergize both ideas. This combination may result in deeper, more meaningful, and authentic learning. In doing so, faculty must be aware that the notion of model in this preliminary learning progression is used in a generic way throughout the list of performances. It is important to realize that models come at a wide range of degrees of complexity, and therefore many of the performances described here are quite broad in nature. For example, the Level 1 performance for evaluating models can be trivial for single-parameter models with directly comparable and robust experimental data, but it can also turn into a very complex project. For instance, if the model consists of many unknowns and the experimental data need to be reduced and processed, then the degree of complexity is high. The proposed preliminary learning progression is flexible in this regard, leaving it up to professors to be more explicit about model types, methods, and algorithms when these are aligned with disciplinary learning objectives. This generic nature of the learning progression is also intentional so it can be adopted and adapted not only in civil engineering but also across a wide range of engineering disciplines and contexts, and even in other science domains (e.g., computational biology).
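To make the simple end of that spectrum concrete, the following minimal sketch shows what a Level 1 evaluation exercise could look like for a single-parameter model compared against directly comparable experimental data. The exponential-decay model, the synthetic data, and all names here are illustrative assumptions, not materials from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical single-parameter model: exponential decay y(t) = exp(-k * t).
def model(t, k):
    return np.exp(-k * t)

# Stand-in "experimental" data; in a course this would come from a lab exercise.
t_data = np.linspace(0.0, 5.0, 20)
rng = np.random.default_rng(seed=1)
y_data = np.exp(-0.8 * t_data) + rng.normal(0.0, 0.02, t_data.size)

# Calibrate the single free parameter against the measurements.
(k_fit,), pcov = curve_fit(model, t_data, y_data, p0=[1.0])

# Evaluate the model: root-mean-square error as a simple goodness-of-fit metric.
rmse = np.sqrt(np.mean((model(t_data, k_fit) - y_data) ** 2))
print(f"fitted k = {k_fit:.3f}, RMSE = {rmse:.4f}")
```

At the other end of the spectrum, the same performance could require reducing and processing raw multichannel measurements before any comparison is possible, which is where the degree of complexity grows.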
The preliminary learning progression does not consider notions of programming or mathematics prerequisite knowledge and skills, which are crucial for the successful integration of these practices into disciplinary courses. In classroom studies, prior knowledge in mathematics and programming has been identified as significantly influencing how students approach and benefit from the modeling and simulation process (Magana et al. 2016, 2017; Magana and Coutinho 2017). Researchers have also identified that students experience more challenges when the modeling and simulation process involves a programming task (Magana et al. 2016), particularly as related to the mapping from mathematical representations to algorithmic representations (Magana et al. 2017); a minimal sketch of this mapping follows below. Lessons learned from previous classroom studies suggest that once enduring understandings of modeling and simulation practices are identified, a second step is to design proper instruction and supports that consider students' prior knowledge and skills in programming, mathematics, and engineering (Vieira et al. 2016b, 2017). The ongoing research program will continue to implement classroom studies with the goal of providing educators with guidelines or pedagogical principles to maximize the effects of the use of modeling and simulation for learning (e.g., Magana et al. 2013; Vieira et al. 2016a; Alabi et al. 2015). Integration of prior programming, mathematics, and disciplinary knowledge is beyond the scope of this paper.
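As a concrete illustration of the mapping difficulty noted above, the sketch below shows the step from a mathematical representation to an algorithmic one for a deliberately simple model. The decay equation, parameter values, and function name are hypothetical choices for illustration, not examples drawn from the cited classroom studies.

```python
# Mathematical representation (illustrative): dy/dt = -k * y, with y(0) = y0.
# Algorithmic representation: forward-Euler update y[n+1] = y[n] - dt * k * y[n].
def euler_decay(k, y0, dt, n_steps):
    y = [y0]
    for _ in range(n_steps):
        y.append(y[-1] - dt * k * y[-1])  # one discrete step of the continuous model
    return y

# Approximate y(5.0) for k = 0.8 using 50 steps of size 0.1.
trajectory = euler_decay(k=0.8, y0=1.0, dt=0.1, n_steps=50)
print(trajectory[-1])
```

Even in this small case, students must translate a continuous derivative into a discrete update rule and choose a step size, which is precisely the mathematical-to-algorithmic mapping the cited studies found challenging.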
This specific preliminary learning progression is grounded in (1) a systematic examination of literature about national needs as related to modeling and simulation practices, (2) opinions of experts and practitioners in industry and academia, and (3) relevant theory and research about how students learn a particular concept or topic. More empirical evidence is needed to legitimize the progression via classroom testing to identify "if in fact, most students do follow the predicted pathways when they receive the appropriate instruction" (Corcoran et al. 2009, p. 16). Learning progressions have been described as containing the following characteristics: "(1) clear end points that are defined by societal aspirations and analysis of the central concepts and themes in a discipline, (2) progress variables that identify the dimensions of skills that are being developed over time, (3) levels of achievement or stages of progress that define significant intermediate steps in skill development, (4) performance expectations that are the operational definitions of what individuals' skills would look like at each of these stages of progress, and (5) assessments that measure student understanding of the key practices and track their developmental progress over time" (Corcoran et al. 2009, p. 15).
The proposed preliminary learning progression contains, to some extent, the first three of the five characteristics mentioned previously. Future work is therefore required in order to establish the performance expectations for each of these skills; that is, the fourth characteristic of the list. After all, each of the performances can be required within each level, but at different levels of mastery. For example, freshmen and sophomores can perform at a familiarity level of a skill, while juniors and seniors can perform at a proficiency level, and M.S. and Ph.D. students at a mastery level (i.e., a three-dimensional categorization of each skill, by curricular level and mastery level). Further research is therefore needed in order to identify what those levels (i.e., familiarity, proficiency, and mastery) could look like for each skill. These proficiency levels for each skill could also be identified or validated via preliminary studies previously published in engineering education journals or conference proceedings; however, this validation is outside of the scope of this study. Future work implementing classroom-based educational research is also needed to identify the construct validity of the progression (i.e., does the hypothesized sequence describe paths most students will actually follow?) and the consequential validity of the progression (i.e., does instruction based on the proposed learning progression produce positive results?) (Corcoran et al. 2009). These two approaches can result in (1) performance expectations at each of these stages of progress, (2) assessment mechanisms to assess student understanding and progress of these skills, (3) the design of a curriculum to guide a more formal integration of these practices at the undergraduate and graduate levels, and (4) the identification of instructional supports that can help students move from one level to another.
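One minimal way to picture the three-dimensional categorization described above is as a lookup structure keyed by skill, curricular level, and expected mastery. The sketch below is purely illustrative; the skill names and assignments are placeholders, not findings from this study.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Expectation:
    skill: str             # e.g., "Evaluate a simulation"
    curricular_level: str  # "freshman/sophomore", "junior/senior", or "M.S./Ph.D."
    mastery: str           # "familiarity", "proficiency", or "mastery"

# Placeholder rows showing how one skill could deepen across curricular levels.
progression = [
    Expectation("Evaluate a simulation", "freshman/sophomore", "familiarity"),
    Expectation("Evaluate a simulation", "junior/senior", "proficiency"),
    Expectation("Evaluate a simulation", "M.S./Ph.D.", "mastery"),
]
```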
Conclusion, Limitations, and Future Work
The main limitation of this study is that the proposed preliminary learning progression has not been fully validated in classroom settings. Learning progressions, however, are initially based on systematic examinations of the literature in which the goal is to identify relevant theory and research about how students learn a particular concept or topic (Corcoran et al. 2009). Due to the limited number of evidence-based educational studies in this area, the approach suggested by Corcoran et al. was combined with the preliminary learning progression in the science education literature (i.e., Schwarz et al. 2009), the engineering education literature (i.e., Magana et al. 2012), national reports on the topic (Table 1), and the validation through this Delphi study using engineering experts from industry and academia.

A second limitation of this study is the subjective nature of the Delphi study, which exposes it to issues of researcher and subject bias (Hasson et al. 2000). Outcomes from this study, however, are an initial step toward the thoughtful integration of modeling and simulation practices into the undergraduate and graduate engineering curricula. Outcomes from this study also represent a baseline for future research about particular students' capabilities and the corresponding sequence or sequences of learning and achievement. Once students' capabilities and sequences of learning are identified, instructors can delineate the nature of productive instructional pathways toward modeling and simulation proficiency. Finally, a third inevitable limitation of the study relates to experimental mortality. As shown in Fig. 2, some of the participants dropped out during different stages of the study and did not complete all three rounds.
In conclusion, this work contributes to the body of knowledge about the use of computers in engineering education by identifying practices regularly implemented via domain-specific software, numerical and scientific computing programming languages, and computational tools. This work also extends prior kindergarten to Grade 12 research by outlining a set of modeling and simulation practices for higher education. As such, outcomes from this study represent a critical first step in providing higher-education faculty with a mechanism for the thoughtful integration of modeling and simulation practices in the classroom, along with disciplinary learning objectives. It also provides a suitable framework for guiding the coherent development of curriculum, assessment, and instruction. Finally, although the focus of this work has been on engineering education, considering experts with a range of engineering backgrounds, the outcomes also have applications for the natural and social sciences.
Appendix I. Descriptive and Inferential Statistics for Level of Importance for Each Modeling and Simulation
Skill for Industry and Academia
Each modeling and simulation skill is listed with descriptive statistics for industry (n = 19) and academia (n = 18) and the test of differences (degrees of freedom, χ², p-value).

1. Use simulations to understand and explore theories and relationships and interactions of phenomena at different scales (i.e., length, time). Industry: mean = 4.53, SD = 0.60, mode = 5. Academia: mean = 4.22, SD = 0.71, mode = 4. df = 2, χ² = 1.929938, p = 0.380995.
2. Use simulations to design new experiments to test theories. Industry: mean = 4.53, SD = 0.60, mode = 5. Academia: mean = 4.33, SD = 0.75, mode = 5. df = 2, χ² = 1.25081, p = 0.535045.
3. Use simulations to infer and predict physical phenomena or the behaviors of engineered systems. Industry: mean = 4.58, SD = 0.49, mode = 5. Academia: mean = 4.72, SD = 0.45, mode = 5. df = 1, χ² = 0.832555, p = 0.361535.
4. Use simulations as an alternative when measurements are impractical or too expensive. Industry: mean = 4.53, SD = 0.68, mode = 5. Academia: mean = 4.22, SD = 0.97, mode = 5. df = 3, χ² = 2.827852, p = 0.418935.
5. Use simulations to design, modify, or optimize materials, processes, and products. Industry: mean = 4.63, SD = 0.48, mode = 5. Academia: mean = 4.50, SD = 0.76, mode = 5. df = 2, χ² = 4.576316, p = 0.101453.
6. Calibrate simulation models with tests. Industry: mean = 4.47, SD = 0.75, mode = 5. Academia: mean = 4.33, SD = 0.82, mode = 5. df = 3, χ² = 5.843042, p = 0.119501.
7. Evaluate a simulation, highlighting benefits and drawbacks. Industry: mean = 4.42, SD = 0.49, mode = 4. Academia: mean = 4.67, SD = 0.58, mode = 5. df = 2, χ² = 5.434085, p = 0.06607.
8. Identify trade-offs of modeling and simulation including performance, accuracy, validity, and complexity. Industry: mean = 4.63, SD = 0.58, mode = 5. Academia: mean = 4.61, SD = 0.59, mode = 5. df = 2, χ² = 0.012982, p = 0.99353.
9. Compare results from different simulations of the same problem and explain differences. Industry: mean = 4.53, SD = 0.68, mode = 5. Academia: mean = 4.28, SD = 0.87, mode = 5. df = 3, χ² = 1.493544, p = 0.683761.
10. Validate a simulation model by determining the accuracy with which the mathematical model depicts the actual phenomena. Industry: mean = 4.42, SD = 0.67, mode = 5. Academia: mean = 4.44, SD = 0.50, mode = 4. df = 2, χ² = 2.726599, p = 0.255815.
11. Verify a simulation model by determining the accuracy with which the computational model represents the mathematical model. Industry: mean = 3.74, SD = 0.96, mode = 3. Academia: mean = 3.78, SD = 0.85, mode = 4. df = 3, χ² = 0.494702, p = 0.89486.
12. Involve simulations and experiments interactively to improve the fidelity of the simulation tool, its accuracy, and its reliability. Industry: mean = 4.47, SD = 0.50, mode = 4. Academia: mean = 3.89, SD = 0.81, mode = 4. df = 3, χ² = 6.953761, p = 0.073386.
13. Scrutinize and upgrade any portion of a model or simulation in a controlled manner. Industry: mean = 4.42, SD = 0.49, mode = 4. Academia: mean = 3.72, SD = 0.93, mode = 4. df = 3, χ² = 9.201917, p = 0.026723.
14. Extend or adapt an existing model (by configuring it and not programming it) to a new situation. Industry: mean = 4.37, SD = 0.58, mode = 4. Academia: mean = 3.94, SD = 0.97, mode = 4. df = 4, χ² = 4.491405, p = 0.343568.
15. Construct a mathematical model to represent abstractions of reality dictated by the theory or theories characterizing a phenomenon. Industry: mean = 4.21, SD = 0.61, mode = 4. Academia: mean = 4.00, SD = 1.11, mode = 5. df = 4, χ² = 2.940933, p = 0.567758.
16. Identify existing algorithms or computational methods to describe physical models and engineered systems as computational representations of simulations. Industry: mean = 4.11, SD = 0.64, mode = 4. Academia: mean = 3.78, SD = 0.97, mode = 4. df = 3, χ² = 2.059273, p = 0.724858.
17. Develop algorithms and methods that simulate the described physical models and engineered systems as simulations. Industry: mean = 4.16, SD = 0.87, mode = 5. Academia: mean = 3.89, SD = 0.99, mode = 5. df = 3, χ² = 0.812394, p = 0.8465.
18. Implement (program) a computational model of a mathematical description of an event or phenomenon, which is a discretized approximation of the mathematical model. Industry: mean = 4.00, SD = 0.86, mode = 4. Academia: mean = 4.06, SD = 0.70, mode = 4. df = 3, χ² = 1.123526, p = 0.771398.
19. Link simulation tools directly to measurement devices. Industry: mean = 3.84, SD = 0.99, mode = 4. Academia: mean = 3.28, SD = 0.80, mode = 3. df = 3, χ² = 4.439952, p = 0.217709.
20. Comprehend (draw conclusions) visual representations of data (e.g., detecting patterns, assessing situations, and prioritizing tasks) via visualization. Industry: mean = 4.68, SD = 0.57, mode = 5. Academia: mean = 4.33, SD = 0.88, mode = 5. df = 3, χ² = 2.085608, p = 0.55483.
21. Visualize the data on the degrees of freedom characterizing a model using the numerical outputs from a simulation. Industry: mean = 4.32, SD = 0.80, mode = 5. Academia: mean = 4.06, SD = 0.91, mode = 5. df = 3, χ² = 2.310375, p = 0.510536.
22. Visualize the data (e.g., scalars, vector, and tensor fields) collected experimentally from multiple sources. Industry: mean = 4.37, SD = 0.74, mode = 5. Academia: mean = 4.11, SD = 0.87, mode = 5. df = 3, χ² = 3.475512, p = 0.323952.
23. Use standard APIs and tools to create visual displays of data, including graphs, charts, tables, and histograms. Industry: mean = 4.00, SD = 0.97, mode = 5. Academia: mean = 3.56, SD = 1.12, mode = 4. df = 3, χ² = 2.278533, p = 0.516646.
24. Determine and quantify the reliability of computer simulations and their predictions. Industry: mean = 4.53, SD = 0.60, mode = 5. Academia: mean = 4.44, SD = 0.68, mode = 5. df = 2, χ² = 0.431163, p = 0.806072.
25. Acknowledge uncertainty in the interpretation of simulation predictions. Industry: mean = 4.58, SD = 0.49, mode = 5. Academia: mean = 4.61, SD = 0.68, mode = 5. df = 2, χ² = 4.415592, p = 0.109943.
26. Determine variability in data due to immeasurable or unknown factors via uncertainty-quantification methods or techniques. Industry: mean = 4.16, SD = 0.74, mode = 4. Academia: mean = 4.06, SD = 0.91, mode = 5. df = 2, χ² = 3.132837, p = 0.208792.
27. Define error, stability, machine precision concepts, and the inexactness of computational approximations. Industry: mean = 4.21, SD = 0.83, mode = 5. Academia: mean = 4.06, SD = 0.97, mode = 5. df = 3, χ² = 3.844829, p = 0.278715.
28. Choose an appropriate modeling approach or method for a given problem or situation. Industry: mean = 4.47, SD = 0.68, mode = 5. Academia: mean = 4.72, SD = 0.56, mode = 5. df = 2, χ² = 1.667524, p = 0.434412.
29. Identify differences in model representations used for different phenomena at different scales. Industry: mean = 4.42, SD = 0.59, mode = 4. Academia: mean = 4.17, SD = 0.76, mode = 5. df = 2, χ² = 2.274635, p = 0.320678.
30. Identify the need to change models as scales are bridged to maintain computational tractability. Industry: mean = 4.11, SD = 0.72, mode = 4. Academia: mean = 3.89, SD = 0.87, mode = 4. df = 3, χ² = 1.426035, p = 0.699444.

Note: α = 0.05/30 = 0.00167 (Bonferroni correction for 30 comparisons).
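For readers who wish to reproduce this kind of comparison, the sketch below shows one way the industry-versus-academia difference for a single skill could be tested, assuming ratings are tabulated as counts per response category. The contingency counts are placeholders, and scipy's chi2_contingency is used as a stand-in for whatever exact procedure the study employed; the Bonferroni-corrected threshold matches the note above.

```python
from scipy.stats import chi2_contingency

# Hypothetical contingency table for one skill: counts of ratings 3, 4, and 5,
# with one row per panel (industry, academia). Values are placeholders only.
counts = [[1, 6, 12],
          [2, 8, 8]]

chi2, p, dof, _expected = chi2_contingency(counts)

alpha = 0.05 / 30  # Bonferroni correction across the 30 skills, as in the note
print(f"chi2 = {chi2:.3f}, dof = {dof}, p = {p:.4f}, significant: {p < alpha}")
```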
Appendix II. Grouping and Ranking for Each Modeling and Simulation Skill
Each performance is listed with its rank within the curricular level, its original skill number, descriptive statistics, and the test of differences (degrees of freedom, χ², p-value).

Level: Freshmen and Sophomore
Rank 1: 14. Extend or adapt an existing model (by configuring it and not programming it) to a new situation (mean = 4.16, SD = 0.82). df = 2, χ² = 3.465972, p = 0.176756.
Rank 2: 7. Evaluate a simulation, highlighting benefits and drawbacks (mean = 4.54, SD = 0.55). df = 2, χ² = 3.111111, p = 0.211072.
Rank 3: 22. Visualize the data (e.g., scalars, vector, and tensor fields) collected experimentally from multiple sources (mean = 4.24, SD = 0.82). df = 2, χ² = 0.324074, p = 0.648077.
Rank 4: 23. Use standard APIs and tools to create visual displays of data, including graphs, charts, tables, and histograms (mean = 3.78, SD = 1.07). df = 2, χ² = 3.111111, p = 0.10247.
Rank 5: 20. Comprehend (draw conclusions) visual representations of data (e.g., detecting patterns, assessing situations, and prioritizing tasks) via visualization (mean = 4.51, SD = 0.76). df = 2, χ² = 2.52, p = 0.133614.
Rank 6: 15. Construct a mathematical model (including mathematical formulas, constructions, equations, inequalities, constraints) to represent abstractions of reality dictated by the theory or theories characterizing a phenomenon (mean = 4.11, SD = 0.89). df = 2, χ² = 4.306061, p = 0.116132.

Level: Junior and Senior
Rank 1: 1. Use simulations to understand and explore theories and relationships and interactions of phenomena at different scales (i.e., length, time) (mean = 4.38, SD = 0.67). df = 2, χ² = 0.289174, p = 0.86538.
Rank 2: 21. Visualize the data on the degrees of freedom characterizing a model using the numerical outputs from a simulation (mean = 4.19, SD = 0.86). df = 2, χ² = 0.583333, p = 0.540291.
Rank 3: 8. Identify trade-offs of modeling and simulation including performance, accuracy, validity, and complexity (mean = 4.62, SD = 0.59). df = 2, χ² = 0.84183, p = 0.656446.
Rank 4: 18. Implement (program) a computational model of a mathematical description of an event or phenomenon, which is a discretized approximation of the mathematical model (mean = 4.03, SD = 0.79). df = 2, χ² = 0.361438, p = 0.83467.
Rank 5: 29. Identify differences in model representations used for different phenomena at different scales (mean = 4.30, SD = 0.69). df = 2, χ² = 1.288636, p = 0.401008.
Rank 6: 24. Determine and quantify the reliability of computer simulations and their predictions (mean = 4.49, SD = 0.64). df = 2, χ² = 2.121875, p = 0.14922.
Rank 7: 25. Acknowledge uncertainty in the interpretation of simulation predictions (mean = 4.59, SD = 0.59). df = 2, χ² = 0.845833, p = 0.359715.
Rank 8: 2. Use simulations to design experiments to test theories (mean = 4.43, SD = 0.68). df = 2, χ² = 3.450505, p = 0.178128.
Rank 9: 4. Use simulations as an alternative when measurements are impractical or too expensive (mean = 4.38, SD = 0.85). df = 2, χ² = 0.335417, p = 0.8456.
Rank 10: 6. Calibrate simulation models with tests (mean = 4.41, SD = 0.79). df = 2, χ² = 0.297386, p = 0.861834.
Rank 11: 16. Identify existing algorithms or computational methods to describe physical models and engineered systems as computational representations of simulations (mean = 3.95, SD = 0.84). df = 2, χ² = 4.574561, p = 0.101542.
Rank 12: 5. Use simulations to design, modify, or optimize materials, processes, and products (mean = 4.57, SD = 0.64). df = 2, χ² = 2.070833, p = 0.355078.
Rank 13: 9. Compare results from different simulations of the same problem and explain differences (mean = 4.41, SD = 0.79). df = 2, χ² = 2.361616, p = 0.307031.
Rank 14: 3. Use simulations to infer and predict physical phenomena or the behaviors of engineered systems (mean = 4.65, SD = 0.48). df = 2, χ² = 0.592172, p = 0.743724.
Rank 15: 28. Choose an appropriate modeling approach or method for a given problem or situation (mean = 4.59, SD = 0.63). df = 2, χ² = 0.352431, p = 0.718216.

Level: M.S. and Ph.D.
Rank 1: 17. Develop algorithms and methods that simulate the described physical models and engineered systems as simulations (mean = 4.03, SD = 0.94). df = 2, χ² = 0.807692, p = 0.667747.
Rank 2: 26. Determine variability in data due to immeasurable or unknown factors via uncertainty-quantification methods or techniques (mean = 4.11, SD = 0.83). df = 2, χ² = 1.332692, p = 0.392012.
Rank 3: 30. Identify the need to change models as scales are bridged to maintain computational tractability (mean = 4.00, SD = 0.81). df = 2, χ² = 2.3625, p = 0.411314.
Rank 4: 27. Define error, stability, machine precision concepts, and the inexactness of computational approximations (e.g., convergence, including truncation and round-off) (mean = 4.14, SD = 0.91). df = 2, χ² = 4.2, p = 0.273322.
Rank 5: 13. Scrutinize and upgrade any portion of a model or simulation in a controlled manner (mean = 4.08, SD = 0.82). df = 2, χ² = 0.141414, p = 0.931735.
Rank 6: 11. Verify a simulation model by determining the accuracy with which the computational model represents the mathematical model based on software engineering protocols, bug detection and control, scientific programming methods, and a posteriori error estimation (mean = 3.76, SD = 0.91). df = 1, χ² = 1.767677, p = 0.18367.
Rank 7: 19. Link simulation tools directly to measurement devices (e.g., large-scale numerical computing, data-intensive computing, sensors, imaging, grid computing, among others, for real-time control of simulations and computer predictions) (mean = 3.57, SD = 0.95). df = 1, χ² = 1.458333, p = 0.227195.
Rank 8: 10. Validate a simulation model by determining the accuracy with which the mathematical model depicts the actual phenomena based on prescribed acceptance criteria such as observations, experiments, experience, and judgment (mean = 4.43, SD = 0.59). df = 2, χ² = 3.671329, p = 0.159508.
Rank 9: 12. Involve simulations and experiments (or field data) interactively to improve the fidelity of the simulation tool, its accuracy, and its reliability (mean = 4.19, SD = 0.73). df = 1, χ² = 0.324074, p = 0.56917.

Note: α = 0.05/30 = 0.00167 (Bonferroni correction for 30 comparisons).
Appendix III. Grouping and Categorization of Each Modeling and Simulation Skill
Level 1: Freshmen and sophomore
Construct models:
- Visualize the data (e.g., scalars, vector, and tensor fields) collected experimentally from multiple sources (mean = 4.24, SD = 0.82, rank = 3).
- Use standard APIs and tools to create visual displays of data, including graphs, charts, tables, and histograms (mean = 3.78, SD = 1.07, rank = 4).
- Construct a mathematical model (including mathematical formulas, constructions, equations, inequalities, constraints) to represent abstractions of reality dictated by the theory or theories characterizing a phenomenon (mean = 4.11, SD = 0.89, rank = 6).
Use models:
- Comprehend (draw conclusions) visual representations of data (e.g., detecting patterns, assessing situations, and prioritizing tasks) via visualization (mean = 4.51, SD = 0.76, rank = 5).
Compare and evaluate models:
- Evaluate a simulation, highlighting benefits and drawbacks (mean = 4.54, SD = 0.55, rank = 2).
Revise models:
- Extend or adapt an existing model (by configuring it and not programming it) to a new situation (mean = 4.16, SD = 0.82, rank = 1).

Level 2: Junior and senior
Construct models:
- Visualize the data on the degrees of freedom characterizing a model using the numerical outputs from a simulation (mean = 4.19, SD = 0.86, rank = 2).
- Implement (program) a computational model of a mathematical description of an event or phenomenon, which is a discretized approximation of the mathematical model (mean = 4.03, SD = 0.79, rank = 4).
Use models:
- Use simulations to understand and explore theories and relationships and interactions of phenomena at different scales (i.e., length, time) (mean = 4.38, SD = 0.67, rank = 1).
- Use simulations to design new experiments to test theories (mean = 4.43, SD = 0.68, rank = 8).
- Use simulations to design, modify, or optimize materials, processes, and products (mean = 4.57, SD = 0.64, rank = 11).
- Use simulations to infer and predict physical phenomena or the behaviors of engineered systems (mean = 4.65, SD = 0.48, rank = 13).
Compare and evaluate models:
- Identify trade-offs of modeling and simulation including performance, accuracy, validity, and complexity (mean = 4.62, SD = 0.59, rank = 3).
- Identify differences in model representations used for different phenomena at different scales (mean = 4.30, SD = 0.69, rank = 5).
- Determine and quantify the reliability of computer simulations and their predictions (mean = 4.49, SD = 0.64, rank = 6).
- Acknowledge uncertainty in the interpretation of simulation predictions (mean = 4.59, SD = 0.59, rank = 7).
- Identify existing algorithms or computational methods to describe physical models and engineered systems as computational representations of simulations (mean = 3.95, SD = 0.84, rank = 10).
- Compare results from different simulations of the same problem and explain differences (mean = 4.41, SD = 0.79, rank = 12).
- Choose an appropriate modeling approach or method for a given problem or situation (mean = 4.59, SD = 0.63, rank = 14).
Revise models:
- Calibrate simulation models with tests (mean = 4.41, SD = 0.79, rank = 9).

Level 3: M.S. and Ph.D.
Construct models:
- Develop algorithms and methods that simulate the described physical models and engineered systems as simulations (mean = 4.03, SD = 0.94, rank = 1).
Use models:
- Link simulation tools directly to measurement devices (e.g., large-scale numerical computing, data-intensive computing, sensors, imaging, grid computing, among others, for real-time control of simulations and computer predictions) (mean = 3.57, SD = 0.95, rank = 7).
Compare and evaluate models:
- Determine variability in data due to immeasurable or unknown factors via uncertainty-quantification methods or techniques (mean = 4.11, SD = 0.83, rank = 2).
- Identify the need to change models as scales are bridged to maintain computational tractability (mean = 4.00, SD = 0.81, rank = 3).
- Define error, stability, machine precision concepts, and the inexactness of computational approximations (e.g., convergence, including truncation and round-off) (mean = 4.14, SD = 0.91, rank = 4).
- Verify a simulation model by determining the accuracy with which the computational model represents the mathematical model based on software engineering protocols, bug detection and control, scientific programming methods, and a posteriori error estimation (mean = 3.76, SD = 0.91, rank = 6).
- Validate a simulation model by determining the accuracy with which the mathematical model depicts the actual phenomena based on prescribed acceptance criteria such as observations, experiments, experience, and judgment (mean = 4.43, SD = 0.59, rank = 8).
Revise models:
- Scrutinize and upgrade any portion of a model or simulation in a controlled manner (mean = 4.08, SD = 0.82, rank = 5).
- Involve simulations and experiments (or field data) interactively to improve the fidelity of the simulation tool, its accuracy, and its reliability (mean = 4.19, SD = 0.73, rank = 9).
Appendix IV. Alignment and Adjustment between Skills and Learning Performances
Each entry lists the original skill (with its statistics), the revised learning performance, and observations from the Delphi panels.

Freshmen and sophomore levels

Construct models
Skill: Visualize the data (e.g., scalars, vector, and tensor fields) collected experimentally from multiple sources (mean = 4.24, SD = 0.82, rank = 3); merged with: Use standard APIs and tools to create visual displays of data, including graphs, charts, tables, and histograms (mean = 3.78, SD = 1.07, rank = 4).
Learning performance: Students construct visual representations of data, such as graphs, charts, tables, and histograms, using standard domain-specific software, application programming interfaces, or built-in libraries within scientific computing software.
Observation: These two statements were merged into one because visualizing data and creating visual displays of data engage learners in similar activities. An expert from academia commented, "I agree. Computer tools are so ubiquitous in visualization now that they are often the default way to carry out the first statement, anyway."

Skill: Construct a mathematical model (including mathematical formulas, constructions, equations, inequalities, constraints) to represent abstractions of reality dictated by the theory or theories characterizing a phenomenon (mean = 4.11, SD = 0.89, rank = 6).
Learning performance: Given a simple model, students identify the corresponding mathematical model and use computer-programming methods or APIs to implement an appropriate algorithm representing abstractions of reality via mathematical formulas, constructions, equations, inequalities, constraints, and so forth.
Observation: Emphasis on simple mathematical models. An expert from academia made the comment, "If we are looking at basic principles like speed, acceleration ... then I agree. If we are expecting more advanced mathematical models of engineering systems, then that would be a junior/senior level skill." An expert from academia suggested, "You may consider adding ', which can then be solved using computer-programming methods or APIs, via an appropriate algorithm.' The mathematical modeling step by itself requires no knowledge of creating algorithms or that of API/programming." An expert from academia commented, "'Construct' seems rather advanced for freshmen and sophomores. I would distinguish this by saying that I expect 1st and 2nd year students to be able to take an existing mathematical model, implement it to create a solution and explain the solution in terms of the mathematical model. I think early on they should acquire the programming skill as a foundation on which to build their theoretical capacities." An expert from academia added, "We need to put together the conceptual model with the mathematical model so students can then move into the computational model. Therefore, I suggest emphasizing students' ability to identify the mathematical model and its connection to the physical world." Another expert from academia added, "Based on the comments, I think that the new version is better, but the language does jump from completely mathematical/theoretical (left column) to completely computational/algorithmic (center column). I think adding a second 'mathematical' (as shown) might help maintain the application."

Use models
Skill: Comprehend (draw conclusions) visual representations of data (e.g., detecting patterns, assessing situations and prioritizing tasks) via visualization (mean = 4.51, SD = 0.76, rank = 5).
Learning performance: Students use existing computational models or simulations to comprehend, characterize, and draw conclusions from visual representations of data by evaluating appropriate boundary conditions, noticing patterns, identifying relationships, assessing situations, and so forth.
Observation: Emphasis on existing computational models. An expert from industry mentioned, "Ability to identify the correct boundary conditions to apply to a product, process, or equipment model."

Compare and evaluate models
Skill: Evaluate a simulation, highlighting benefits and drawbacks (mean = 4.54, SD = 0.55, rank = 2).
Learning performance: Students compare the results of models and simulations to laboratory experiments, theory, measurements, test cases, and so forth, to determine their alignment, overlap, or goodness of fit, among other metrics.
Observation: An expert from academia made the following comment: "In your table, 'Evaluate a simulation, highlighting benefits, and drawbacks.' Again compared to what? Evaluation is a higher level skill that implies some familiarity with alternatives." An expert from academia added, "In evaluation against what, I'd additionally include general test cases (cases that can be easily checked independently) where the simulation should produce predictable results. The relative ease of computation is often the all-powerful metric."

Revise models
Skill: Extend or adapt an existing model (by configuring it and not programming it) to a new situation (mean = 4.16, SD = 0.82, rank = 1).
Learning performance: Students extend or adapt simple models from one situation to another, either by configuring the model through a graphical user interface or by modifying or extending existing code.
Observation: Emphasis on simple models.

Junior and senior levels

Construct models
Skill: Visualize the data on the degrees of freedom characterizing a model using the numerical outputs from a simulation (mean = 4.19, SD = 0.86, rank = 2).
Learning performance: Students connect simulation and visualization by first visualizing data using numerical outputs from a simulation and then interacting with the visualization to engage in critical thinking about the simulated model.
Observation: An expert from industry mentioned, "Being able to identify the right tools to generate this kind of visualizations and link them to the simulation model being developed is a very useful skill in my opinion." An expert from academia mentioned, "Linking simulation tools and visualization software. Currently, my students think of the simulation as independent of the visual. Then, the visualization is an afterthought or simply a post-computation demonstration. Interacting with the visualization opens critical thinking."

Skill: Implement (program) a computational model of a mathematical description of an event or phenomenon, which is a discretized approximation of the mathematical model (mean = 4.03, SD = 0.79, rank = 4).
Learning performance: Students implement simple computational models by creating discretized mathematical descriptions of an event or phenomenon, using high-level programming languages or scientific computing software.
Observation: Emphasis on the use of high-level programming languages or scientific computing software. An expert from industry provided the following comment: "Implementing computational models feels to me advanced for Junior/Seniors. I suppose it depends on what is meant by this. If it's discretizing measurements and considering issues like sampling rate and window effects, that seems reasonable. Writing their own ODE solver feels distracting from lessons that would benefit them more (like lessons on model choice, sample rate, noise, and window effects)."

Use models
Skill: Use simulations to understand and explore theories and relationships and interactions of phenomena at different scales (i.e., length, time) (mean = 4.38, SD = 0.67, rank = 1).
Learning performance: Students use simulations at different scales to deploy the correct solution method, inputs, and other parameters to explore theories and identify relationships between modeled phenomena.
Observation: An expert from academia provided the following comment: "Students need to understand if the solution is correct and to choose the correct methods and inputs/meshing/boundary conditions." An expert from academia mentioned, "Undergraduate students will have difficulties selecting an appropriate method; however, once the method is provided students should be able to implement it."

Skill: Use simulations to design, modify, or optimize materials, processes, and products (mean = 4.57, SD = 0.64, rank = 11).
Learning performance: Students use computational models or simulations to design, modify, or optimize materials, processes, products, or systems.
Observation: An expert from academia suggested that "systems" is missing.

Skill: Use simulations to design new experiments to test theories (mean = 4.43, SD = 0.68, rank = 8).
Learning performance: Students use computational models or simulations to design experiments to test theories, prototypes, products, materials, and so forth.
Observation: An expert from industry suggested, "Remove the word 'new.' It suggests that students have to come up with new designs as opposed to students using simulations as a way to design experiments."

Skill: Use simulations to infer and predict physical phenomena or the behaviors of engineered systems (mean = 4.65, SD = 0.48, rank = 13).
Learning performance: Students use computational models or simulations to infer and predict physical phenomena or the behaviors of engineered systems.
Observation: When asked if this statement should be merged with the following one, two experts from academia suggested, "This statement should stand on its own. The emphasis here is on design of experiments. Like using simulations as a prior phase of the experimentation process."

Compare and evaluate models
Skill: Identify trade-offs of modeling and simulation including performance, accuracy, validity, and complexity (mean = 4.62, SD = 0.59, rank = 3).
Learning performance: Students evaluate the benefits and disadvantages of competing computational models or simulations by determining and weighing factors such as assumptions, limitations, precision, accuracy, reliability, validity, and complexity.
Observation: An expert from industry provided a comment: "Critiquing your own models and those of others to identify the assumptions and limitations of the model outputs."

Skill: Compare results from different simulations of the same problem and explain differences (mean = 4.41, SD = 0.79, rank = 12).
Observation: An expert from academia suggested, "There is obsession with precision but no obsession with accuracy. Students confuse precision with accuracy." Statements were merged based on the following comment from an expert from academia: "The first item in the list in that section, 'identify trade-offs ...,' I would suggest might not be first in that category. I would move it down below the second one, 'Identify differences in ...' My reasoning is that differences would need to be understood and identified in order to identify the trade-offs in a systematic way."

Skill: Choose an appropriate modeling approach or method for a given problem or situation (mean = 4.59, SD = 0.63, rank = 14).
Observation: An expert from industry and an expert from academia suggested moving this skill to the M.S. and Ph.D. levels.

Skill: Identify differences in model representations used for different phenomena at different scales (mean = 4.30, SD = 0.69, rank = 5).
Observation: An expert from academia suggested, "I would remove this one since it is already covered above with: 'Students use simulations at different scales to deploy the correct method and inputs to explore theories and identify relationships.'"

Skill: Acknowledge uncertainty in the interpretation of simulation predictions (mean = 4.59, SD = 0.59, rank = 7).
Learning performance: Students acknowledge and estimate uncertainty as part of the interpretation of simulation predictions.
Observation: An expert from industry suggested, "Acknowledge is too broad. Students must have that skill developed. Students must be able to quantify uncertainty." An expert from academia said, "Quantify is really difficult, especially for undergraduates; however, a good skill would be to acknowledge uncertainty and estimate it."

Revise models
Skill: Calibrate simulation models with tests (mean = 4.41, SD = 0.79, rank = 9).
Learning performance: Students use external data, theories, or additional simulation tools to calibrate, verify, or improve the accuracy of computational models or simulations.
Observation: Suggestions for rewording were received from three experts. For example, an expert from academia commented, "Should 'existing data' be changed to 'data from literature' or 'external data' to make clear that it is not the student's previous data?"

M.S. and Ph.D. levels

Construct models
Skill: Develop algorithms and methods that simulate the described physical models and engineered systems as simulations (mean = 4.03, SD = 0.94, rank = 1).
Learning performance: Students construct new computational models or simulations by developing algorithms and methods that simulate physical phenomena and engineered systems.
Observation: An expert from academia concurred, "Sounds good. Emphasis on 'new computational models' is an important element."

Use models
Skill: Link simulation tools directly to measurement devices (e.g., large-scale numerical computing, data-intensive computing, sensors, imaging, grid computing, among others, for real-time control of simulations and computer predictions) (mean = 3.57, SD = 0.95, rank = 7).
Learning performance: Students interface computational models or simulations directly with measurement devices such as sensors, imaging systems, real-time control systems, and so forth.
Observation: A reviewer suggested, "The notion of 'grid computing' in level 3 is a bit unexpected. I do think knowledge of grid computing, cloud computing and HPC is particularly important when learning about simulation."

Compare and evaluate models
Skill: Identify existing algorithms or computational methods to describe physical models and engineered systems as computational representations of simulations (mean = 3.95, SD = 0.84, rank = 10).
Learning performance: Students discern between different algorithms or computational methods to describe physical models or engineered systems as computational representations.
Observation: This one also includes the one from above: Choose an appropriate modeling approach or method for a given problem or situation (mean = 4.59, SD = 0.63, rank = 14).

Skill: Determine and quantify the reliability of computer simulations and their predictions (mean = 4.49, SD = 0.64, rank = 6).
Learning performance: Students determine and quantify the reliability of computer simulations and their predictions.

Skill: Determine variability in data due to immeasurable or unknown factors via uncertainty-quantification methods or techniques (mean = 4.11, SD = 0.83, rank = 2).
Learning performance: Students determine variability in data due to immeasurable or unknown factors via uncertainty-quantification methods or techniques.

Skill: Define error, stability, machine precision concepts, and the inexactness of computational approximations (e.g., convergence, including truncation and round-off) (mean = 4.14, SD = 0.91, rank = 4).
Learning performance: Students evaluate algorithms by determining uncertainties and defining error, stability, machine precision concepts, and the inexactness of computational approximations (e.g., convergence, including truncation and round-off).

Skill: Verify a simulation model by determining the accuracy with which the computational model represents the mathematical model based on software engineering protocols, bug detection and control, scientific programming methods, and a posteriori error estimation (mean = 3.76, SD = 0.91, rank = 6).
Learning performance: Students verify a simulation model based on software engineering protocols, bug detection and control, and scientific programming methods.
Observation: An expert from academia suggested, "I'm thinking how this one and the one above are similar or different. It seems the one above could almost be nested into the 'a posteriori error estimation' of this one. That, or the error estimation could just be left off of this one. That way the previous one is about numerical accuracy and exactness, while this one is about validating that the simulation code is doing the right thing."

Skill: Validate a simulation model by determining the accuracy with which the mathematical model depicts the actual phenomena based on prescribed acceptance criteria such as observations, experiments, experience, and judgment (mean = 4.43, SD = 0.59, rank = 8).
Learning performance: Students validate a simulation model based on prescribed acceptance criteria such as observations, experiments, experience, and judgment.

Revise models
Skill: Identify the need to change models as scales are bridged to maintain computational tractability (mean = 4.00, SD = 0.81, rank = 3).
Learning performance: Students identify the mechanisms for exchanging information to bridge models across scales and maintain computational tractability.
Observation: An expert from academia commented the following before revising the statement: "This one is a little vague. Is the emphasis placed on the need to notice that one must change models? Or is the emphasis placed on students' ability to change models? I think that noticing the need was already mentioned above." The expert helped revise the statement into its final version.

Skill: Scrutinize and upgrade any portion of a model or simulation in a controlled manner (mean = 4.08, SD = 0.82, rank = 5).
Learning performance: Students iteratively and systematically evaluate and improve the fidelity, accuracy, reliability, performance, and cost (monetary and computational) of their computational models or simulations.
Observation: An expert from industry provided the following comment: "Students need to learn how to determine basic economic evaluation of value (size of prize) for simulations."

Skill: Involve simulations and experiments (or field data) interactively to improve the fidelity of the simulation tool, its accuracy, and its reliability (mean = 4.19, SD = 0.73, rank = 9).
Observation: An expert from academia suggested, "I guess I'd include computational cost. Just cost seems like monetary cost."
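Several Level 2 and Level 3 performances above involve implementing a discretized approximation of a mathematical model and then evaluating its truncation error and convergence. The sketch below illustrates one such verification exercise under assumed, illustrative choices (forward Euler on dy/dt = -k*y, which has a known analytical solution); it is not a procedure prescribed by this study.

```python
import math

# Forward-Euler solution of dy/dt = -k * y evaluated at t_end (illustrative).
def euler_final(k, y0, t_end, dt):
    y = y0
    n_steps = round(t_end / dt)
    for _ in range(n_steps):
        y -= dt * k * y
    return y

k, y0, t_end = 0.8, 1.0, 2.0
exact = y0 * math.exp(-k * t_end)  # analytical solution for comparison
for dt in (0.1, 0.05, 0.025):
    error = abs(euler_final(k, y0, t_end, dt) - exact)
    print(f"dt = {dt:<6} |error| = {error:.2e}")
```

Halving the step size roughly halves the error, the first-order convergence expected of forward Euler; observing (or failing to observe) that behavior is the kind of evidence the verification performances call for.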
Acknowledgments
The work reported here was supported in part by the National
Science Foundation under the award EEC 1449238. The views
represented here are those of the author and do not represent the
National Science Foundation. The author is exceedingly grateful to
the participants in the Delphi study and other collaborators and
students who provided feedback throughout the process.
References
ABET. (2013). 2014–2015 criteria for accrediting engineering programs. Engineering Accreditation Commission, Baltimore.
ACM (Association for Computing Machinery)-IEEE. (2013). "Computer science curricula 2013: Curriculum guidelines for undergraduate degree programs in computer science." Joint Task Force on Computing Curricula, Association for Computing Machinery and IEEE Computer Society.
Alabi, O., Magana, A. J., and Garcia, R. E. (2015). "Gibbs, computational simulation as a teaching tool for students' understanding of thermodynamics of materials concepts." J. Mater. Educ., 37(5–6), 239–260.
ASCE. (2008). Civil engineering body of knowledge for the 21st century: Preparing the civil engineer for the future, 2nd Ed., Reston, VA.
ASEE (American Society for Engineering Education). (2013). "Transforming undergraduate education in engineering (TUEE). Phase I: Synthesizing and integrating industry perspectives." Arlington, VA.
Balogh, Z. E., and Criswell, M. E. (2013). "Framework of knowledge for master's-level structural engineering education." J. Prof. Issues Eng. Educ. Pract., 10.1061/(ASCE)EI.1943-5541.0000176, 04013007.
Chen, J. C., Ellis, M., Lockhart, J., Hamoush, S., Brawner, C. E., and Tront, J. G. (2000). "Technology in engineering education: What do the faculty know and want?" J. Eng. Educ., 89(3), 279–283.
Corcoran, T. B., Mosher, F. A., and Rogat, A. (2009). "Learning progressions in science: An evidence-based approach to reform." CPRE Research Rep. No. RR-63, Consortium for Policy Research in Education, Center on Continuous Instructional Improvement, Teachers College, Columbia Univ., New York.
Cortes, J. M., Pellicer, E., and Catala, J. (2011). "Integration of occupational risk prevention courses in engineering degrees: Delphi study." J. Prof. Issues Eng. Educ. Pract., 10.1061/(ASCE)EI.1943-5541.0000076, 31–36.
Dalkey, N. C., Brown, B. B., and Cochran, S. (1969). The Delphi method: An experimental study of group opinion, Vol. 3, Rand Corporation, Santa Monica, CA.
Desimone, L. M., and Le Floch, K. C. (2004). "Are we asking the right questions? Using cognitive interviews to improve surveys in education research." Educ. Eval. Policy Anal., 26(1), 1–22.
DoD (Department of Defense). (2006). "Acquisition modeling and simulation master plan." Office of the Under Secretary of Defense, Washington, DC.
DoD (Department of Defense). (2008). "DoD modeling and simulation body of knowledge (BOK)." AcqNotes, Washington, DC.
DOE (Department of Energy). (2010). "Computational materials science and chemistry: Accelerating discovery and innovation through simulation-based engineering and science." Rep. Dept. of Energy Workshop on Computational Materials Science and Chemistry for Innovation, Office of Science, Bethesda, MD.
Duffield, C. (1993). "The Delphi technique: A comparison of results obtained using two expert panels." Int. J. Nurs. Stud., 30(3), 227–237.
Duncan, R. G., and Hmelo-Silver, C. E. (2009). "Learning progressions: Aligning curriculum, instruction, and assessment." J. Res. Sci. Teach., 46(6), 606–609.
Dym, C. L., Agogino, A. M., Eris, O., Frey, D. D., and Leifer, L. J. (2005). "Engineering design thinking, teaching, and learning." J. Eng. Educ., 94(1), 103–120.
Emmott, S. (2008). "Towards 2020 science." Sci. Parliament, 65(4), 31–33.
Felder, R. M., and Brent, R. (2003). "Designing and teaching courses to satisfy the ABET engineering criteria." J. Eng. Educ., 92(1), 7–25.
Guzdial, M. (2011). "From science to engineering." Commun. ACM, 54(2), 37–39.
Hasson, F., Keeney, S., and McKenna, H. (2000). "Research guidelines for the Delphi survey technique." J. Adv. Nurs., 32(4), 1008–1015.
Hsu, C. C., and Sandford, B. A. (2007). "The Delphi technique: Making sense of consensus." Pract. Assess. Res. Eval., 12(10), 1–8.
Hu, C. (2007). "Integrating modern research into numerical computation education." Comput. Sci. Eng., 9(5), 78–81.
International Engineering Alliance. (2011). "Washington Accord program outcomes." Washington, DC.
Kadiyala, M., and Crynes, B. L. (2000). "A review of literature on effectiveness of use of information technology in education." J. Eng. Educ., 89(2), 177–189.
Kendall, J. E., Kendall, K. E., Smithson, S., and Angell, I. O. (1992). "SEER: A divergent methodology applied to forecasting the future roles of the systems analyst." Human Syst. Manage., 11(3), 123–135.
Krajcik, J., McNeill, K. L., and Reiser, B. J. (2008). "Learning-goals-driven design model: Developing curriculum materials that align with national standards and incorporate project-based pedagogy." Sci. Educ., 92(1), 1–32.
Lenox, T. A., Ressler, S. J., O'Neil, R. J., and Conley, C. H. (1997). "Computers in the integrated civil engineering curriculum: A time of transition." Proc., 1997 American Society of Engineering Education Conf., Milwaukee.
Li, J., and Fu, S. (2012). "A systematic approach to engineering ethics education." Sci. Eng. Ethics, 18(2), 339–349.
Magana, A. J., Brophy, S. P., and Bodner, G. M. (2012). "Instructors' intended learning outcomes for using computational simulations as learning tools." J. Eng. Educ., 101(2), 220–243.
Magana, A. J., and Coutinho, G. S. (2017). "Modeling and simulation practices for a computational thinking-enabled engineering workforce." Comput. Appl. Eng. Educ., 25(1), 62–78.
Magana, A. J., Falk, L. M., and Reese, J. M. (2013). "Introducing discipline-based computing in undergraduate engineering education." ACM Trans. Comput. Educ., 13(4), 1–22.
Magana, A. J., Falk, M. L., Vieira, C., and Reese, M. J. (2016). "A case study of undergraduate engineering students' computational literacy and self-beliefs about computing in the context of authentic practices." Comput. Human Behav., 61, 427–442.
Magana, A. J., Falk, M. L., Vieira, C., Reese, M. J., Jr., Alabi, O., and Patinet, S. (2017). "Affordances and challenges of computational tools for supporting modeling and simulation practices." Comput. Appl. Eng. Educ., in press.
Magana, A. J., and Mathur, J. I. (2012). "Motivation, awareness, and perceptions of computational science." Comput. Sci. Eng., 14(1), 74–79.
Maria, A. (1997). "Introduction to modeling and simulation." Proc., 29th Conf. on Winter Simulation, Atlanta.
McKenna, A. F., and Carberry, A. R. (2012). "Characterizing the role of modeling in innovation." Int. J. Eng. Educ., 28(2), 263–269.
NRC (National Research Council). (2003). "BIO 2010: Transforming undergraduate education for future research biologists." Committee on Undergraduate Biology Education to Prepare Research Scientists for the 21st Century, National Academies Press, Washington, DC.
NRC (National Research Council). (2008). "Integrated computational materials engineering: A transformational discipline for improved competitiveness and national security." Committee on Integrated Computational Materials Engineering, National Academies Press, Washington, DC.
NRC (National Research Council). (2011). "Report of a workshop on the pedagogical aspects of computational thinking." National Academies Press, Washington, DC.
NSF (National Science Foundation). (2004). "Simulation based engineering and science: A report on a workshop." Arlington, VA.
NSF (National Science Foundation). (2006). "Revolutionizing engineering science through simulation." Rep. of the National Science Foundation Blue Ribbon Panel on Simulation-Based Engineering Science, Washington, DC.
NSF (National Science Foundation). (2011). "Report of the high performance computing task force advisory committee for cyberinfrastructure task force on grand challenges." Washington, DC.
Okoli, C., and Pawlowski, S. D. (2004). "The Delphi method as a research tool: An example, design considerations and applications." Inf. Manage., 42(1), 15–29.
Prince, M., Vigeant, M., and Nottis, K. (2012). "Development of the heat and energy concept inventory: Preliminary results on the prevalence and persistence of engineering students' misconceptions." J. Eng. Educ., 101(3), 412–438.
Rossouw, A., Hacker, M., and de Vries, M. J. (2011). "Concepts and contexts in engineering and technology education: An international and interdisciplinary Delphi study." Int. J. Technol. Des. Educ., 21(4), 409–424.
Schwarz, C., Reiser, B. J., Achér, A., Kenyon, L., and Fortus, D. (2012). "Challenges in defining a learning progression for scientific modeling." Learning progression in science: Current challenges and future directions, A. C. Alonzo and A. Wenk Gotwals, eds., Sense Publishers, Boston.
Schwarz, C., and White, B. Y. (2005). "Metamodeling knowledge: Developing students' understanding of scientific modeling." Cognit. Instruction, 23(2), 165–205.
Schwarz, C. V., et al. (2009). "Developing a learning progression for scientific modeling: Making scientific modeling accessible and meaningful for learners." J. Res. Sci. Teach., 46(6), 632–654.
Shiflet, A. B. (2002). "Computer science with the sciences: An emphasis in computational science." ACM SIGCSE Bull., 34(4), 40–43.
Shiflet, A. B., and Shiflet, G. W. (2014). Introduction to computational science: Modeling and simulation for the sciences, Princeton University Press, Princeton, NJ.
Smith, C. L., Wiser, M., Anderson, C. W., and Krajcik, J. (2006). "Implications of research on children's learning for standards and assessment: A proposed learning progression for matter and the atomic-molecular theory." Meas. Interdiscip. Res. Perspect., 4(1–2), 1–98.
Streveler, R. A., Olds, B. M., Miller, R. L., and Nelson, M. A. (2003). "Using a Delphi study to identify the most difficult concepts for students to master in thermal and transport science." Proc., Annual Conf. of the American Society for Engineering Education, American Society for Engineering Education, Nashville, TN.
Thornton, K., Nola, S., Garcia, R. E., Asta, M., and Olson, G. B. (2009). "Computational materials science and engineering education: A survey of trends and needs." J. Miner. Metals Mater. Soc., 61(10), 12–17.
Van Someren, M. W., Barnard, Y. F., and Sandberg, J. A. C. (1994). The think aloud method: A practical guide to modelling cognitive processes, Vol. 2, Academic Press, London.
Vergara, C. E., et al. (2009). "Aligning computing education with engineering workforce computational needs: New curricular directions to improve computational thinking in engineering graduates." Frontiers in Education Annual Conf., IEEE, San Antonio.
Vieira, C., Magana, A. J., Falk, M. L., and Garcia, R. E. (2017). "Writing in-code comments to self-explain in computational science and engineering education." ACM Trans. Comput. Educ. (TOCE), in press.
Vieira, C., Magana, A. J., Roy, A., Falk, L. M., and Reese, J. M. (2016a). "Exploring undergraduate students' computational literacy in the context of problem solving." Comput. Educ. J., 7(1), 100–112.
Vieira, C., Magana, A. J., Roy, A., Falk, L. M., and Reese, J. M. (2016b). "In-code comments as a self-explanation strategy for computational science education." Proc., 123rd ASEE Annual Conf. and Exposition, American Society for Engineering Education, New Orleans.
Wiggins, G., and McTighe, J. (1997). Understanding by design, Association for Supervision and Curriculum Development, Alexandria, VA.
Wiggins, G., and McTighe, J. (2005). Understanding by design, Association for Supervision and Curriculum Development, Alexandria, VA.
WTEC (World Technology Evaluation Center). (2009). International assessment of research and development in simulation-based engineering and science, S. C. Glotzer, ed., Baltimore.
... Internationally, the development of a modelling competence is considered a key learning objective in the field of STEM education. Models are frequently employed to gain insights into the process of modelling (Khan & Krell, 2019; Magana, 2017; Pham & Tytler, 2022; Schwarz et al., 2009). During modelling, students engage in activities to generate, evaluate and modify models; they also develop their scientific practices, such as asking questions about natural phenomena, reasoning and scientific argumentation (Khan & Krell, 2019). ...
... Engineers use models as a language to enhance their engineering design processes and their computational understanding of a problem (Carberry & McKenna, 2014). For example, engineering models are used as design tools for designing technologies (Pleasants & Olson, 2019), analytic tools to support the study of complex phenomena, and predictive tools to anticipate the suitability of new designs (Magana, 2017). These processes are also used with models for scientific purposes. ...
... While the epistemic aims of modelling in science and engineering might be the same, other practices, such as prediction, differ slightly between the two domains. In science, the predictive power of models is used to develop and test hypotheses about the natural world, whereas in engineering it is also used to characterize future innovations, processes, or systems, with an added emphasis on decisions about something that may happen, such as risk mitigation (Magana, 2017). ...
Article
Models and modelling play a critical role in science education to engage students more fully in science practices. Few studies have investigated the nature of models and modelling in integrated STEM teacher education. This study examines pre-service science teachers' (PSTs) understanding of the nature of models and modelling in a STEM methods course. Models and modelling for authentic STEM are used as a theoretical lens for conceptualising PSTs' understanding of the nature of models and modelling. Interpretive research was used to analyse how this course contributed to PSTs' understanding of the nature of models and modelling along four dimensions: the meanings, purposes, processes, and complexity of models and modelling. Data were collected through questionnaires. Inductive content analysis was used to reveal distinct patterns in PSTs' understandings. The findings indicated that at the beginning of the course, PSTs understood models as replications of phenomena or prototypes. By the end of the course, they understood modelling as a practice to explain and predict phenomena in science and to solve problems and improve the quality of life through engineering, and they viewed modelling as a bridge between science and engineering within the context of an integrated STEM education. The PSTs showed marked shifts by the end of the course, demonstrating a deeper understanding of modelling as a dynamic process. They came to see the integration of science and engineering in STEM as a route to epistemic agency for their students and developed a greater appreciation of model complexity. This study suggests that introducing the nature of modelling in science and engineering assists the teaching of STEM. The implications of models and modelling for STEM teacher education are discussed.
... Just developing technical skills is not enough; industries need people who have the ability to collaborate and work with employees from diverse backgrounds [3]. Communication and teamwork have been identified as critical 21st-century skills that every young graduate must develop [2], [4], [5]. Studies [3], [6], [7] have demonstrated that teamwork has many benefits for students, such as promoting communication and collaboration skills and letting students take on the role of organizing tasks. ...
... The Dickinson and McIntyre [11] model emphasizes that communication is the key to teamwork. Studies [2], [4], [5], [18] also revealed that good teamwork communication helps students effectively set their goals, assign roles, and work together to achieve them. The student quoted below shows how much students valued good teamwork communication. ...
... Elements of authentic assessments that promote the transfer of KSAs have been theorized [7] but require further examination. Simulation-based learning has been applied to enhance education in various disciplines and to prepare students to make critical decisions, especially in engineering [8]–[10]. As a form of experiential learning, engineering simulation further provides a wide range of opportunities to practice complex skills in higher education and to facilitate effective learning [11]. ...
... Fortunately, digital tools today allow for predictive simulations in industrial process engineering [12]. With these tools, both industries [13]–[15] and especially engineering students [16], [17] can simulate large processes in a completely safe environment, without needing real production time to obtain scalable results, before implementing or modifying existing production systems, which entails high costs and risks. Process optimization in industrial manufacturing is a crucial aspect that leverages both traditional engineering principles and advanced technological tools to enhance the efficiency and effectiveness of production lines. The goal of process optimization is to minimize costs, maximize output, and maintain product quality, thereby ensuring that manufacturing processes are not only economically viable but also environmentally sustainable and socially responsible. ...
Article
This paper presents a didactic methodology applied to engineering studies. Using digital tools, a methodology is developed to model and simulate an industrial manufacturing process. The methodology begins with an analysis of the manufacturing process, in which the students take an analytical approach to modeling the production system under study. Next, a digital tool is used to model that system based on the previously analyzed parameters. By simulating the model, engineering students analyze the resulting times and costs specific to each process and product. Based on these results, improvement proposals are presented to optimize the process, which is modeled and simulated again to verify the efficiency and benefit of the proposed improvements. With this methodology, engineering students are introduced to a scalable context of real industry where improvements can be tested safely, while developing skills in digital tools and process engineering.
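To make the model-simulate-improve loop described above concrete, the following minimal sketch shows the kind of exercise involved: simulate a serial production line, compute times and costs, then re-simulate a proposed improvement for comparison. The stations, cycle times, and costs below are hypothetical placeholders, not values from the cited study.

```python
# Minimal sketch only: the paper does not publish its model, so every
# station, cycle time, and cost below is a hypothetical placeholder.

def simulate_line(stations, n_units):
    """Simulate a serial production line with unlimited buffers.

    stations: list of (cycle_time_min, cost_per_unit) tuples.
    Returns (makespan_min, total_cost) for producing n_units.
    """
    free_at = [0.0] * len(stations)  # time at which each station is next free
    total_cost = 0.0
    makespan = 0.0
    for _ in range(n_units):
        t = 0.0  # time the current unit becomes available to station 0
        for i, (cycle, unit_cost) in enumerate(stations):
            start = max(t, free_at[i])  # wait if the station is still busy
            t = start + cycle           # unit leaves station i at time t
            free_at[i] = t
            total_cost += unit_cost
        makespan = t
    return makespan, total_cost

baseline = [(4.0, 1.20), (6.0, 0.80), (3.0, 0.50)]  # station 2 is the bottleneck
improved = [(4.0, 1.20), (4.5, 0.95), (3.0, 0.50)]  # proposed process change

for label, line in (("baseline", baseline), ("improved", improved)):
    makespan, cost = simulate_line(line, 100)
    print(f"{label}: 100 units in {makespan:.1f} min, cost ${cost:.2f}")
```

In the baseline run the 6 min station dominates the makespan; the improved line trades a small cost increase for a much shorter makespan, which is the kind of trade-off students would weigh in an improvement proposal.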
... For Magana (2017), modeling and simulation refer to a combination of processes in which the behavior of a system is demonstrated or predicted through a computational representation. These processes are highly interrelated and are sometimes used interchangeably. ...
Article
The present study aims to assess the development of mathematical modeling skills in engineering students in a differential equations course at the Universidad Cooperativa de Colombia. A quasi-experimental quantitative approach is adopted, dividing the participants into an experimental group and a control group. In the experimental group, students become familiar with mathematical modeling techniques that emulate the approach used by engineers when analyzing physical systems. In the control group, traditional teaching techniques are employed. The results indicate that students in the experimental group demonstrate a remarkable ability to analyze physical systems while identifying variables, parameters, and physical laws governing the behavior of these systems. In contrast, these skills are not observed in the control group. In conclusion, incorporating mathematical modelling tools in the classroom from an engineering perspective has a major impact on student skill development.
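The study does not reproduce its classroom systems, so the following generic sketch only illustrates the kind of modeling practiced by the experimental group: identify the variable, a parameter, and the governing physical law, then solve the resulting differential equation numerically and check it against the analytic solution. All parameter values are made up.

```python
# Generic illustration, not taken from the cited course: Newton's law of
# cooling, dT/dt = -k * (T - T_env), solved with a simple Euler loop and
# compared against the known analytic solution.
import math

k, T_env, T0 = 0.3, 20.0, 90.0  # 1/min, ambient deg C, initial deg C (invented)
dt, t_end = 0.1, 10.0           # Euler step size (min) and time horizon (min)

T, t = T0, 0.0
while t < t_end - 1e-9:
    T += dt * (-k * (T - T_env))  # one Euler step of the model equation
    t += dt

T_exact = T_env + (T0 - T_env) * math.exp(-k * t_end)
print(f"Euler: {T:.2f} deg C   analytic: {T_exact:.2f} deg C")
```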
... Background Simulation and modeling are common in modern engineering [2,3] and can offer competitive advantages [4]. There are three main advantages provided by simulation. ...
... Future research can investigate whether these trends continue in different contexts with larger populations of students at various levels. Similarly, although these findings present insights into optimization facets at different levels, the study does not explicitly connect these facets to other learning progressions or trajectories of student ideas at each of these stages (e.g., Magana, 2017). Future work can explore in more detail the kinds of sequences or progressions of different optimization facets at different levels. ...
Article
Despite an increasing focus on integrating engineering design in K-12 settings, relatively few studies have investigated how to support students to engage in systematic processes to optimize the designs of their solutions. Emerging learning technologies such as computational models and simulations enable rapid feedback to learners about their design performance, as well as the ability to research how students may or may not be using systematic approaches to the optimization of their designs. This study explored how middle school, high school, and pre-service students optimized the design of a home for energy efficiency, size, and cost using facets of fluency, flexibility, closeness, and quality. Results demonstrated that students with successful designs tended to explore the solution space with designs that met the criteria, with relatively lower numbers of ideas and fewer tightly controlled tests. Optimization facets did not vary across different student levels, suggesting the need for more emphasis on supporting quantitative analysis and optimization facets for learners in engineering settings.
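The article's simulation environment is not shown here; as a purely illustrative sketch, a brute-force search over a toy design space conveys what systematic exploration of the solution space against energy and cost criteria looks like. The design variables, formulas, and budget are invented assumptions, not the study's model.

```python
# Invented toy model, not the study's simulation: enumerate candidate home
# designs and keep the most energy-efficient one that fits a cost budget.
import itertools

def evaluate(insulation_cm, window_m2):
    energy = 9000 / insulation_cm + 150 * window_m2       # kWh/yr (toy formula)
    cost = 40000 + 300 * insulation_cm + 800 * window_m2  # USD (toy formula)
    return energy, cost

best = None
for ins, win in itertools.product(range(5, 31, 5), range(2, 11, 2)):
    energy, cost = evaluate(ins, win)
    feasible = cost <= 55000  # cost criterion; size is held fixed in this toy
    if feasible and (best is None or energy < best[0]):
        best = (energy, cost, ins, win)

energy, cost, ins, win = best
print(f"best design: {ins} cm insulation, {win} m^2 windows -> "
      f"{energy:.0f} kWh/yr at ${cost}")
```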
... In today's engineering workspaces, modeling, simulation, and computational tools are commonly used to aid in the research and design of systems. Because of this, modeling and simulation capabilities have been included in several scientific and engineering fields as analytical tools to improve the understanding of complex phenomena and as predictive tools to evaluate the viability of new designs (Magana 2017). ...
Chapter
The primary focus of this research is to develop digital human models (DHM) that include accurate posture and motion prediction models for a diverse population. The posture and motion prediction models currently employed in DHMs must be adapted and improved based on actual motion data in order to provide validity for simulations of challenging dynamic activities (Ahmed et al. 2018). Additionally, if accurate models for predicting human posture and motion are developed and applied, they can be combined with psychophysical and biomechanical models to provide a deeper understanding of dynamic human performance and population-specific limitations, and these new DHM models will eventually serve as a useful tool for ergonomic design. Along these lines, we are working to predict driver comfort and postures when designing a Mars rover's seat and peripherals using RAMSIS software. The core of RAMSIS is a highly accurate three-dimensional human model that is built on anthropometry databases from around the globe and can simulate humans with a variety of body measurements. To assess comfort during the design process, we employ a variety of additional analysis techniques, including those for comfort studies, eyesight, reachability, and force. Keywords: Ergonomic Analysis, Digital Human Modeling, RAMSIS, Simulation
Article
Computational thinking has been recognized as a collection of understandings and skills required for new generations of students who are not only proficient at using tools, but also at creating them and understanding the implications of their capabilities and limitations. This study proposes the combination of modeling and simulation practices with disciplinary learning as a way to synergistically integrate and take advantage of computational thinking in engineering education. This paper first proposes a framework that identifies different audiences of computing and related computational thinking practices at the intersection of computer science and engineering. Then, based on a survey of 37 experts from industry and academia, this paper also suggests a series of modeling and simulation practices, methods, and tools for such audiences. Finally, this paper reports experts' identified challenges and opportunities for integrating modeling and simulation practices at the undergraduate level.
Conference Paper
Computational science and engineering is an important field that integrates computational tools and methods with disciplinary sciences and engineering to solve complex problems. However, several research studies and national agencies report that engineering students are not well prepared to use or create these computational tools and methods in the context of their discipline. Furthermore, some of the skills within computational science and engineering (e.g., programming) can be difficult to learn. This study explores potential pedagogical strategies for the implementation of worked-examples in the context of computational science and engineering education. Students' self-explanations of a worked-example are collected as in-code comments and analyzed to identify effective self-explanation strategies. The results from this study suggest that students' in-code comments: (1) can be used to elicit self-explanations and engage students in exploring the worked-example; and (2) show differences that can be used to identify the self-explanation effect.
Book
Prepared by the Body of Knowledge Committee of the Committee on Academic Prerequisites for Professional Practice of ASCE. This report focuses on outcomes of proposed changes in the way civil engineering is taught and learned, including the knowledge, skills, and attitudes necessary for entry into professional practice. The first Body of Knowledge report, published in 2004, outlined 15 areas of knowledge that, when fulfilled by means of formal education and experience, prepare an engineer for practice at the professional level. This new edition expands the 15 outcomes to 24, organized into three categories: foundational, technical, and professional. It also makes use of Bloom's Taxonomy to describe minimum cognitive levels for each outcome. This report offers a detailed roadmap for engineering educators and professionals to change the way civil engineering is practiced by reforming the manner in which tomorrow's civil engineers are prepared for tomorrow. The body of the report defines the body of knowledge, presents the means of fulfilling those requirements, and offers guidance for faculty, students, engineer interns, and practitioners. The numerous appendixes include ASCE Policy 465, Emergence of the Body of Knowledge; an explanation of Bloom's Taxonomy; rubrics for measuring a student's knowledge according to the taxonomy; and the importance of education in humanities and social sciences, sustainability, globalization, public policy, and personal attitudes. This report is essential reading for anyone involved in the education of student and younger engineers. © 2008 by the American Society of Civil Engineers. All Rights Reserved.
Article
This article presents two case studies aimed at exploring the use of self-explanations in the context of computational science and engineering (CSE) education. The self-explanations were elicited as students’ in-code comments of a set of worked-examples, and the cases involved two different approaches to CSE education: glass box and black box. The glass-box approach corresponds to a programming course for materials science and engineering students that focuses on introducing programming concepts while solving disciplinary problems. The black-box approach involves the introduction of Python-based computational tools within a thermodynamics course to represent disciplinary phenomena. Two semesters of data collection for each case study allowed us to identify the effect of using in-code comments as a self-explanation strategy on students’ engagement with the worked-examples and students’ perceptions of these activities within each context. The results suggest that the use of in-code comments as a self-explanation strategy increased students’ awareness of the worked-examples while engaging with them. The students’ perceived uses of the in-code commenting activities include: understanding the example, making a connection between the programming code and the disciplinary problem, and becoming familiar with the programming language syntax, among others.
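A hypothetical worked-example makes the strategy easier to picture; nothing below is taken from the article's materials. Instructor-provided code (trapezoidal integration here stands in for the disciplinary problems used in the two cases) is annotated by the student with in-code comments that self-explain each step.

```python
# Hypothetical worked-example in the spirit of the article, with the kind
# of in-code comments a student might write to self-explain each step.

def trapezoid(f, a, b, n):
    h = (b - a) / n              # student: width of each of the n sub-intervals
    total = 0.5 * (f(a) + f(b))  # student: the two endpoints count half each
    for i in range(1, n):
        total += f(a + i * h)    # student: every interior point counts fully
    return total * h             # student: scale the running sum by the step

# student: integrating x**2 on [0, 1] should return roughly 1/3
print(trapezoid(lambda x: x * x, 0.0, 1.0, 100))
```

Per the article, eliciting comments line by line in this way is what increased students' awareness of the worked-example while they engaged with it.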
Article
This mixed-methods sequential explanatory study investigates disciplinary learning gains when engaging in modeling and simulation processes following a programming or a configuring approach. It also investigates the affordances and challenges that students encountered when engaged in these two approaches to modeling and simulation.
Article
Modeling is a core skill for engineering students and a pervasive feature of the engineering curriculum. Engineering students engage in modeling anytime they use an equation, flow chart, force diagram, or any other representation of some physical phenomenon, regardless of discipline. In this way modeling relates to both the design process and analysis; however, students do not always recognize the full and nuanced ways that these two interact. This paper reports results from our research exploring the role that computational, analytical, and modeling abilities play in innovation, in the context of engineering design education. Our study reports results on faculty and students' conceptions of the role of modeling in design. Specifically, our study sheds light on the variations in how faculty and students describe how to model a design idea or solution, and the different ways each group perceives how models can be useful in the design process. Our findings indicate that students recognize the descriptive value of physical models but mention the more abstract mathematical or predictive nature of modeling less often. In addition, we found significant differences between student and faculty responses in offering mathematics or theory as an approach to modeling a design solution.
Article
The interdisciplinary field of computational science combines simulation, visualization, mathematical modeling, programming, data structures, networking, database design, symbolic computation, and high-performance computing with various scientific disciplines. Despite the shortage of computational scientists, few programs and computational science textbooks appropriate for undergraduates exist. After extensive discussions on enhancing computer use in the sciences, Wofford College faculty members designed a curriculum for students majoring in science or mathematics, called "Emphasis in Computational Science." A student electing this program completes a Bachelor of Science, three existing courses (Programming in C++, Data Structures, Calculus I), two new computational science courses (Scientific Programming, Data and Visualization), and a summer internship. Application-rich course modules that have been developed in collaboration with scientists are employed as the textbooks for the computational science courses. Available through the World Wide Web, these modules can instruct and provide applications for a variety of courses [4].