Examining Engineering Design Cognition with Respect to
Student Experience and Performance*
GREG J. STRIMEL
Purdue University, 349 Young Hall, West Lafayette, Indiana, 47907, USA. E-mail: gstrimel@purdue.edu
EUNHYE KIM
Purdue University, Seng-Liang Wang Hall, 516 Northwestern Ave., Suite 3500, West Lafayette, IN 47906, USA.
E-mail: kim1906@purdue.edu
SCOTT R. BARTHOLOMEW
Purdue University, 351 Young Hall, West Lafayette, Indiana, 47907, USA. E-mail: sbartho@purdue.edu
DIANA V. CANTU
Old Dominion University, 5115 Hampton Boulevard, Norfolk, VA 23529, USA. E-mail: dcant005@odu.edu
This study investigated the design cognition and performance results of secondary and post-secondary engineering
students while engaged in an engineering design task. Relationships between prototype performance and design cognition
were highlighted to investigate potential links between cognitive processes and success on engineering design problems.
Concurrent think-aloud protocols were collected from eight secondary and 12 post-secondary engineering students
working individually to design, make, and evaluate a solution prototype to an engineering design task. The collected
protocols were segmented and coded using a pre-established coding scheme. The results were then analyzed to compare the
two participant groups and determine the relationships between students’ design cognition, engineering experience level,
and design performance. Significant differences between participants with secondary engineering experiences and those
without were found in regards to the amount of time various cognitive processes were employed to complete a design task.
For the given design scenario, students with secondary engineering experiences achieved significantly higher rubric scores
than those without. Improved design performance was also found to be significantly correlated with more time employing
the mental processes of analyzing, communicating, designing, interpreting data, predicting, and questioning/hypothesizing.
Important links between educational experiences in engineering design, prior to college, and student success on engineering
design problems may indicate necessary shifts in student preparation.
Keywords: engineering; design; cognition; performance
1. Introduction
As initiatives to integrate engineering at the P-12
level continue to increase [1–5], it has become
evident that there is widespread societal agreement on
fostering students' engineering design abilities as a
means to promote a better-educated populace with
the 21st-century skills necessary for future economic
success [6–8]. This expanded interest in engineering
can be attributed to the idea that immersing stu-
dents in engineering design experiences, which natu-
rally tie mathematics and science learning together
through solving authentic problems, is an essential
approach to provide new levels of relevancy to
education, motivate students in learning, make
STEM careers more accessible, and prepare stu-
dents with the skills to address the major challenges
facing the world today and in the future [1, 6, 9, 10].
Consequently, engineering design has become a
central component of P-12 education with both
technology education [12] and science education
[4, 11] incorporating engineering design in their
standards and curriculum. In this context, it
becomes important to assist educators in properly
enacting interventions that better enable students to
employ engineering design practices that produce
the most viable solutions to authentic problems.
However, as Dym, Agogino, Eris, Frey and Leifer
[13] explain, engineering design is challenging to
understand, teach, and evaluate because many
efforts to infuse engineering design are devoid of an
empirical understanding of students’ cognitive pro-
cesses as they engage in the engineering design
process [2].
To fill this void, researchers have begun to study
the design cognition of adolescents, college stu-
dents, and practitioners engaged in engineering
design activities [1, 2, 5, 14–21]. However, as
Grubbs [22] describes, even after decades of design
cognition research there is still minimal agreement
on how people design and limited examinations on
effective ways to bridge design research with teach-
ing and learning strategies. This concern may be
attributed to design cognition research focusing on
time allocation to a set of predetermined design
process steps [5] for only a segment of a student's
full design activity. In doing so, researchers often lack oppor-
tunities to compare a student’s cognitive activity to
the actual outcome of their problem-solving pro-
cess. As Atman and Bursic [23] describe, examining
both the design process and design product can
enable one to explore the potential relationships
between the process the student follows and the
quality of their solutions. Atman and Bursic con-
tinue to explain that an understanding of this
potential relationship may help identify successful,
and unsuccessful, procedures in engineering design.
These findings could then be used to establish
interventions for the improved teaching of engineer-
ing design, help students to develop as more effec-
tive problem solvers [25], and support the
evaluation of current curricular efforts in regards
to engineering [1]. Consequently, it is important for
researchers to examine engineering design cognition
in a manner that enables the analysis of how a
student’s thinking can influence the outcome of
their engineering design process. In addition, in
light of the emphasis on engineering design at
various levels of education, it is important to examine
engineering design cognition in a manner that
enables the analysis of students at various experi-
ence levels. As described by Wilson et al., [5], this
type of research may provide ‘‘useful heuristics for
secondary engineering and science teachers who
seek to bridge adolescents’ existing engineering
practices to the formal practices of engineering by
identifying gaps and commonalities between the
two groups’ practices [p. 3].’’
Accordingly, the researchers enacted multiple
exploratory case studies to describe the cognitive
activities of experienced secondary engineering stu-
dents and traditional first-year engineering under-
graduate students as they designed, made, and
evaluated a solution prototype to an authentic
design task. The research was conducted intention-
ally to compare each student’s design process to the
product of their process as well as highlight poten-
tial indicators for developing more effective solu-
tions. In addition, the research design enabled the
comparison of these results with students’ experi-
ence levels. In doing so, the results may assist in
identifying ways in which to improve the teaching
and learning of engineering design.
2. Background
2.1 Engineering design
Design is widely considered a central element of
engineering [13]. It is believed that all practicing
engineers perform some form of design function and
as such, engineering programs accredited by the
ABET must ensure graduating students possess
the abilities to apply design procedures to solve
problems [25, 26]. However, as Gero [27] describes,
design is not limited to engineering and has been a
human function throughout history as people con-
tinuously work to improve their lives and capabil-
ities. Therefore, as Simon [28] stated, the act of
design can be viewed as a natural human process
that involves identifying and understanding a pro-
blem or opportunity and devising a plan of action to
resolve the problem or address the opportunity.
With these explanations, one may classify design
simply as a general problem-solving process that
people employ to improve their situation. Conse-
quently, design may sometimes be enacted merely as
an inadvertent trial-and-error approach to solving
problems in classroom environments. However,
engineers do not just design; rather, they are habi-
tually trained to follow a more explicit and inten-
tional process toward developing solutions known
as engineering design. This type of design involves
developing predictions through the deliberate appli-
cation of mathematical, scientific, and analytical
modeling practices to determine whether the condi-
tions for a potential solution are favorable [29].
Thus, Dym et al. [13] define engineering design as:
a systematic, intelligent process in which
designers generate, evaluate, and specify concepts
for devices, systems, or processes whose form and
function achieve clients’ objectives or users’ needs
while satisfying a specified set of constraints [p.
104].
The practices of engineering design are often
iterative in nature and integrate knowledge from a
variety of fields within social constructs to develop
and manipulate the designed world [30]. The key
elements of this practice involve establishing the
specifications for a successful design, conceiving
innovative solution designs, narrowing potential
solutions through predictive analysis, and optimiz-
ing a design through analytical modeling in order to
best meet competing criteria and constraints [30–32].
These elements all draw profoundly upon complex
cognitive functions [13, 17] and thus, it becomes
vital to better understand engineering design think-
ing for the potential of developing methods for
improving curriculum and instruction [33].
2.2 Engineering design cognition
As the role of design in engineering curriculum at all
levels of education has been established, it is funda-
mental to comprehend ways in which to best teach
and learn the practices of engineering design [13,
34]. Because design involves the complex mental
processes of inquiry, synthesis, analysis, and deci-
sion-making, research investigating how designers
think and learn has been conducted across multiple
disciplines and professions since the 1970s [18].
Much of this research examined the cognitive pro-
cesses or design practices of engineers, architects, or
post-secondary students as they worked to develop
a solution design [17, 22, 35]. The intent of this track
of research has primarily been to establish better
methods to prepare future designers [36]. Wilson et
al., [5] claim it is essential to examine engineering
design cognition at all levels from adolescents to
advanced practitioners as a means to identify ways
in which to fully support adolescents in developing
the habits of mind practiced by professional engi-
neers.
Currently, the increased prominence of P-12
engineering education has likely led to an increase
in design cognition research with primary and
secondary students. Much of this research aims
toward better cultivating student design thinking
capabilities and integrating disciplinary content
with process knowledge. The common method for
examining design cognition has been the concurrent
think-aloud protocol procedure employing verbal
protocol analysis. The concurrent think-aloud pro-
cedure is used to collect a person's actions while
solving a predetermined design task along with their
own verbal interpretation of their thought processes
as they perform those actions [2]. The resulting
verbal interpretations are then analyzed using a
verbal protocol analysis technique, which typically
applies a previously derived coding scheme over a
video recording or transcription of a design session
[37]. The coded data are then used to describe the
processes and procedures students follow to design.
However, much of this type of research paints an
incomplete picture of design cognition and provides
limited ways in which to synthesize findings across
multiple studies. For example, the researchers have
identified 14 unique design cognition studies [1, 2, 5,
14, 17–21, 38–42] between 1995 and 2016 involving
participants at the P-12 level. Of these studies, only
two required students to produce a physical proto-
type of their solution—suggesting that over 85% of
the studies did not provide any means for compar-
ing process with product performance. We assert
that only studying students through the point of
producing a design concept limits the understanding
of the complex mental processes involved in the
iterative production of a functional prototype. In
addition, the majority of these design cognition
studies collect student data in group settings,
which, although beneficial, does not capture an
individual student’s thought process(es). Addition-
ally, the 14 identified studies employed eight differ-
ent coding schemes to analyze data, which are all
based upon different conceptual foundations. The
variety of coding schemes limits the ability to
compare findings across studies, which is
problematic as most design studies involve small
samples of student populations.
3. Statement of the problem
While engineering students are often taught to use
an idealistic engineering design process to solve
problems, it is unclear exactly how people with
various levels of experience cognitively navigate a
complex and multifaceted engineering design pro-
blem. With greater insight into design cognition,
educators may be better equipped to manage the
difficulties in planning for, and assessing, student
abilities in producing viable solutions to engineering
design tasks. Therefore, the purpose of this study
was to identify the cognitive processes employed by
experienced secondary and traditional post-second-
ary engineering students to solve an engineering
design problem in an effort to expand the under-
standing of how these students, with different back-
grounds, cognitively navigate an engineering design
process from design conception through the pro-
duction of a physical prototype. In addition, this
research was intentionally designed to compare
students' thinking process(es) with the effectiveness
of their physical prototype. This enabled the
researchers to investigate potential relationships
between a student's process and their designed
product—allowing for the identification of poten-
tially significant cognitive predictors of success in
engineering design.
4. Research objectives
The research objectives that guided this study were:
RO1: Identify the cognitive processes experienced
secondary engineering students use to design,
make, and evaluate functional prototypes to an
engineering design problem.
RO2: Identify the cognitive processes traditional
post-secondary engineering students use to
design, make, and evaluate functional prototypes
to an engineering design problem.
RO3: Compare the design cognition and perfor-
mance of experienced secondary and traditional
post-secondary engineering students.
RO4: Determine potential identifiers within engi-
neering design cognition related to student apti-
tude in successfully designing and making
solutions.
Working hypotheses were established for the
third research objective of comparing the design
cognition of experienced secondary engineering
students and traditional post-secondary engineer-
ing students. The researchers expected the tradi-
tional post-secondary engineering students to have
completed a more rigorous study of science and
mathematics for entry into a college engineering
program and have also been exposed to college level
engineering design models. Therefore, the post-
secondary students should exhibit a more informed
and analytical process of solution development
while the secondary students should exhibit a
more practical, prototype-oriented process
of solution development. Specifically, the research-
ers hypothesized that the post-secondary students
would (a) devote more time to designing a solution
(i.e., cognitive processes such as Defining Pro-
blem(s), Designing, Analyzing, and Predicting)
and less time physically making a solution (i.e., the
cognitive process Model/Prototype Construct-
ing), (b) employ the scientific and mathematical
cognitive processes (Computing, Interpreting Data,
Observing, Experimenting, and Questioning/
Hypothesizing) for a greater amount of time than
secondary students, and (c) develop more effective (i.e., better at
solving the problem) solutions to the design task.
Lastly, a working hypothesis for the fourth research
objective was that specific cognitive processes could
be identified as significant predictors of design
performance.
5. Methodology
5.1 Data collection and analysis
This study employed a multiple exploratory case
study approach using a concurrent think-aloud
protocol procedure to identify the cognitive pro-
cesses used by both secondary and traditional post-
secondary engineering students as they worked to
develop physical solutions to an engineering design
task. The concurrent think-aloud protocol proce-
dure is a method used to capture a participant’s
behaviors and commentary on their own thought
processes as they occur during a predetermined
activity such as an engineering design challenge
[2]. The resulting verbal commentary is then ana-
lyzed, using a verbal protocol analysis technique,
which applies a previously derived coding scheme to
audio/video recordings of a design session [39].
Ericsson and Simon [43] explain that concurrent
think-aloud protocols represent one’s directly ver-
balized cognitive processes and thus, they maintain
that the verbal protocol analysis can be an effective
research methodology to examine an individual’s
cognitive processes. Atman and Bursic [23] also
state that using a verbal protocol analysis for
assessing the cognitive processes of engineering
students is an appropriate method for understand-
ing the processes used while developing a design
solution. Therefore, the participants in this study
were asked to verbalize their thoughts while
engaged in an engineering design task. The partici-
pants were also required to complete the task alone,
instead of in a group setting, to help capture their
individual thought processes while also minimizing
any outside interference. To facilitate data collec-
tion, participants were equipped with point-of-view
cameras that captured verbal commentary as well as
the participants’ non-verbal cues (i.e., hand move-
ments and directed attention). The combination of
verbal and observational protocols, in addition to
the participants’ design artifacts, allowed the trian-
gulation of data—as the verbal protocol alone
would be weak if used exclusively in capturing a
participant’s cognitive processes [35]. In addition, a
demographics survey was used to identify the prior
experiences of the recruited participants. Fig. 1
provides an overview of the methodology for this
study.
Fig. 1. The overview of procedures used for the study.
The engineering design challenge (see Table 1) for
this study was developed to support the collection of
verbal and observational protocol while partici-
pants designed, made, and evaluated their physical
prototype. Data were collected as participants pro-
gressed through the full design process spectrum
rather than only collecting data while they devel-
oped a solution concept. This enabled the research-
ers to compare a participant’s cognitive strategies
with the effectiveness of their final prototype. This
process may be one method for bridging design
cognition research with practice as it can help
determine cognitive predictors for developing suc-
cessful resolutions to design challenges and can
identify design heuristics for enhancing design cap-
abilities. Consequently, the design challenge for this
study was developed to enable the researchers to
collect quantitative data on the effectiveness of each
participant’s solution to better determine potential
relationships between solution effectiveness and the
mental strategies employed by participants. In addi-
tion, the classroom instructors used the rubric in
Table 2 to holistically evaluate the student projects.
The engineering design challenge was presented
to the participants as an ill-structured, open-ended,
real-world issue. The challenge required partici-
pants to define the problem, identify the criteria
and constraints, determine the materials needed for
their proposed solution prototype, and then make,
test, and optimize their solution. The posed problem
tasked the participants with designing, making, and
evaluating a system or device that would help reduce
the contamination of water in a developing nation
after the onset of a natural disaster. Each partici-
pant was asked to work in isolation and was not
limited by time. The participants were all provided
the same materials and production facility as well as
a computer-based turbidity sensor to evaluate how
well their device removes potential contaminants
from a water sample. Figs. 2 and 3 provide samples
of the student design work for the challenge pro-
vided.
Following data collection, the recordings of each
participant’s enacted design process were divided
into three distinct phases: designing, making, and
evaluating. See Table 3 for a description of these
phases. The concurrent think-aloud protocols for
each phase were simultaneously segmented and
coded using the 17 mental processes for technolo-
gical problem solving defined and validated by
Halfin [44]. The operational definitions of these
processes and sample utterances are provided in
Table 4. However, based on a review of literature,
the mental process of Modeling was determined by
the researchers to be too similar to the other codes of
Model/Prototype Constructing and Designing. The
inability to differentiate between these codes was
stated in the original work by Halfin [44]. As a result,
the use of Modeling as a mental processing code was
avoided and the actions that could be considered
Modeling were coded as either Designing or Model/
Prototype Constructing.
Table 1. Engineering design task as presented to the participants
Introduction In many developing countries, clean water is not readily accessible and therefore disease and illness are spread. This
is especially true in the aftermath of natural disasters in these areas. While there are many challenges related to
clean water, purification is an important part of many water treatment processes.
Problem Statement People in developing countries do not have continuous access to clean water, especially after the onset of a natural
disaster. Water in these situations needs significant purification. However, water purification units are expensive
and not easy to obtain. Therefore, you are tasked to design an inexpensive, easy to use, easy to assemble, durable,
and low maintenance water purification system using low cost, readily available materials to quickly remove
contaminants from water. You will focus on reducing the turbidity of a sample of water.
Testing Performance Turbidity is a measure of the lack of clarity (cloudiness) of water and is a key test of water quality. Turbidity is
apparent when light reflects off of particles in the water. Some sources of turbidity include soil erosion, waste
discharge, urban runoff, and algal growth. In addition to creating an unappealing cloudiness in drinking water,
turbidity can be a health concern. It can sustain or promote the growth of pathogens in the water distribution
system, which can lead to the spread of waterborne diseases. Turbidity is measured in Nephelometric Turbidity
Units, NTU. Water is visibly turbid at levels above 5 NTU. However, the standard for drinking water is typically
1.0 NTU or lower.
Prototype Materials You are not limited to any specific materials.
You can use any materials necessary to create the best solution.
You should not be concerned with material availability.
You should design your solution to best meet the specified criteria and constraints.
Equipment/Supplies Computer and Internet access, distilled water, contaminated water, water sample bottle with lid, paper towel,
bucket, Vernier turbidity sensor/equipment, LabQuest Mini, Logger Pro software.
Deliverables Functioning Prototype of Quality Construction.
Project Journal.
Solution Analysis—A summary of the details of the design, its benefits, uses, and other important information
that explains the design solution.
To enable the coding process, the researchers used Hill's [47] Observational Procedures for
Technology Education Mental Processes
(OPTEMP) computer analysis tool, which permits
a researcher to both segment and code verbal
protocols simultaneously while observing video
recordings. Once each video is coded, the
OPTEMP tool generates a spreadsheet with the
quantity of time each participant employed each
cognitive process. To ensure the reliability of this
procedure, two coders independently coded each
participant's protocols and a Pearson's r correlation
coefficient between the coding results was calcu-
lated. The Pearson's r calculation revealed a reliable
level of consistency in the coding results. The mean
correlation coefficient between codes, for all 20
participants, was 0.902 (n = 17 [represents the
number of mental processes], p < 0.001), demonstrat-
ing highly reliable results in the identified codes.
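To make this coding and reliability procedure concrete, the following sketch illustrates, under stated assumptions, how coded segments could be aggregated into per-process times and how an inter-coder Pearson's r could then be computed. The segment data, their layout, and the helper function below are hypothetical illustrations, not the OPTEMP tool itself; SciPy is assumed to be available.

```python
from scipy.stats import pearsonr

# The 17 mental processes from Halfin [44] used as codes in this study.
PROCESSES = [
    "Analyzing", "Communicating", "Computing", "Creating", "Defining Problems",
    "Designing", "Experimenting", "Interpreting Data", "Managing", "Measuring",
    "Modeling", "Model/Prototype Constructing", "Observing", "Predicting",
    "Questioning/Hypothesizing", "Testing", "Visualizing",
]

# Hypothetical coded segments (start_sec, end_sec, code) produced independently
# by two coders for the same recorded design session.
coder_a = [(0, 95, "Managing"), (95, 260, "Analyzing"), (260, 310, "Designing")]
coder_b = [(0, 90, "Managing"), (90, 255, "Analyzing"), (255, 310, "Designing")]

def time_per_process(segments):
    """Sum the seconds coded to each mental process across all segments."""
    totals = {p: 0.0 for p in PROCESSES}
    for start, end, code in segments:
        totals[code] += end - start
    return totals

totals_a = time_per_process(coder_a)
totals_b = time_per_process(coder_b)

# Inter-coder reliability: Pearson's r over the two per-process time vectors
# (n = 17 codes), mirroring the per-participant correlations reported above.
vec_a = [totals_a[p] for p in PROCESSES]
vec_b = [totals_b[p] for p in PROCESSES]
r, p_value = pearsonr(vec_a, vec_b)
print(f"Inter-coder Pearson's r = {r:.3f} (p = {p_value:.3f})")
```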
5.2 Participants
This study included twenty purposefully selected
participants: eight experienced secondary engineer-
ing students (age 16–18) and 12 post-secondary
students (age 18+) who were within their first
year of an engineering major at a university. The
experienced secondary students were selected based
on their involvement in engineering/technology
coursework in high school. Each of these students
was enrolled in the capstone course of the Project
Lead the Way [48] high school engineering program
at two high schools in the southeast region of the
United States. The post-secondary engineering stu-
dents were enrolled in the first required engineering
design course at a land-grant, space-grant, and
research-intensive public university in the Appala-
chian region of the southern United States.
Table 2. Engineering design challenge rubric
Category 5 – 4 Points 3 – 2 Points 1 – 0 Points
Research Thoroughly researched and
documented existing solutions
and necessary concepts.
Few existing solutions and
necessary concepts were
researched and documented.
No evidence that research was
conducted.
Multiple Solutions Developed multiple conceptual
solution ideas.
Developed only a few conceptual
solution ideas.
Did not consider multiple
solution ideas.
Design Justification A robust justification for
following a particular design idea
is clearly stated.
A weak justification for following
a particular design idea is stated.
A justification for following a
particular design idea is not
provided.
Material Selection Appropriate materials were
selected and properly
manipulated to make a quality
solution meeting the established
criteria and constraints.
A few of the materials used were
of quality and enabled the
making of a solution that met all
of the criteria and constraints.
Materials used did not aid in the
creation of a quality solution.
Prototype Performance After filtration, the clarity of the
water is less than 1 NTU.
After filtration, the turbidity was
reduced.
After filtration, the clarity of the
water has not changed.
Prototype Durability The final product received no
damage or wears and required no
adjustments or repairs during
testing.
The final product received some
damage or wear during testing
but was easily repaired. Minor
adjustments were required.
The final product received
significant damage or wear
during testing that was not easily
repaired and interfered with
testing.
Prototype Use The final product could be easily
set up and used with little or no
instruction.
The final product would require
careful set up with some
instruction.
The final product is very difficult
to set up and requires extensive or
complicated instructions.
Engineering Notebook All Best Practices for Engineering
Notebook are applied
60% of Best Practices for
Engineering Notebook are
applied. The quality of
documented information is poor.
Less than 40% of Best Practices
for Engineering Notebook are
applied. Few or no Engineering
notebook entries are included.
Prototype Testing Test procedures are followed and
correct data are collected. The
student is knowledgeable
regarding the reason for the test,
each step in the procedure, and
the significance of the data.
Minor deviations in test
procedures and data collection
occur. The student is unfamiliar
with the reason for the tests
performed.
Little to no evidence exists to
indicate that prototype test
procedures were conducted.
Prototype Revision The test evaluation results in
suggestions for improvement.
Detailed description of the design
modifications that were made
based upon the results of
prototype testing.
The test evaluation results in
suggestions for improvement.
Less than adequate description of
the design modifications that
were made based upon the results
of prototype testing.
Little to no evidence exists that
revisions are considered or made.
The participants were purposefully selected from the
introductory engineering course as it enabled data
to be collected from a ‘‘traditional’’ cohort of
students beginning their pursuit of an engineering
major while being introduced to the concept of
engineering design. The term ‘‘traditional’’ is used
in connection with commonly practiced metrics
used for admittance into an engineering major,
which can be broadly viewed as requiring students
to have completed a series of advanced mathematics
and science courses. Important to the study, these
metrics do not include completing prerequisite
coursework in engineering/technology such as Pro-
ject Lead the Way during high school; therefore,
most of the post-secondary participants did not
have experiences in engineering/technology courses
while in high school—a defining difference in the
two samples included in this study. Thus, the
research assumed that the comparison between the
two groups would facilitate the identification of
possible effects related to prior experiences in engi-
neering/technology coursework on engineering
design cognition and performance. Participant
background data were collected through a demo-
graphics survey to provide a description of the
subjects being studied and investigate comparabil-
ity across groups. This information was used to
determine participant similarities and differences
in engineering experience.
Fig. 2. Sample secondary student project.
Fig. 3. Sample post-secondary student project.
The secondary group consisted of two female and
six male participants with a cumulative high school
grade point average at or above 3.6. Each partici-
pant completed the Introduction to Engineering
Design, Principles of Engineering, and Digital Elec-
tronics Project Lead the Way high school pre-
engineering courses and was enrolled in the cap-
stone Engineering Design and Development course at
the time of the study. Additionally, each participant
completed at least Algebra I, Algebra II, Geometry,
and Trigonometry/Pre-Calculus courses as well as
at least one biology and physics course. Lastly, each
of the participants was active in the SkillsUSA
technical workforce competition program and
each was interested in a future career in engineering.
The post-secondary engineering group was com-
prised of four female and eight male participants
with an average age of 20 years who were enrolled in
the first calculus-based engineering design course.
All participants were either enrolled in or had
completed college Calculus I at the time of the
study. Additionally, six of these students reported
having some experience with calculus or pre-calcu-
lus while in high school and almost all participants
reported completion of high school coursework in
physics (11), chemistry (12), and biology (10). How-
ever, only one of the post-secondary participants
had experience in engineering/technology (Project
Lead the Way) coursework while in high school.
The researchers do note that the number of
participants is a limitation. However, design cogni-
tion research typically involves a small number of
participants due to the qualitative nature of the
collected data. Therefore, this study is in alignment
with other recent design cognition studies at the
secondary level, which on average only involve 22
participants.
6. Findings
6.1 Research Objective 1. Identify the cognitive
processes experienced secondary engineering
students use to design, make, and evaluate
functional prototypes to an engineering design
problem
The first research objective was met by coding
audio/video recordings of participants thinking
aloud during a complete engineering design session.
The codes used in the data analysis were a set of 17
mental processes used in technological problem
solving, identified and validated by Halfin [44] and
revalidated by Wicklein and Rojewski [46]. On
average, the secondary participants completed the
challenge in one hour, 50 minutes, and 35.8 seconds.
Throughout the entire engineering design activity,
the top three most employed mental processes by
secondary engineering students were Model/Proto-
type Constructing, Analyzing, and Managing.
Model/Prototype Constructing consumed 23.3 per-
cent of the participant’s time on average, which
mostly consisted of physically manipulating materi-
als. Next, Analyzing consumed 15.8 percent of the
participant’s time on average and consisted mostly
of information gathering and analyzing the effec-
tiveness of various design decisions.
Table 3. Design Phase Descriptions
Design Process Phases Description
Design Phase This phase occurred at the beginning of the problem-solving process and generally consisted of the practices
of:
Problem Scoping
Information Gathering
Ideation
Solution Concept Development
Concept Selection
Making Phase This phase occurred during the middle portion of problem solving process and generally consisted of the
practices of:
Prototype Production
Material Gathering
Material Experimentation
Evaluation Phase This phase occurred during the final portion of problem solving process and generally consisted of the
practices of:
Prototype Testing
Data Collection/Analysis
Concept/Prototype Refinement
Additional Information Gathering
Lastly, Managing consumed 13.9 percent of participant time and
consisted mostly of the participants planning their
actions and gathering necessary resources. The least
used mental processes were Experimenting (0.7%),
Computing (0.8%), Questioning/Hypothesizing
(1.9%), Defining Problems (2.0%), Interpreting
Data (2.1%), Predicting (2.3%), and Measuring
(2.6%). Each participant's cognitive process data
for the entire engineering design session are reported
in Table 5. The average time for each mental process
employed by the secondary-level participants can be
found in Table 6.
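The percentages reported above can be recomputed directly from the mean times in Table 6. The brief check below is a minimal sketch; the times are transcribed from Table 6 and converted to seconds before dividing by the total design time.

```python
# Mean times from Table 6 (secondary participants), converted to seconds.
total = 1 * 3600 + 50 * 60 + 35.8          # 1:50:35.8 -> 6635.8 s
means = {
    "Model/Prototype Constructing": 25 * 60 + 49.8,   # 25:49.8
    "Analyzing": 17 * 60 + 27.5,                       # 17:27.5
    "Managing": 15 * 60 + 24.8,                        # 15:24.8
    "Computing": 52.5,                                 # 00:52.5
}
for process, seconds in means.items():
    print(f"{process}: {100 * seconds / total:.1f}%")
# Yields roughly 23.4%, 15.8%, 13.9%, and 0.8%; the small deviation from the
# 23.3% quoted in the text reflects rounding of the tabled mean times.
```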
Table 4. Halfin’s 17 Mental Processes for Solving Technological Problems [44–46]
Cognitive Process Definition & Sample Utterance
Analyzing This is the process of identifying, isolating, taking apart, breaking down, or performing similar actions for the
purpose of setting forth or clarifying the basic components of a phenomenon, problem, opportunity, object, system,
or point of view. ‘‘I believe I have a design flaw which is this right here.’’
Communicating This is the process of conveying information (or ideas) from one source (sender) to another (receiver) through a
media using various modes (The modes may be oral or written or pictures or symbols, or any combination of these.).
‘‘Let’s write down the original sample number.’’
Computing This is the process of selecting and applying mathematical symbols, operations, and processes to describe, estimate,
calculate, quantify, relate, and/or evaluate in the real or abstract numerical sense. ‘‘At 14 inch intervals, I will need 2 of
them.’’
Creating This is the process of combining the basic components or ideas of phenomena, objects, events, systems, or points of
view in a unique manner that will better satisfy a need, either for the individual or for the outside world. ‘‘I should
combine both ideas.’’
Defining problem(s) This is the process of stating or defining a problem, which will then enhance the investigation leading to an optimal
solution. It is transforming one state of affairs to another desired state. ‘‘What does the device need to do?’’
Designing This is the process of conceiving, creating, inventing, contriving, sketching, or planning by which some practical ends
may be affected, or proposing a goal to meet the societal needs, desires, problems, or opportunities and do things
better. Design is a cyclic or iterative process of continuous refinement or improvement. ‘‘Let’s just create a sketch
here.’’
Experimenting This is the process of determining the effects of something previously untried in order to test the validity of a
hypothesis, to demonstrate a known (or unknown) truth, or to try out various factors relating to a particular
phenomenon, problem, opportunity, element, object, event, system, or point of view. ‘‘Let us see what works better for
the base then the foam I have.’’
Interpreting data This is the process of clarifying, evaluating, explaining, and translating to provide (or communicate) the meaning of
particular data. ‘‘I can deduct that this way of sampling is not working.’’
Managing The process of planning, organizing, directing, coordinating, and controlling the inputs and outputs of the system. ‘‘I
will move all of our stuff back to the table.’’
Measuring This is the process of describing characteristics (by the use of numbers) of a phenomenon, problem, opportunity,
element, object, event, system, or point of view in terms that are transferable. Measurements are made by direct or
indirect means, are on relative or absolute scales, and are continuous or discontinuous. ‘‘I know it needs to be at least
this big.’’
Modeling* This is the process of producing or reducing an act or condition to a generalized construct that may then be presented
graphically in the form of a sketch, diagram, or equation; physically in the form of a scale model or prototype; or in
the form of a written generalization.
Model/Prototype
Constructing
This is the process of forming, making, building, fabricating, creating, or combining parts to produce a scale model
or prototype. ‘‘I need a pair of scissors to cut a hole in the bottom.’’
Observing This is the process of interacting with the environment through one or more of the senses (seeing, hearing, touching,
smelling, or tasting). The senses are utilized to determine the characteristics of a phenomenon, problem, opportunity,
element, object, event, system, or point of view. The observer’s experiences, values, and associations may influence
the results. ‘‘Visually, I can tell I’m not doing any better.’’
Predicting This is the process of prophesying or foretelling something in advance, anticipating the future based on special
knowledge. ‘‘That’s not going to work.’’
Questioning/
Hypothesizing
Questioning is the process of asking, interrogating, challenging, or seeking answers related to a phenomenon,
problem, opportunity, element, object, event, system, or point of view. ‘‘What materials will work best?’’
Testing This is the process of determining the workability of a model, component, system, product, or point of view in a real or
simulated environment to obtain information for clarifying or modifying design specifications. ‘‘Okay let’s do this!’’
Visualizing This is the process of perceiving a phenomenon, problem, opportunity, element, object, event, or system in the form
of a mental image based on the experience of the perceiver. It includes an exercise of all the senses in establishing a
valid mental analogy for the phenomena involved in a problem or opportunity. ‘‘If I poke holes in the cup here, the
water will run into there.’’
*Modeling was not used as a code in this study as a review of literature indicated difficulty in differentiating this process from others, such as
Model/Prototype Constructing and Designing.
6.2 Research Objective 2. Identify the cognitive
processes traditional post-secondary engineering
students use to design, make, and evaluate functional
prototypes to an engineering design problem
As before, this objective was also met by coding
audio/video recordings of participants thinking
aloud during a complete engineering design session.
On average, the post-secondary participants com-
pleted the challenge within one hour, 21 minutes,
and 16 seconds. Throughout the entire engineering
design activity, the three most employed mental
processes were Model/Prototype Constructing,
Managing, and Testing. Model/Prototype Con-
structing consumed 28.9 percent of the participant’s
time on average, which consisted mostly of physi-
cally manipulating materials. Next, Managing con-
sumed 23.8 percent of their time on average and
consisted mostly of the participants planning
their actions and gathering necessary resources.
Lastly, Testing consumed 10.5 percent of their
time on average and consisted mostly of operating
their devices with the purpose of collecting data to
determine how well it performed. The least used
mental processes were Computing (0.3%), Measur-
ing (1.2%), Predicting (1.4%), Interpreting Data
(1.4%), Visualizing (2.2%), and Questioning/Hypo-
thesizing (2.2%). Each participant's cognitive
process data for the entire engineering design ses-
sion are reported in Table 7. The post-secondary partici-
pant group average of each process can be seen in
Table 8.
6.3 Research Objective 3. Compare the design
cognition and performance of experienced
secondary and traditional post-secondary
engineering students
To achieve this objective, the researchers performed
a comparison between secondary and post-second-
ary participants’ design cognition using the Mann-
Whitney statistical test. This approach was inten-
tionally pursued, as the Mann-Whitney test is less
sensitive to the concern of the cognitive process data displaying evidence of non-normality.
Table 5. Secondary Cognitive Processes During Engineering Design Activity
Time (Hours:Minutes:Seconds)
Code
Participant
1
Participant
2
Participant
3
Participant
4
Participant
5
Participant
6
Participant
7
Participant
8
Analyzing 0:09:41 0:23:54 0:23:10 0:18:33 0:21:23 0:20:04 0:13:22 0:09:34
Communicating 0:07:44 0:15:51 0:07:31 0:04:19 0:07:07 0:03:30 0:03:39 0:02:23
Computing 0:01:28 0:03:13 0:00:28 0:00:20 0:00:28 0:00:25 0:00:29 0:00:10
Creating 0:01:01 0:01:59 0:04:01 0:04:20 0:03:01 0:00:53 0:04:29 0:03:36
Designing 0:06:52 0:10:20 0:08:23 0:10:35 0:07:58 0:08:18 0:09:23 0:02:17
Defining Problems 0:03:38 0:01:11 0:01:44 0:01:30 0:01:38 0:04:37 0:01:29 0:01:52
Experimenting 0:00:40 0:01:25 0:01:11 0:00:06 0:01:00 0:01:26 0:00:11 0:00:34
Interpreting Data 0:04:56 0:05:23 0:01:14 0:01:36 0:02:56 0:01:35 0:00:20 0:00:53
Managing 0:13:06 0:21:49 0:14:24 0:17:09 0:12:58 0:14:44 0:12:00 0:17:08
Measuring 0:02:24 0:04:16 0:01:08 0:02:32 0:02:56 0:00:48 0:07:34 0:01:12
Modeling 0:00:00 0:00:56 0:00:00 0:00:23 0:00:00 0:00:00 0:00:00 0:00:00
Model/Prototype
Constructing
0:20:34 0:06:48 0:39:19 0:35:16 0:15:01 0:24:20 0:38:33 0:26:47
Observing 0:05:07 0:09:09 0:02:08 0:05:22 0:04:58 0:04:46 0:02:57 0:04:00
Predicting 0:02:23 0:05:21 0:02:46 0:02:09 0:02:19 0:01:36 0:01:49 0:01:38
Questioning/Hypothesizing 0:01:55 0:02:07 0:02:07 0:01:46 0:01:59 0:01:59 0:02:30 0:02:24
Testing 0:13:31 0:25:07 0:13:08 0:10:08 0:16:14 0:07:37 0:04:55 0:07:58
Visualizing 0:03:11 0:01:20 0:01:36 0:06:44 0:01:45 0:03:59 0:06:34 0:03:11
TOTAL 1:38:09 2:20:11 2:04:20 2:02:48 1:43:41 1:40:39 1:50:13 1:25:39
Table 6. Mean Cognitive Process Times for all Secondary Students (N = 8)
Code
x̄ Time
(Hours:Minutes:Seconds) Code
x̄ Time
(Hours:Minutes:Seconds)
Analyzing 17:27.5 Measuring 02:51.1
Communicating 06:30.5 Modeling 00:09.8
Computing 00:52.5 Model/Prototype Constructing 25:49.8
Creating 02:54.9 Observing 04:48.5
Designing 08:00.8 Predicting 02:30.2
Defining Problems 02:12.6 Questioning/Hypothesizing 02:05.9
Experimenting 00:49.3 Testing 12:20.0
Interpreting Data 02:21.8 Visualizing 03:32.5
Managing 15:24.8 Total Design Time 1:50:35.8
Note. x̄ represents the sample mean for all secondary students.
Table 7. Post-Secondary Cognitive Processes During Engineering Design Activity
Time (Hours:Minutes:Seconds)
Code
Participant
1
Participant
2
Participant
3
Participant
4
Participant
5
Participant
6
Participant
7
Participant
8
Participant
9
Participant
10
Participant
11
Participant
12
Analyzing 0:04:02 0:02:02 0:02:19 0:03:54 0:03:14 0:08:22 0:06:55 0:01:50 0:05:41 0:05:44 0:05:57 0:01:58
Communicating 0:02:48 0:00:00 0:01:19 0:01:02 0:00:54 0:02:09 0:04:05 0:01:16 0:03:35 0:01:27 0:03:24 0:06:23
Computing 0:00:00 0:00:00 0:00:21 0:00:03 0:00:00 0:00:08 0:00:15 0:01:33 0:00:08 0:00:29 0:00:13 0:00:00
Creating 0:02:02 0:06:21 0:02:02 0:04:40 0:03:09 0:01:27 0:02:00 0:01:35 0:10:18 0:03:32 0:05:16 0:03:31
Designing 0:02:53 0:00:00 0:01:59 0:01:04 0:00:43 0:03:09 0:00:18 0:01:02 0:03:30 0:03:23 0:02:55 0:05:05
Defining Problems 0:03:24 0:01:17 0:01:17 0:00:53 0:00:24 0:00:58 0:04:03 0:03:37 0:04:16 0:01:39 0:00:39 0:03:26
Experimenting 0:02:18 0:05:09 0:02:44 0:00:05 0:03:27 0:08:11 0:02:14 0:03:34 0:01:38 0:01:53 0:09:43 0:01:43
Interpreting Data 0:01:05 0:01:05 0:01:44 0:00:28 0:01:42 0:01:09 0:01:07 0:00:25 0:01:24 0:00:11 0:02:52 0:00:38
Managing 0:20:09 0:16:26 0:23:23 0:16:16 0:15:17 0:17:46 0:23:26 0:20:46 0:18:47 0:19:44 0:21:41 0:18:47
Measuring 0:00:00 0:00:00 0:00:43 0:00:20 0:00:22 0:00:13 0:00:47 0:00:31 0:00:30 0:05:16 0:02:28 0:01:01
Modeling 0:00:00 0:00:00 0:00:00 0:00:00 0:00:00 0:00:00 0:00:00 0:00:00 0:00:00 0:00:00 0:00:00 0:00:00
Model/Prototype Constructing 0:23:15 0:24:03 0:21:34 0:32:35 0:21:26 0:14:09 0:12:39 0:16:57 0:26:48 0:37:53 0:21:07 0:29:40
Observing 0:05:01 0:04:21 0:07:46 0:01:47 0:06:49 0:03:36 0:03:20 0:06:03 0:02:47 0:02:36 0:05:21 0:02:12
Predicting 0:01:54 0:01:22 0:01:16 0:01:41 0:01:10 0:00:20 0:00:05 0:00:00 0:02:34 0:01:09 0:01:35 0:00:22
Questioning/Hypothesizing 0:00:46 0:02:31 0:00:57 0:01:49 0:03:00 0:02:12 0:01:39 0:00:19 0:02:45 0:01:34 0:03:25 0:00:43
Testing 0:04:24 0:04:45 0:08:43 0:04:25 0:19:19 0:07:42 0:10:41 0:12:36 0:08:14 0:03:54 0:08:00 0:09:50
Visualizing 0:02:44 0:01:36 0:01:01 0:03:55 0:01:35 0:01:18 0:00:21 0:00:32 0:03:35 0:03:00 0:00:53 0:00:51
TOTAL 1:16:45 1:10:58 1:19:08 1:14:56 1:22:33 1:12:48 1:13:56 1:12:36 1:36:29 1:33:23 1:35:30 1:26:11
The Mann-
Whitney test results indicated that secondary parti-
cipants dedicated significantly more time employing
the mental processes of analyzing (U = 0.000, p <
0.001), communicating (U = 10.000, p = 0.005),
computing (U = 16.500, p = 0.023), designing (U =
5.000, p = 0.001), measuring (U = 12.000, p = 0.008),
predicting (U = 8.000, p = 0.003), and visualizing (U
= 15.500, p = 0.019) than the post-secondary
participants. Conversely, the secondary partici-
pants dedicated significantly less time employing
the cognitive processes of experimenting (U = 8.000,
p = 0.003) and managing (U = 15.000, p = 0.017)
than the post-secondary participants. Moreover,
the Mann-Whitney test indicated that secondary
participants devoted significantly more time to
completing the entire engineering design session
(U = 4.000, p = 0.001) and specifically the design
phase of the process (U = 10.000, p = 0.005). These
results are summarized in Table 9.
Table 8. Mean Cognitive Process Times for all Post-Secondary Students (N = 12)
Code
x̄ Time
(Hours:Minutes:Seconds) Code
x̄ Time
(Hours:Minutes:Seconds)
Analyzing 04:19.8 Measuring 01:00.9
Communicating 02:21.8 Modeling 00:00.0
Computing 00:15.9 Model/Prototype Constructing 23:30.4
Creating 03:49.5 Observing 04:18.3
Designing 02:10.0 Predicting 01:07.4
Defining Problems 02:09.4 Questioning/Hypothesizing 01:48.3
Experimenting 03:33.2 Testing 08:32.8
Interpreting Data 01:09.2 Visualizing 01:46.7
Managing 19:22.5 Total Design Time 1:21:16.0
Note. x̄ represents the sample mean for all post-secondary students.
Table 9. Mann-Whitney Analysis between Secondary and Post-Secondary Participants
Mean Time Dedicated to Each Cognitive
Process (sec.)
Secondary
(N= 8)
Post-Secondary
(N= 11)
Mann-Whitney
U
Asymp. Sig.
(2-tailed)
Analyzing 1047.63 261.45 0.000** 0.000
Communicating 390.50 139.45 10.000** 0.005
Computing 52.63 17.27 16.500* 0.023
Creating 175.00 239.18 34.000 0.409
Designing 480.75 126.18 5.000** 0.001
Defining Problems 132.38 122.64 32.000 0.322
Experimenting 49.13 220.09 8.000** 0.003
Interpreting Data 141.63 69.55 27.000 0.160
Managing 924.75 1158.09 15.000* 0.017
Measuring 171.25 66.45 12.000** 0.008
Modeling 9.88 0.00 33.000 0.088
Model/Prototype Constructing 1549.75 1411.91 37.000 0.563
Observing 288.38 254.36 37.000 0.563
Predicting 150.13 63.09 8.000** 0.003
Questioning/Hypothesizing 125.88 114.00 39.000 0.679
Testing 739.75 535.36 29.000 0.215
Visualizing 212.50 101.55 15.500* 0.019
Total Design Time 6642.39 4900.73 4.000** 0.001
Design Phase Time 1776.40 915.30 10.000** 0.005
Making Phase Time 2385.18 1637.55 24.000 0.099
Evaluation Phase Time 2480.86 2347.71 40.000 0.741
Mean
Secondary
(N= 8)
Post-Secondary
(N= 11)
Mann-Whitney
U
Asymp. Sig.
(2-tailed)
Number of Prototype Trials 3.13 3.64 34.500 0.418
Rubric Score 37.75 29.18 16.000* 0.021
Final Turbidity 20.60 47.17 25.000 0.117
Note: Post-secondary student 1 was excluded in this test, as they did not produce a testable prototype. It is important to note that Prototype
Trials refers to the number of times students tested their solutions and that Final Turbidity refers to the lowest turbidity level achieved.
* Significant at the 0.05 level (2-tailed). ** Significant at the 0.01 level (2-tailed).
In terms of design performance, the secondary participants achieved
significantly higher rubric scores (U = 16.000, p =
0.021) than the post-secondary participants. Addi-
tional analysis revealed that secondary participant
prototypes had better results (e.g., lower turbidity
levels) than post-secondary participants, although
the difference between the two groups was not
statistically significant (see Table 9).
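For readers wishing to reproduce this style of comparison, the sketch below runs a two-sided Mann-Whitney U test on the per-participant Analyzing times, converted to seconds from Tables 5 and 7 (with post-secondary participant 1 excluded, following the note to Table 9). It is a minimal illustration assuming SciPy; note that SciPy reports U for the first sample, whereas Table 9 appears to report the smaller of the two possible U values.

```python
from scipy.stats import mannwhitneyu

# Seconds spent Analyzing, converted from Table 5 (secondary, N = 8) and
# Table 7 (post-secondary, N = 11 after excluding participant 1).
secondary = [581, 1434, 1390, 1113, 1283, 1204, 802, 574]
post_secondary = [122, 139, 234, 194, 502, 415, 110, 341, 344, 357, 118]

u_stat, p_value = mannwhitneyu(secondary, post_secondary, alternative="two-sided")

# Convert to the smaller of the two possible U values for comparison with Table 9.
u_min = min(u_stat, len(secondary) * len(post_secondary) - u_stat)
print(f"U = {u_min:.3f}, p = {p_value:.5f}")
```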
In this study’s sample, secondary participants all
had educational experiences in engineering design
through the Project Lead the Way high school
engineering program. However, only one post-sec-
ondary participant (PS4) reported secondary
experience in engineering through the Project Lead
the Way curriculum. Therefore, to explore signifi-
cant differences in design cognition between second-
ary participants having previous educational
experiences in engineering design and post-second-
ary having no experience, another Mann-Whitney
test was conducted. To do so, the researchers first
prepared the data by removing the post-secondary
participant with secondary experiences in engineer-
ing design (PS4) and the post-secondary participant
who failed to produce a testable prototype (PS1).
Following data conditioning the Mann-Whitney
test was conducted and the results indicated that
the secondary participants having engineering
design experiences in high school were significantly
different than the post-secondary participants with
no previous educational experiences in engineering/
technology. Specifically, the secondary participants
who completed high school engineering/technology
coursework dedicated significantly more time to
analyzing (U = 0.000, p < 0.001), communicating
(U = 10.000, p = 0.008), computing (U = 16.500, p =
0.036), designing (U = 5.000, p = 0.002), measuring
(U = 12.000, p = 0.013), predicting (U = 6.000, p =
0.003), and visualizing (U = 10.500, p = 0.009) than
the post-secondary participants having no previous
educational experience in engineering design.
Table 10. Mann-Whitney Analysis between Secondary Engineering Participants and Post-Secondary Participants without prior
Engineering Coursework During High School
Mean Time Dedicated to Each Cognitive
Process (sec.)
Secondary
Engineering
Participants
(N= 8)
Post-secondary
Participants Without
Secondary
Engineering
Experience (N= 10)
Mann-Whitney
U
Asymp. Sig.
(2-tailed)
Analyzing 1047.63 264.20 0.000** 0.000
Communicating 390.50 147.20 10.000** 0.008
Computing 52.63 18.70 16.500* 0.036
Creating 175.00 235.10 34.000 0.594
Designing 480.75 132.40 5.000** 0.002
Defining Problems 132.38 129.60 32.000 0.477
Experimenting 49.13 241.60 0.000** 0.000
Interpreting Data 141.63 73.70 26.000 0.214
Managing 924.75 1176.30 12.000* 0.013
Measuring 171.25 71.10 12.000* 0.013
Modeling 9.88 0.00 30.000 0.104
Model/Prototype Constructing 1549.75 1357.60 32.000 0.477
Observing 288.38 269.10 37.000 0.790
Predicting 150.13 59.30 6.000** 0.003
Questioning/ Hypothesizing 125.88 114.50 38.000 0.859
Testing 739.75 562.40 29.000 0.328
Visualizing 212.50 88.20 10.500** 0.009
Total Design Time 6642.39 4941.20 4.000** 0.001
Design Phase Time 1776.40 942.75 10.000** 0.008
Making Phase Time 2385.18 1582.77 21.000 0.091
Evaluation Phase Time 2480.86 2415.47 37.000 0.790
Mean
Secondary
(N= 8)
Post-Secondary
(N= 10)
Mann-Whitney
U
Asymp. Sig.
(2-tailed)
Number of Prototype Trials 3.13 3.40 34.500 0.612
Rubric Score 37.75 27.80 9.000** 0.006
Final Turbidity 20.60 51.88 18.000 0.051
Note: Post-secondary students 1 and 4 were excluded in this test. It is important to note that Prototype Trials refers to the number of times
students tested their solutions and that Final Turbidity refers to the lowest turbidity level achieved.
* Significant at the 0.05 level (2-tailed). ** Significant at the 0.01 level (2-tailed).
Further, the data revealed that the secondary parti-
cipants having engineering design experiences in
high school devoted less time to experimenting (U
= 0.000, p < 0.001) and managing (U = 12.000, p =
0.013) than the post-secondary participants. Also,
the results demonstrated that secondary partici-
pants with high school engineering experiences
dedicated significantly more time to the design
phase of the process (U = 10.000, p = 0.008). In
terms of student performance, the secondary parti-
cipants achieved significantly higher rubric scores
(U = 9.000, p = 0.006) than the post-secondary
students without engineering experiences during
high school. For final turbidity, the secondary
participants yielded better test results than post-
secondary participants, but the difference was not significant.
Table 10 presents the Mann-Whitney analysis
results.
For Research Objective 3, this study sought to
test three working hypotheses in regards to the
comparison of design cognition between secondary
and post-secondary participants. First, the
researchers hypothesized that post-secondary par-
ticipants would devote more time to Defining Pro-
blems, Designing, Analyzing, and Predicting and less
time to Model/Prototype Constructing. How-
ever, the Mann-Whitney tests determined that sec-
ondary participants devoted significantly more time
to designing, analyzing, and predicting than post-
secondary participants. Additionally, there was no
significant difference in the amount of time dedi-
cated to defining problems and model/prototype
constructing between the two groups.
The second hypothesis was that post-secondary
participants would employ more scientific and
mathematical cognitive processes, such as Comput-
ing, Interpreting Data, Observing, Experimenting,
and Questioning/Hypothesizing than secondary par-
ticipants. The analysis results confirmed that post-
secondary participants did devote significantly
more time to experimenting; however, they spent
significantly less time on computing than secondary
participants. Other cognitive processes demon-
strated no significant differences between the two
participant groups.
Lastly, the researchers hypothesized that post-
secondary participants would develop more effec-
tive solutions to the design challenge. The results
showed that post-secondary students scored signifi-
cantly lower on the rubric than their secondary
counterparts. However, the prototype test results
(turbidity achieved) were not significantly different
between the two groups. Taken together these
findings may highlight differences in engineering
design cognition between first-year traditional engi-
neering majors and high school students with multi-
ple years of experience in engineering/technology.
This may also indicate that traditional cognitive
metrics for admittance into engineering programs
do not align with actions of designing and making.
Lastly, these findings may suggest the importance of
expanding engineering education at the secondary
level. Table 11 presents the statistical analysis
results related to the working hypotheses.
6.4 Research Objective 4. Determine potential identifiers within engineering design cognition related to student aptitude in successfully designing and making solutions
To achieve Research Objective 4, the researchers
examined the performance variables (i.e., rubric
score and final turbidity) and the cognitive processes
that the participants employed during the design
session. Based on the results of the study, partici-
pants’ rubric scores were significantly correlated to
the amount of time they employed the mental
processes of analyzing (r = 0.635, p = 0.003), communicating (r = 0.528, p = 0.020), designing (r = 0.619, p = 0.005), interpreting data (r = 0.477, p = 0.039), and predicting (r = 0.749, p < 0.001).
Table 11. Statistical Results on Working Hypotheses

Working Hypothesis Category                         Cognitive Process / Measure         Hypothesized   Result
(a) Designing and Making Solutions                  Defining Problems (DP)              S < PS         S = PS
                                                    Designing (DE)                      S < PS         S > PS **
                                                    Analyzing (AN)                      S < PS         S > PS **
                                                    Predicting (PR)                     S < PS         S > PS **
                                                    Model/Prototype Constructing (MP)   S > PS         S = PS
(b) Scientific and Mathematical Cognitive Process   Computing (CP)                      S < PS         S > PS *
                                                    Interpreting Data (ID)              S < PS         S = PS
                                                    Observing (OB)                      S < PS         S = PS
                                                    Experimenting (EX)                  S < PS         S < PS **
                                                    Questioning/Hypothesizing (QH)      S < PS         S = PS
(c) Solution Effectiveness                          Rubric Score                        S < PS         S > PS *
                                                    Prototype Test Result               S < PS         S = PS
Note: The working hypotheses were generated to determine which group of students would devote greater cognitive effort (time) to a specific cognitive process. For example, S > PS signifies that, for that specific cognitive process, secondary students devoted more time to the process than the post-secondary students (S: Secondary Students / PS: Post-Secondary Students). The Hypothesized column lists the expected relationship and the Result column lists the observed relationship.
* Significant at the 0.05 level (2-tailed). ** Significant at the 0.01 level (2-tailed).
Also, the rubric scores were significantly related to the total time participants dedicated to the design phase (r = 0.550, p = 0.015). Therefore, the results indicate that higher rubric scores are significantly correlated with more time employing the mental processes of analyzing, communicating, designing, interpreting data, and predicting, as well as more time dedicated to the entire design session and, specifically, to the design phase of the process.
Additionally, the final turbidity results of each participant's prototype were significantly and negatively correlated with the time spent in the cognitive process of questioning/hypothesizing (r = –0.547, p = 0.015). Because lower turbidity indicates a more effective prototype, these results suggest that more time dedicated to questioning/hypothesizing may be a predictor of better prototype performance. Table 12 illustrates the correlational analysis results between mental processes and student performance.
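As one illustration, the type of correlational analysis summarized in Table 12 could be reproduced along the following lines. This is a sketch under assumed column names (e.g., "rubric_score", "final_turbidity"), not the authors' code.

```python
# Illustrative sketch only; assumes a DataFrame with time-on-process columns (seconds)
# plus "rubric_score" and "final_turbidity" columns per participant (names hypothetical).
import pandas as pd
from scipy.stats import pearsonr

def correlate_with_outcome(times: pd.DataFrame, process_cols: list, outcome: str) -> pd.DataFrame:
    """Pearson r and two-tailed p between each process time and a performance outcome."""
    rows = []
    for col in process_cols:
        r, p = pearsonr(times[col], times[outcome])
        rows.append({"process": col, "outcome": outcome, "r": r, "p": p})
    return pd.DataFrame(rows)

# Example usage (hypothetical names); a negative r with final turbidity indicates
# better prototype performance, since lower turbidity is better:
# print(correlate_with_outcome(times_df, process_cols, "rubric_score"))
# print(correlate_with_outcome(times_df, process_cols, "final_turbidity"))
```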
Furthermore, to identify significant cognitive
predictors of performance success in terms of
design process (participant rubric scores) and pro-
duct (turbidity score attained), multiple linear
regression analyses of the design cognition data
were attempted between the cognitive processes,
the rubric score, and the turbidity levels. Prior to pursuing the multiple linear regression analyses, regression diagnostics were conducted to test the statistical assumptions of linearity, homoscedasticity, normality of residuals, and mean independence, and to check for non-linear relationships. While these assumptions proved to be justifiable, an issue with multicollinearity was uncovered, as the diagnostics revealed that several of the predictors (cognitive processes) were highly correlated with one another. To account for
this issue, the number of collinear predictors was reduced by combining highly correlated cognitive processes into aggregate processes. Based on the results of these statistical diagnostics, the cognitive processes of Computing and Interpreting Data were combined to form Quantitative Reasoning (QR); Experimenting, Testing, Questioning/Hypothesizing, and Observing were combined to form Scientific Inquiring (SI); and Designing, Creating, and Modeling were combined to form Designing/Ideating (DI). While the collinearity of the cognitive processes presented an issue for conducting a multiple linear regression analysis, the diagnostic procedures provided statistical evidence for refining the design cognition coding scheme established through the work of Halfin in 1973 [44].
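One common way to run the kind of collinearity diagnostic described above is to compute variance inflation factors (VIFs) for the candidate predictors. The sketch below shows this approach; it is illustrative only, the specific diagnostic statistics used in the study are not claimed here, and the column names are assumptions.

```python
# Illustrative sketch only; VIF is one common multicollinearity diagnostic, and the
# predictor column names (seconds per cognitive process) are hypothetical.
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

def vif_table(predictors: pd.DataFrame) -> pd.Series:
    """Variance inflation factor per predictor; values well above ~5-10 flag collinearity."""
    X = sm.add_constant(predictors)
    vifs = {col: variance_inflation_factor(X.values, i)
            for i, col in enumerate(X.columns) if col != "const"}
    return pd.Series(vifs, name="VIF").sort_values(ascending=False)

# Example usage (hypothetical predictor columns):
# print(vif_table(times_df[["Computing", "Interpreting Data", "Experimenting", "Testing"]]))
```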
Following the creation of the new cognitive process codes, the data were again examined. While the issues of multicollinearity were mitigated by the aggregation of the cognitive processes, it was determined that the sample size in this study was too small to produce a significant equation predicting student design success from cognitive processing. Despite the issues preventing the regression, the creation of aggregated codes may prove beneficial to future research.
Table 12. Correlation between Participant Cognitive Processes and Engineering Design Performance

                                                Rubric Score                       Final Turbidity
                                     M          Pearson r    Sig. (2-tailed) p     Pearson r    Sig. (2-tailed) p
Analyzing (s)                        592.47     0.635**      0.003                 –0.146       0.551
Communicating (s)                    245.16     0.528*       0.020                 –0.374       0.114
Computing (s)                        32.16      0.300        0.212                 –0.295       0.22
Creating (s)                         212.16     0.052        0.831                 –0.246       0.31
Designing (s)                        275.47     0.619**      0.005                 –0.201       0.409
Defining Problems (s)                126.74     –0.187       0.442                 –0.302       0.209
Experimenting (s)                    148.11     –0.318       0.184                 0.221        0.363
Interpreting Data (s)                99.89      0.477*       0.039                 0.138        0.573
Managing (s)                         1059.84    –0.286       0.235                 –0.317       0.186
Measuring (s)                        110.58     0.319        0.183                 0.251        0.300
Modeling (s)                         4.16       0.358        0.132                 –0.002       0.992
Model/Prototype Constructing (s)     1469.95    0.189        0.439                 –0.312       0.193
Observing (s)                        268.68     0.008        0.975                 –0.061       0.805
Predicting (s)                       99.74      0.749**      0.000                 0.038        0.877
Questioning/Hypothesizing (s)        119.00     0.185        0.450                 –0.547*      0.015
Testing (s)                          621.42     0.097        0.694                 –0.106       0.666
Visualizing (s)                      148.26     0.325        0.175                 –0.003       0.992
Total Design Time (s)                5634.06    0.561*       0.013                 –0.209       0.390
Design Phase Time (s)                1277.87    0.550*       0.015                 –0.239       0.325
Making Phase Time (s)                1952.34    0.197        0.420                 –0.223       0.358
Evaluation Phase Time (s)            2403.77    –0.032       0.897                 –0.178       0.466
Number of Prototype Trials***        3.42***    0.220        0.365                 –0.450       0.053
Note: Post-secondary student 1 was excluded from this test. It is important to note that Prototype Trials refers to the number of times students tested their solutions and that Final Turbidity refers to the lowest turbidity level achieved.
* Significant at the 0.05 level (2-tailed). ** Significant at the 0.01 level (2-tailed).
7. Discussion
It is important to note that this study was limited to
a purposive sample of 20 participants; therefore, the
results of the study are not generalizable to all
engineering programs. However, design cognition research typically involves a small number of participants due to the qualitative nature of the collected data; thus, this study is in alignment with recent design cognition studies at the secondary level. Although limitations exist, stakeholders within the engineering education community should consider the findings from this research in future curriculum efforts, and researchers should consider applying this research methodology with larger sample sizes. While the
findings in this paper help to highlight elements of
design cognition with respect to design perfor-
mance, further investigations are necessary. For
example, one might design a study using this meth-
odology that includes a larger sample of high school
students with no engineering experiences, high
school students with engineering experiences,
post-secondary students with no engineering experi-
ence, and post-secondary students with high school
engineering experiences. Identifying the cognitive processes employed, or the lack thereof, across these groups may then serve as a better indicator of potential voids in curricula, instruction, and student learning. In
addition, design cognition research results may be
used to reveal latent disconnects between secondary
and post-secondary engineering education pro-
grams. Furthermore, studies such as this can high-
light potential cognitive indicators for enhancing a
student’s engineering design performance. For
example, Strimel's [49] qualitative analysis suggested that better-performing students devoted more time to communicating, managing, testing, observing, interpreting data, and experimenting, while the results of this study imply that better performance is significantly correlated with more cognitive effort in predicting, analyzing, designing, communicating, interpreting data, and questioning/hypothesizing. Therefore, design heuristics around these areas may be important to integrate within secondary engineering curricula.
In addition, a multiple regression analysis of the
data was attempted to identify influential predictors
of success in terms of design performance and
prototype effectiveness. While the sample size in this study proved to be too small for generating a significant equation using cognitive process time to predict design success, we recommend that future efforts follow this approach with larger samples. It is also important to note that the
effort to conduct a multiple regression analysis
uncovered cognitive processes that were highly
correlated with one another. This discovery pro-
vides support for revising the Halfin [44] coding
scheme by combining highly correlated cognitive
processes into aggregate processes. Therefore, it is recommended that for future research the cognitive processes of Computing and Interpreting Data be combined to form Quantitative Reasoning (QR); Experimenting, Testing, Questioning/Hypothesizing, and Observing be combined to form Scientific Inquiring (SI); and Designing, Creating, and Modeling be combined to form Designing/Ideating (DI).
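For future studies adopting this recommendation, the aggregation can be expressed as a simple mapping from the Halfin-based codes to the proposed composite codes, as in the sketch below; the groupings follow the text, while the data structure and column names are assumptions.

```python
# Illustrative sketch only; sums time (seconds) across the component codes for each
# proposed composite code. DataFrame column names are hypothetical.
import pandas as pd

AGGREGATE_CODES = {
    "Quantitative Reasoning (QR)": ["Computing", "Interpreting Data"],
    "Scientific Inquiring (SI)": ["Experimenting", "Testing",
                                  "Questioning/Hypothesizing", "Observing"],
    "Designing/Ideating (DI)": ["Designing", "Creating", "Modeling"],
}

def aggregate_process_times(times: pd.DataFrame) -> pd.DataFrame:
    """Return one column of total seconds per composite code."""
    return pd.DataFrame({composite: times[components].sum(axis=1)
                         for composite, components in AGGREGATE_CODES.items()})

# Example usage (hypothetical): aggregated = aggregate_process_times(times_df)
```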
The results of this study also highlight the poten-
tial that P-12 engineering/technology experiences
hold for cultivating a student’s cognitive and phy-
sical abilities for solving problems using engineering
design practices. Other studies have attempted to
demonstrate this potential as well [14, 38, 40, 20].
For example, Mentzer et al. [40] found evidence that the more experience in engineering design students have, the more cognitive effort they devote to idea generation, feasibility analysis, and decision-making. Similarly, Grubbs [14] identified that secondary students with engineering design experiences expended considerably more cognitive effort when proposing solutions to an engineering problem than those without these experiences. Moreover, Grubbs's [14] findings suggest that students immersed in a secondary engineering/technology curriculum may have the opportunity to experience or develop background knowledge of viable solutions and thus a greater ability to generate a series of solution ideas than non-engineering high school students.
However, Kannengiesser et al. [38] and Wells et al.
[20] found no statistically significant differences in
design thinking between students with secondary
engineering experience and those without.
The results of this study support the idea that secondary engineering/technology education can influence a student's engineering design cognition by demonstrating significant differences in analyzing, designing, predicting, communicating, measuring, visualizing, and computing between students with and without previous engineering design experiences. However, the coding schemes used in design cognition research are based on a variety of conceptual foundations and may therefore elicit different study results; these contradictory findings may be a result of the coding schemes employed.
The coding scheme used in this study was founded
on the actions of practicing designers and engineers, while the Kannengiesser et al. [38] and Wells et al.
[20] studies employed a scheme founded in cognitive
science. Therefore, the coding scheme employed in
this study emphasized performance rather than
cognitive processing of information. Consequently,
the results of this study may suggest an influence of
secondary engineering experiences on design per-
formance but not necessarily on the cognitive pro-
cessing of information in design decision-making.
As mentioned, the findings suggest the impor-
tance of educational experiences in engineering
design before entering college engineering pro-
grams. In Strimel's [49] study of secondary-level engineering students, he suggested that secondary
students are heavily focused on making their solu-
tions and devote a minimal amount of time to
thoroughly planning and making predictions
about their designs before prototyping their pro-
posed solution. Strimel noted that most of the
participants did not experiment with materials to
determine what would be the best choice for their
solution; instead, they relied on repair materials
such as tape and hot glue. He further explained
that these findings might indicate that authentic
engineering practices of predictive analysis, model-
ing, and optimization are not accurately practiced
throughout P-12 engineering/technology curricula
and instruction. However, upon analysis of the
post-secondary level design cognition data in this
study, the researchers identified that the secondary
engineering/technology students, as well as the one
post-secondary participant with prior high school
experience, were more successful in creating effec-
tive solutions to the proposed design challenge. The
results showed students with experience in engineer-
ing/technology devoted significantly more time to
the Design Phase of the problem-solving process
and dedicated significantly more time to employing
the mental processes of Analyzing, Designing, Communicating, Computing, and Predicting than the
post-secondary engineering students.
Moreover, secondary engineering experiences
seemed to have an influence on student practices
when developing a design and producing a physical
prototype. The researchers observed that the post-secondary participants with no prior engineering coursework experienced what may be described as a ‘‘failure to launch,’’ meaning the students found it
difficult to even start developing a solution to the
problem. These students had extensive experience in
science and mathematics but no identified experi-
ence with designing or making. Therefore, when
tasked to solve an ill-structured problem without a
sequence of steps, they struggled to determine what
they needed to do to complete the challenge.
Furthermore, these students were observed experi-
menting solely with the resources or solutions avail-
able and often avoiding the design of a novel device
to solve the problem. On the other hand, the
secondary-level participants, with high school
engineering/technology experiences, sketched ideas
and used gathered information to create an
informed design concept prior to doing any type
of making or experimentation. Also, the secondary-
level participants devoted significantly more time to
the mental processes of Visualizing and Measuring.
The secondary students were observed dedicating
more time to mentally conceiving how components
of their device would be assembled and making
measurements before manipulating materials. Con-
versely, the post-secondary participants without
prior engineering experience were seen struggling
with the assembly of their prototypes and did not
use tools and materials properly (e.g., these partici-
pants were observed failing to put a drill bit in the
chuck of a hand-held power drill and performing
inappropriate tasks such as hammering a screw into
a piece of wood). While these experiences in design-
ing and making may not be aligned with the work
performed by a professional engineer, a lack of these
proficiencies and an understanding of these prac-
tices may limit the abilities of future engineers in
making informed design decisions. Therefore, early engineering experiences seem crucial to afford students the opportunity to better practice designing and making, as well as to make more informed design decisions based on the properties of materials and the ability to manipulate them. However, the opportunities for students to participate in engineering coursework at the P-12 level are limited across the United States, as it is not often a requirement for students. Secondary student experiences with informed engineering design and physically making prototypes are left to chance [50], as indicated by the National Assessment of Educational Progress for the United States, which showed that more than half of the nation's eighth graders were not proficient in engineering and technology literacy.
8. Conclusions
As the teaching of engineering design continues to
increase at the P-12 level, it becomes essential to
understand the ways in which students mentally
process engineering design tasks to provide effective
teaching, establish suitable scaffolding of engineer-
ing design experiences, and integrate interventions
that enhance student design abilities. This study
investigated the design cognition and performance
results of secondary and post-secondary engineer-
ing students while engaged in engineering design
problems. Relationships between prototype perfor-
mance and design cognition were highlighted to
investigate potential links between cognitive pro-
cesses and success on engineering design problems.
Concurrent think-aloud protocols were collected
from eight secondary and 12 post-secondary engi-
neering students working individually to design,
make, and evaluate a solution prototype to an
engineering design challenge. The resulting protocols
were then coded using a pre-established coding
scheme and analyzed to compare the two partici-
pant groups as well as determine the relationship
between students’ design cognition, experience
level, and design performance. Significant differ-
ences between participants with secondary engi-
neering experiences and those without were found
in regards to the amount of time various cognitive
processes were employed to complete a design task.
For the given design scenario, students with second-
ary engineering experiences achieved significantly
higher rubric scores than those without. Improved
design performance was also found to be signifi-
cantly correlated with more time employing the
mental processes of analyzing, communicating,
designing, interpreting data, predicting, and questioning/hypothesizing. These results may highlight
important links between educational experiences in
engineering design, prior to college, and student
success on engineering design problems. Thus, this
study may indicate necessary shifts in student pre-
paration in and for engineering while in primary and
secondary schools.
References
1. T. R. Kelley, Cognitive Processes of Students Participating in
Engineering-Focused Design Instruction, Journal of Tech-
nology Education,19(2), 2008, pp. 50–64.
2. T. R. Kelley, B. M. Capobianco and K. J. Kaluf, Concurrent
Think-Aloud Protocols to Assess Elementary Design Stu-
dents, International Journal of Technology and Design Educa-
tion,25(4), 2015, pp. 521–540.
3. National Research Council, Engineering in K-12 Education:
Understanding the Status and Improving the Prospects,
National Academies Press, Washington, D.C., 2009.
4. NGSS Lead States, Next Generation Science Standards: For
states, by States, National Academies Press, Washington,
D.C., 2013.
5. A. A. Wilson, E. R. Smith and D. L. Householder, High
School Students’ Cognitive Activity while Solving Authentic
Problems through Engineering Design Processes, American
Society of Engineering Education Annual Conference Pro-
ceeding, Atlanta, G.A., 2013.
6. National Academy of Engineering and National Research
Council, STEM Integration in K-12 Education: Status, Pro-
spects, and an Agenda for Research, National Academies
Press, Washington, DC, 2014.
7. J. Douglas, E. Iversen and C. Kalyandurg, Engineering in the
K-12 Classroom: An Analysis of Current Practices and Guide-
lines for the Future. American Society of Engineering Educa-
tion, Washington, D.C., 2004.
8. R. C. Wicklein, Five Good Reasons for Engineering Design as
the Focus for Technology Education, The Technology Tea-
cher,65(7), 2006, pp. 25–29.
9. M. H. Hosni, Letter to Next Generation Science Standards
Writing Committee, https://www.asme.org/getmedia/
25be935b-0b66-466e-acf1-e212b534a386/ PS1301_ASME_
Board_on_Education_Comments_on_the_Second_Public_
Draft_of_the_Next_ Generation_Science_Standards.aspx,
Accessed 29 January 2013.
10. M. Ryan, J. Gale and M. Usselman, Integrating Engineering
into Core Science Instruction: Translating NGSS Principles
into Practice through Iterative Curriculum Design, Inter-
national Journal of Engineering Education,33(1), 2017, pp.
321–331.
11. M. R. O. Beliler, Sustainability Interest and Knowledge of
Future Engineers: Identifying Trends in Secondary School
Students, International Journal of Engineering Education,
33(1), 2017, pp. 489–503.
12. P. A. Asunda and R. B. Hill, Critical Features of Engineering
Design in Technology Education, Journal of Industrial
Teacher Education,44(1), 2007, pp. 25–48.
13. C. L. Dym, A. M. Agogino, O. Eris, D. D. Frey and L. J.
Leifer, Engineering Design Thinking, Teaching, and Learn-
ing, Journal of Engineering Education,94(1), 2005, pp. 103–
120.
14. M. E. Grubbs, Further Characterization of High School Pre-
and Non-Engineering Students’ Cognitive Activity during
Engineering Design, Doctoral Dissertation, Virginia Tech,
2016.
15. T. R. Kelley, D. C. Brenner and J. T. Pieper, Two
Approaches to Engineering Design: Observations in STEM
Education, Journal of sTEm Teacher Education,47(2), 2010,
pp. 5–40.
16. M. D. Lammi, Characterizing High School Students’ Systems
Thinking in Engineering Design through the Function-Beha-
vior-Structure (FBS) Framework, Doctoral Dissertation,
Utah State University, 2011.
17. M. D. Lammi and K. Becker, Engineering Design Thinking,
Journal of Technology Education,24(2), 2013, pp. 55–77.
18. M. D. Lammi and J. S. Gero, Comparing Design Cognition
of Undergraduate Engineering Students and High School
Pre-Engineering Students, Frontiers in Education Confer-
ence, Rapid City, S.D., 2011.
19. N. Mentzer, Team Based Engineering Design Thinking,
Journal of Technology Education,25(2), 2014, pp. 52–72.
20. J. Wells, M. Lammi, M. Grubbs, J. Gero, M. Paretti and C.
Williams, Characterizing Design Cognition of High School
Students: Initial Analyses Comparing those with and with-
out Pre-Engineering Experiences, Journal of Technology
Education,27(2), 2016, pp. 78–91.
21. M. Welch and H. S. Lim, The Strategic Thinking of Novice
Designers: Discontinuity between Theory and Practice,
Journal of Technology Studies,26(2), 2011, pp. 34–44.
22. M. E. Grubbs, Bridging Design Cognition Research and
Theory with Teaching and Learning, Pupil’s Attitudes
Toward Technology International Conference, Christ
Church, New Zealand, 2013.
23. C. J. Atman and K. M. Bursic, Verbal Protocol Analysis as a
Method to Document Engineering Student Design Pro-
cesses, Journal of Engineering Education,87(2), 1998, pp.
121–132.
24. E. J. Mastascusa, W. J. Snyder and B. S. Hoyt, Effective
Instruction for STEM Disciplines: From Learning Theory to
College Teaching, Jossey-Bass, San Francisco, C.A., 2011.
25. Engineering Accreditation Commission, Criteria for Accred-
iting Engineering Programs, Accreditation Board for Engi-
neering and Technology, Baltimore, M.D., 2016.
26. C. J. Atman, J. R. Chimka, K. M. Bursic and H. L.
Nachtmann, A Comparison of Freshman and Senior Engi-
neering Design Processes, Design Studies,20(2), 1999, pp.
131–152.
27. J. S. Gero, Design Prototypes: A Knowledge Representation
Schema for Design, AI magazine,11(4), 1990, pp. 26–36.
28. H. A. Simon, The Sciences of the Artificial, MIT Press,
Cambridge, M.A., 1969.
29. J. Eekels, Values, Objectivity and Subjectivity in Science and
Engineering, Journal of Engineering Design,6(3), 1995, pp.
173–189.
30. R. L. Carr, L. D. Bennett and J. Strobel, Engineering in the
K- 12 STEM Standards of the 50 U.S. States: An Analysis of
Presence and Extent, Journal of Engineering Education,
101(3), 2012, pp. 539–564.
31. T. A. Harris and H. R. Jacobs, On Effective Methods to
Teach Mechanical Design, Journal of Engineering Education,
84(4), 1995, pp. 343–349.
32. C. Merrill, R. L. Custer, J. Daugherty, M. Westrick and Y.
Zeng, Delivering Core Engineering Concepts to Secondary
Level Students, Journal of Technology Education,20(1),
2009, pp. 48–64.
33. P. Roberts, The Place of Design in Technology Education, in
D. Layton (Ed.) Innovations in Science and Technology
Education, vol. 5, UNESCO, Paris, France, 1994, pp. 171–
179.
34. J. A. Marshall and L. K. Berland, Developing a Vision of
Pre-College Engineering Education, Journal of Pre-College
Engineering Education Research,2(2), 2012, pp. 36–50.
35. N. Cross, Expertise in Design: An Overview. Design Studies,
25(5), 2004, pp. 427–441.
36. R. S. Adams, J. Turn and C. Y. Atman, Educating Effective
Engineering Designers: The Role of Reflective Practice,
Design Studies,24(3), 2003, pp. 275–294.
37. A. T. Purcell, J. S. Gero, H. Edwards and T. McNeill, The
Data in Design Protocols: The Issue of Data Coding, Data
Analysis in the Development of Models of the Design
Process, in N. Cross, H. Christiaans and K. Dorst (Eds),
Analysing Design Activity, John Wiley, Chichester, U.K.,
1996, pp. 225–252.
38. U. Kannengiesser, J. Gero, J. Wells and M. Lammi, Do High
School Students Benefit from Pre-Engineering Design Edu-
cation? International Conference on Engineering Design
(ICED),11, Human Behaviour in Design, Milan, Italy,
July 27–30, 2015. pp. 267–276.
39. T. R. Kelley, D. C. Brenner and J. T. Pieper, Two
Approaches to Engineering Design: Observations in STEM
Education. Journal of sTEm Teacher Education,47(2), 2010,
pp. 5–40.
40. N. Mentzer, K. Becker and M. Sutton, Engineering Design
Thinking: High School Students’ Performance and Knowl-
edge. Journal of Engineering Education,104(4), 2015, pp.
417–432.
41. H. E. Middleton, The Role of Visual Imagery in Solving
Complex Problems in Design, Doctoral Dissertation, Griffith
University, 1998.
42. M. Welch, The Strategies Used by Ten Grade 7 Students,
Working in Single-Sex Dyads, to Solve a Technological
Problem. Doctoral Dissertation, McGill University, 1996.
43. K. A. Ericsson and H. A. Simon, Protocol Analysis: Verbal
Reports as Data, MIT Press, Cambridge, M.A., 1993.
44. H. H. Halfin, Technology: A Process Approach, Doctoral
Dissertation, West Virginia University, 1973.
45. R. B. Hill and R. C. Wicklein, A Factor Analysis of Primary
Mental Processes for Technological Problem-Solving, Jour-
nal of Industrial Teacher Education,36(2), 1999, pp. 83–100.
46. R. C. Wicklein and J. W. Rojewski, Toward a ‘‘Unified
Curriculum Framework’’ for Technology Education, Jour-
nal of Industrial Teacher Education,36(4), 1999, pp. 38–56.
47. R. B. Hill, The Design of an Instrument to Assess Problem-
Solving Activities in Technology Education, Journal of
Technology Education,9(1), 1997, pp. 31–46.
48. Project Lead the Way, www.pltw.org, Accessed 06 April
2018.
49. G. J. Strimel, Engineering Design: A Cognitive Process
Approach, Doctoral Dissertation, Old Dominion Univer-
sity, 2014.
50. Change the Equation, Left to Chance: U.S. Middle Schoolers
Lack in-Depth Experience with Technology and Engineer-
ing, in Vital Signs: Reports on the Condition of STEM
Learning in the U.S., 2016.
Greg J. Strimel is an assistant professor in the department of Technology Leadership & Innovation at Purdue University in
West Lafayette, IN. Strimel also serves as a co-director for the Advancing Excellence in P-12 Engineering Education project.
Eunhye Kim is a graduate student and research assistant in the School of Engineering Education at Purdue University;
West Lafayette, IN.
Scott R. Bartholomew is an assistant professor of Engineering/Technology Teacher Education at Purdue University; West
Lafayette, IN. Bartholomew taught middle school Engineering & Technology Education classes for three years prior to
accepting his position at Purdue University.
Diana Cantu is an adjunct faculty member in the STEM Education and Professional Studies department at Old Dominion
University; Norfolk, VA.