Making Peanut Butter and Jelly Sandwiches: Do Students From
Different Disciplines Approach This Exercise Differently?
Cheryl L. Coyle, Bell Laboratories, 4 Hazelwood Ct., Howell, NJ, 07731, USA,
coyle@alcatel-lucent.com, 732-901-6815
Heather Vaughn, Bell Laboratories, 318 Daniele Dr., Ocean, NJ, 07712, USA,
vaughn@alcatel-lucent.com, 732-493-0419
In this practice-oriented paper for human factors education, we describe our experience
piloting a variation of a classroom activity reported elsewhere. We conducted the
activity with two different groups of students: psychology majors and software
engineering majors. Focusing students on the simplest of algorithms is a fruitful way
to introduce them to the biases and variations that occur in practical field studies.
Students enjoyed the activity, and we drew lessons for our industrial research context.
INTRODUCTION
The use of peanut butter and jelly (PB&J)
sandwiches as a teaching tool is not new
(Lewandowski & Morehead, 1998). Recently, it
was reported that PB&J sandwich-making can be
part of successful Human-Computer Interaction
(HCI) teaching activities (Hourcade, Garcia &
Perry, 2007; Davis & Rebelsky, 2007).
The activity introduces students to the complexity
of use-case writing for something as simple as
“sandwiches.” From our telecommunications
background, we saw the value of this research.
Imagine those students 10 years later as HCI
professionals. Imagine that these individuals will
need to design communication services that end
users can personalize using time-dependent,
conditionally branching instructions. Given that
general consumers today may be frustrated
programming clocks on devices (with few or no
dependencies yet), or have sparse knowledge about
how telecom networks interoperate because they
expect the phone, service, or device to simply
“work for me,” effective use-case writing remains
paramount. HCI professionals need
to define how individuals specify preferences. Use-
case writing is an effective tool for exposing
variations and specifics. Trends then come to light.
Given that HCI students in the prior studies we
referenced are “mostly computer science majors,”
we wondered about other disciplines. Can this
exercise benefit other students who enter careers in
HCI? What follows is our experience.
PRACTICE INNOVATION
Human factors and ergonomics attracts psychology
majors, but it lures in software engineers, too. How
would software engineering (SE) students react to a
PB&J sandwich-making activity we had modified
to require them to define the use case? What about
psychology (Psych.) majors? Would there be
differences between the two types of students?
The literature in psychology and engineering
education has suggested that different disciplines
attract students with different inclinations toward
instruction (Felder & Brent, 2005; Ramsden, 1997;
Whitmire, 2002; Wu, Custer &
Dyrenfurth, 1996). “Learning tasks in [hard]
science are typically described as hierarchical,
logical, heterogeneous, and rule-based and
procedure governed. … Arts and social sciences
tasks are seen to require interpretation, comparison,
generalization, and to be more self-governed…”
(Ramsden, 1997, pp. 208-209). We were curious
whether these differences would be reflected in
differences in performing the PB&J activity.
The PB&J activity provides a compelling
demonstration of the difference between procedural
and abstract knowledge. For Americans, making a
PB&J sandwich is one of the most basic algorithms
in our cultural lexicon, and while the task seems
obvious, observing the differences is intriguing.
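As a concrete illustration of that procedural framing, the task can be written out as an explicit, ordered algorithm. The step list below is a hypothetical sketch of our own, not an actual student submission; the terminal “DONE, EAT, ENJOY” entry echoes the END-statement habit we discuss later.

```python
# A hypothetical, deliberately explicit PB&J "algorithm," written in a
# discrete-step style. This is an illustration, NOT a student's actual answer.
def make_pbj_sandwich():
    steps = [
        "Open the bread bag and remove two slices of bread",
        "Place the slices side by side on a plate",
        "Open the peanut butter jar",
        "Scoop peanut butter with a knife",
        "Spread peanut butter evenly on one slice",
        "Open the jelly jar",
        "Scoop jelly with a clean knife",
        "Spread jelly evenly on the other slice",
        "Press the two slices together, spreads facing inward",
        "DONE, EAT, ENJOY",  # explicit END statement
    ]
    return steps
```

Even this short sketch exposes the kinds of choices (which slice first, one knife or two, how to terminate) that make observation of the “obvious” task interesting.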
In addition to our curiosity about potential
differences based on students’ fields of study, we
wondered whether it was possible to quantify what
students learn from observing the PB&J task.
Could we document students’ learning by giving
them a before-observation and after-observation
task, and comparing the two? In the field of human
factors, we often
catalog user task steps, and then revise them after
observing a population performing those tasks
(alongside engineers revising requirements and
code as people use the design). The task steps and
design frequently change after we’ve observed
users. Would task steps change for both student
types?
METHODS
We conducted a PB&J sandwich-making and
observation activity with students in two different
courses. The first was a classroom of upper-level
Psych. students taking a course on “Qualitative
Methods.” The second was a class of upper-level
SE students in a “Software Practicum” course.
Table 1 presents their majors, gender, and age
ranges.
Students          Gender*          Age Range
Psych., n = 11    F = 10, M = 1    19-23 (Mdn = 21)
SE, n = 6         F = 2, M = 4     20-22 (Mdn = 21)
Table 1. Demographics.
* F = female, M = male.
We ran the exercise on two different days in
separate classrooms, but we followed the same
procedure for both. We supplied jars of peanut
butter (two different brands), jars of jelly (two
different brands) and loaves of white bread (two
different brands). We also provided plastic utensils,
paper plates, napkins and water bottles.
We wanted to collect “before” and “after” data.
Before beginning the activity of making and
observing others make PB&J sandwiches, we asked
the students to “Please write in order, and in as
much detail as possible, all the steps involved in
making a peanut butter and jelly sandwich.” Our
goal was to compare these descriptions with
descriptions written after observation. We
distributed worksheets with demographic questions,
a place to write the steps, and the question: “Have
you ever made a peanut butter and jelly sandwich?”
After about five minutes of writing task steps,
students handed in their worksheets (so they could
not refer to them during the observation activity),
and separated into small groups. The Psych. class
had two groups with four participants each and one
with three; SE’s had two groups of three
participants. We provided each small group with
the PB&J sandwich-making materials and
instructed them to take turns making sandwiches.
The non-sandwich makers were told to carefully
observe the sandwich maker and to take notes. We
encouraged the sandwich makers to think aloud and
explain what they were doing. We included this
because “think-aloud” protocols are common in
HCI user studies.
Every student made a sandwich and observed at
least two other students. After the observation
activity, we asked students to fill out a second
worksheet. They were asked to write all the activity
steps again, this time referring to the notes taken
during observation. The worksheet also ended with
this question: “How did it feel to be observed?”
When the worksheets were complete, we collected
them and we held a brief class discussion about
students’ experiences with the activity.
FINDINGS
In this pilot experience, we provide numbers only
to guide other researchers wishing to replicate this
activity with larger populations. Our sample was
small, it was asymmetric, and as noted in our
Methods section, there are potential confounds. The
question we were exploring is whether instructors
can gather information about thinking styles and
behaviors from different types of students, while
also providing a training activity that prepares SE
and Psych. students who may enter the professional
context of HCI.
Surprisingly, 3 out of the 11 Psych. students had
never made a PB&J sandwich before. This was
unusual, given that all the students were raised in
the United States. All the SE students had made
PB&J sandwiches before, and all were U.S.-raised.
Task Steps (before and after)
We counted the number of task steps identified by
each student before and after they observed others
making PB&J sandwiches. There were no
perceivable differences in the number of task steps
individual students provided before versus after
observing. About a third of the
students across groups identified more steps, a third
identified the same number, and a third identified
fewer steps after observing than before. Refer to
Table 2 for declared majors and task-step ranges.
Students      Range Before        Range Afterward
Psychology    4-7 (Mdn = 6)       4-8 (Mdn = 5)
SE            10-20 (Mdn = 12)    6-21 (Mdn = 11)
Table 2. Task step ranges before and after observations.
Every SE student identified more task steps than
every Psych. student, a trend toward SE students
detailing more instructions for building a PB&J.
Given the high correlation between step counts
written before and after the PB&J activity
(Spearman r = .67, p = .030, n = 17), we collapsed
them into a single variable (the mean of the before
and after counts). A one-way ANOVA on this
collapsed variable using major as a factor yielded
F = 28.96, df = 1, p < .001. In other words, SE
students were more detailed overall. We revisit this
trend in the Discussion, given our small,
asymmetric sample with potential confounds.
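For researchers replicating this analysis, the collapsed-variable computation can be sketched in pure Python. The per-student step counts below are hypothetical stand-ins chosen to resemble the ranges in Table 2; they are not our raw data, and the helper functions are our own minimal implementations (statistics only, no significance tests):

```python
# Minimal re-analysis sketch. All step counts are HYPOTHETICAL stand-ins.

def rank(values):
    """Average ranks (1-based); tied values share the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of tied positions, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson r computed on the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

def one_way_f(groups):
    """One-way ANOVA F statistic across a list of groups."""
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_b, df_w = len(groups) - 1, n - len(groups)
    return (ss_between / df_b) / (ss_within / df_w)

# Hypothetical (before, after) step counts per student -- NOT the study data.
psych = [(5, 4), (6, 5), (7, 8), (4, 4), (6, 6), (5, 5),
         (6, 5), (7, 6), (4, 4), (6, 5), (5, 5)]
se = [(10, 8), (12, 11), (20, 21), (11, 10), (14, 12), (12, 11)]

before = [b for b, _ in psych + se]
after = [a for _, a in psych + se]
rho = spearman(before, after)                       # before/after consistency
collapsed = [(b + a) / 2 for b, a in psych + se]    # mean of before and after
f_stat = one_way_f([collapsed[:len(psych)], collapsed[len(psych):]])
```

With real data, `rho` justifies collapsing the two counts, and `f_stat` tests the major effect; p-values would come from the appropriate t and F distributions.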
Taking Notes
Psychology students commented on the difficulty
of taking notes while observing. They wanted to
observe, but not take notes simultaneously. Many
reported using a shorthand. One student suggested
videotaping the activity would be better than
observing and note-taking simultaneously. One
stated that it is “much easier to miss something
while taking notes.” In fact, another observer didn’t
take notes. Others remarked that they “knew ahead
of time” what the sandwich maker was going to do
next, so they didn’t have to record it carefully.
SE students did not describe note-taking trouble,
but we did not collect students’ notes for either
class, so we cannot report here on the quantity or
quality of notes for each class. Others seeking to
expand on our activity might wish to collect and
analyze notes.
Feelings on Being Observed by Others
In their written responses, most of the Psych.
students indicated that it felt “weird” or “strange” to
be observed. Eight out of 11 responses included a
description of an uncomfortable feeling. Only three
out of the six SE students’ written responses
indicated they felt uncomfortable being observed;
one wrote that it was “fine” and one even replied
that “it felt good.” In class discussion, there
appeared to be a bigger distinction between the two
groups: there was general agreement among the
Psych. students that being observed by others made
them feel uncomfortable. However, in the SE class,
none said they felt weird or strange being observed.
Some mentioned they were more deliberate in their
actions since they knew they were being observed,
but nobody volunteered that they experienced
discomfort, even when we prompted them!
Student Reactions
SE students seemed to have more fun with this
activity than the Psych. students. SE students
enjoyed the opportunity to think aloud and
demonstrate how they make PB&J. The Psych.
students giggled a little, but seemed embarrassed.
Figure 1. Three different sandwiches.
During observation, SE students frequently teased
others about “doing it wrong.” Discussion revealed
that many of the individual SE students believed
that their way was ‘correct’ when there were
alternate ways of doing something. The Psych.
students, on the other hand, were more inclined to
watch the variations they observed without
commentary. During the post-activity discussion,
one Psych. student was surprised that “something so
simple had so many different ways to do it” (see
Figure 1).
DISCUSSION
This was a good activity for teaching observation
skills to both the Psych. and the SE majors. It is
important that all students experience being
observed. This part of the activity alone is worth it.
We did not perceive differences in individual
students’ styles for documenting task steps before
versus after observing others. This was surprising
to us because the literature on students writing
algorithms for PB&J states that watching a
professor perform the task steps (incorrectly) helps
improve the algorithm (Lewandowski & Morehead,
1998). However, our sample was small and our
task was different. We did not ask students to
follow the steps that peers or professors wrote. Nor
did we provide an explanation for the purpose of
documenting task steps. We suggest running a
study in which a specific use case is provided. We
continue to believe that students can learn a great
deal from observing others perform a familiar task,
but we need a better method for quantifying that.
For example, a response-time measure such as time
to complete writing the procedure steps might be
more sensitive in showing differences due to
learning that occurs while observing others.
We saw interesting trends between the two
populations. SE majors were more precise and
thorough in their task steps. As a group, they
identified more task steps for the same activity than
the Psych. students. They were more animated
during the observation activity and the discussion
that followed. The SE students had a propensity for
making END statements in their tasks steps, such as
“DONE, EAT, ENJOY” (perhaps from their coding
training?) Far fewer of the Psych. students included
such steps.
Our observation is that the groups were sized right.
It is best to have at least three people in a group, so
students can observe more than one person making
a sandwich. A group of two is too small. Four
students to a group is the maximum, though. Larger
groups take too much time and boredom can set in.
There are a few ways we would enhance the
experimental design for these different majors who
may enter HCI. First, we might use a validated,
reliable inventory to factor in learning styles, such
as the Kolb Learning Style Inventory or the Felder
& Soloman Index of Learning Styles, which have
been heavily tested in engineering contexts; or the
Myers-Briggs Type Indicator, if there is a strong
theoretical construct about what will be done with
the data (Felder & Soloman, 2008; Kolb, 2008;
Myers Briggs & Briggs, 2008). We believe that for
our pilot, placing students into “SE” versus
“Psych.” categories was sufficient, but for larger
sample sizes these tools can be used to manage
numbers and differences.
The second way we would refine our experimental
design is to apply inter-rater analyses to the written
instructions given by the students before and after.
We had an impression that the SE students made
discrete statements, much as they would for writing
computer code or requirements. We had another
impression that the Psych. students wrote out
instructions in a prose-like fashion, as if they were
observing phenomena or writing how to review
notes. Perhaps one can speed analysis with newer
aids (e.g., Ethnograph, Atlas.ti, etc.) to refine
findings about distinct use-case writing styles.
Finally, we would enhance our experimental design
by bringing the students from the different classes
together to run this use-case writing activity. It is
often a reality that HCI professionals from different
training backgrounds need to work together, and the
future probably requires more of this multi-
disciplinary interaction. This activity in a mixed
group has potential in the academic context to
enhance each discipline’s curriculum, but more
importantly, it prepares students from each
discipline to work efficiently with “others” as they
enter HCI professions.
This study may seem unrelated to industrial design
at first blush. Nevertheless, there are a few
connections worth mentioning.
• Writing use cases for the next generation of
personalized, technology-enabled services is
complicated. Whether the domain is
telecommunications, medicine, transportation, or
another, use cases require a partnership of abstract
and procedural thinking and documentation. This
calls for both types of students, whether inclined
toward hierarchies and logic or toward observation
and generalities, to define what users will need in
order to instruct services to behave as expected.
End-user personalization needs robust engineering.
• What if you had a team of people from
different trainings who needed to build a design?
You could use the PB&J activity in an Agile
environment, especially early in the
conception/design phase, to expose thinking and
documentation differences. You could also overlay
this experimental activity upon your particular
domain problem, especially if the problem is a
variation on something that came before. Using an
industrial paradigm: imagine that HCI people,
graphic designers, system testers, and managers
must all come together to design the “next”
smartphone. Do you think they all have the same
use case in mind?
• What if you built a device/application for a
small user set, and you suddenly found that an
unexpected population uses it? Perhaps you can
use a domain-specific variation of PB&J to capture
new expectations and thinking. Fast redesigns to
accommodate the new set of users could follow.
ACKNOWLEDGMENTS
Thank you to Dr. Janice Stapley and Dr. Daniela
Rosca for sharing your classrooms. We appreciate
the support of Dr. Allen Milewski in finding us a
software engineering classroom. We thank the
students of Monmouth University. And thanks to
Juan Pablo Hourcade for inspiring us at CHI 2007.
REFERENCES
Davis, J. & Rebelsky, S. A. (2007). Food-first
computer science: Starting the first course right
with PB&J. Proceedings of SIGCSE ’07: The
38th SIGCSE Technical Symposium on
Computer Science Education (pp. 372-376).
New York: ACM Press.
Felder, R.M. & Brent, R. (2005). Understanding
student differences. Journal of Engineering
Education, 94(1), 57-72.
Felder, R.M. & Soloman, B. (retrieved 2008). Index
of Learning Styles Questionnaire.
http://www.engr.ncsu.edu/learningstyles/ilsweb.
html.
Hourcade, J. P., Garcia, O. I., & Perry, K. B.
(2007). Learning observation skills by making
peanut butter and jelly sandwiches. CHI ’07
Extended Abstracts on Human Factors in
Computing Systems (pp. 1753-1758). New
York: ACM Press.
Kolb, D. A. (retrieved 2008). Kolb Learning Style
Inventory, Version 3.1.
http://www.haygroup.com/tl/Questionnaires_W
orkbooks/Kolb_Learning_Style_Inventory.aspx
#Ordering.
Lewandowski, G. & Morehead, A. (1998).
Computer science through the eyes of dead
monkeys: Learning styles and interaction in CS
1. In D. Joyce & J. Impagliazzo (Eds.), The 29th
SIGCSE Technical Symposium on Computer
Science Education. (pp. 312-316). New York:
ACM Press.
Myers Briggs, M. & Briggs, K. C. (retrieved 2008).
http://www.myersbriggs.org/my%2Dmbti%2Dp
ersonality%2Dtype/take%2Dthe%2Dmbti%2Di
nstrument/.
Ramsden, P. (1997). The context of learning in
academic departments. In F. Marton, D.
Hounsell & N.J. Entwistle (Eds.), The
Experience of Learning: Implications for
Teaching and Studying in Higher Education.
(pp. 198-216). Edinburgh: Scottish Academic
Press.
Whitmire, E. (2002). Disciplinary differences and
undergraduates’ information-seeking behavior.
Journal of American Society for Information
Science & Technology, 53(8), 631-638.
Wu, T. F., Custer, R. L. & Dyrenfurth, M. J. (1996).
Technological and personal problem-solving
styles: Is there a difference? Journal of
Technology Education, 7(2), 55-71.