Electronic Journal of Science Education Vol. 20, No. 1 (2016)
© 2016 Electronic Journal of Science Education (Southwestern University/Texas Christian
University) Retrieved from http://ejse.southwestern.edu
Developing an Educational Tool to Model Food Chains
Daniella Biffi
Texas Christian University, United States
Beau Hartweg
Texas Christian University, United States
Yohanis de la Fuente
Texas Christian University, United States
Melissa Patterson
Texas Christian University, United States
Morgan Stewart
Texas Christian University, United States
Eric Simanek
Texas Christian University, United States
Molly Weinburgh
Texas Christian University, United States
Abstract
The Framework for K-12 Science Education (NRC, 2012) and Next Generation Science
Standards (NGSS Lead States, 2013) stress that in addition to disciplinary core ideas (content),
students need to engage in the practices of science and develop an understanding of the
crosscutting concepts such as cause and effect, systems, and scientific modeling. In response to
these reform suggestions we developed an educational tool to be used to help teach students
about models and the marine food chain. Our research was the validation of the tool as a
legitimate instructional device. The research reported here outlines the process and provides
science teacher and science teachers educators with an alternative for teaching this topic.
Key words: science education research, scientific models, food chains, educational tools
Please send all correspondence to: Daniella Biffi, TCU Box 297900, Fort Worth, TX 76129,
817-257-6115, d.biffi@tcu.edu
Introduction
The Framework for K-12 Science Education (NRC, 2012) and Next Generation Science
Standards (NGSS Lead States, 2013) stress that in addition to disciplinary core ideas (content),
students need to engage in the practices of science and develop an understanding of the
crosscutting concepts such as cause and effect, systems, and scientific modeling. In order to
develop such practices, students must have repeated experiences that increase in complexity and vary in circumstance. They may engage with the actual phenomenon being studied or with representations and models, which give students sustained opportunities to work with these ideas, develop an appreciation for them, and establish the interconnections among them.
National (NAEP; NCES, 2014) and international (TIMSS; NCES, 2011) reports indicate
that students in the United States are not doing as well on large-scale assessments as students in
comparable countries. In response to the NGSS directive for students to be able to experience
facets of disciplinary core ideas, crosscutting concepts, and scientific practices, the first author
developed a dynamic model of a marine food chain that could be used by upper elementary
students several times without redundancy. Scientific practices of creating and using a model,
interpreting data, constructing explanations, and engaging in argumentation were an integral part of the experience. The model utilized a Jenga© tower in a game-like atmosphere, requiring students to consider possible events that can occur within a food web and to predict the effects such events
would have on the system. A game is “a set of activities, involving one or more players…(with)
goals, constraints, payoffs, and consequences…is rule-guided (and) involves some aspect of
competition, even if that competition is with oneself” (Dempsey, Haynes, Lucassen, & Casey,
2002, p. 159). The use of games (Franklin, Peat, & Lewis, 2003; Gutierrez, 2014; Odenweller,
Hsu, & DiCarlo, 1998) to help students develop scientific concepts is not new, but we found none that had developed a game-like model using the NGSS as a foundation.
Literature Review
Barman and Mayer (1994) determined that most teachers considered ecosystems, food chains, and food webs important topics for students to know and believed that these concepts were somewhat easy for students to understand. However, more recent
research reveals that students’ ideas about these topics are usually filled with misconceptions
(Umphlett, Brosius, Laungani, Rousseau & Leslie-Pelecky, 2009). For instance, students believe
that changes in one trophic level will only affect another if they have a direct predator-prey
relationship. Subtle interactions, which result in the balance of an ecosystem, are often not well
understood by elementary and middle school students (Umphlett, et al, 2009). Ecosystems are
complex systems that involve multiple variables, compound causes and consequences, and
progressing structures which unfold in ways that cannot be seen by observers (Jacobson &
Wilensky, 2006; Manz, 2012). As students move into middle and high school, they tend to think about individuals instead of populations, to focus on animals while ignoring other groups of organisms (e.g., plants, fungi), and not to think about the community as a system (Grotzer & Basca, 2003).
Moreover, textbooks do not always explain the complexity of food chains and food webs
(Barman & Mayer, 1994). In fact, many pictures in student texts have historically represented
food webs with arrows pointing in the wrong direction, showing which organism got eaten as
opposed to the direction of energy/matter flow (Schollum, 1983). Textbooks often assume that
specific associations and generalizations about food relationships are abilities that students
would continually establish on their own (Barman & Mayer, 1994). As a result, adults in the
United States often lack basic knowledge and awareness of ecosystems and how they work
(Tran, Payne & Whitley, 2010).
A possible way for teachers to help students understand these topics is through the use of
games designed to show the intricacy of the systems. Classroom games have been shown to be
an effective method of instruction to help develop content knowledge in physics (Anderson &
Barnett, 2013; Clark et al., 2011), geology (Mayer, 2011), chemistry (Rastegapour & Marashi, 2012), and astronomy (Ruzhitskaya et al., 2013). Studies have found that the use of games in the
classroom can lead to an increase in students’ motivation levels (Baines & Slutsky, 2009; Pinder,
2008). Specifically, upper elementary students (Barab, Sadler, Heiselt, Hickey, & Zuiker, 2007;
Kuo, 2007) and middle school students (Rowe, Shores, Mott, & Lester, 2010) expressed
increased interest in learning science when presented with games. Games also can provide
experiential, contextualized learning and can help develop metacognitive skills (Mayo, 2007).
They can allow the players to be producers rather than consumers (Gee, 2003) as students make
choices with consequences. Games that involve collaborative work “may act as a catalyst for
change in students’ self-efficacy” (Barab & Dede, 2007, p. 3).
Games may also be models or simulations of real, complex systems. This is important
because the Framework for K-12 Science Education and Next Generation Science Standards
(NGSS) stress that in addition to disciplinary core ideas, students need to be engaged in the
practices of science and to develop an understanding of crosscutting concepts. Additionally, the
NGSS (NGSS Lead States, 2013) highlight the importance of using tangible models as a way to
help students understand both the nature of the scientific enterprise and disciplinary core ideas.
Scientific models are powerful tools that can be used by students to visualize scientific
concepts, predict scientific phenomena and look for possible solutions. Even though there are
several kinds of scientific models, most students consider them as objects to be interpreted, but
not used to generate and test ideas (Lehrer & Schauble, 2006). Involving students in scientific
modeling is important for helping them develop and evaluate explanations of the natural world
(Baek, Schwarz, Chen, Hokayem & Zhan, 2011). De Ruiter, Wolters, Moore and Winemiller
(2005) affirmed that, unlike the classic stone-arch metaphor for understanding food webs, Jenga
towers are flexible structures that allow changes in species composition, attributes, and dynamics, displaying characteristics of ecosystems that are important for understanding the complexity of the environment.
Theoretical Framework
We draw on two learning theories in our work: sociocultural constructivism (Luria, 1976; O’Loughlin, 1992; Vygotsky, 1978, 1986) and social languages (Bakhtin, 1981; Gee, 2004a, 2004b, 2008). Both theories underpinned the development of the educational tool and this study. During the 20th
century, several theories of cognition coalesced into a theory of learning which focuses on the
construction of knowledge as situated within culture and language. This theory, sociocultural
constructivism, emphasizes the importance of interacting with phenomena, ideas, and
community in developing cognition. It foregrounds the importance of cultural tools such as
language, signs, and symbols produced within and by a group. Social constructivism then
stresses the interactive nature of learning and the role of language as a tool for expressing what is
known and for constructing new knowledge. In addition, the theory stresses the importance of
others in helping the learner move from novice to expert.
Bakhtin and Gee stress the social nature of scientific language. The current scientific
way of talking and writing has developed over time within the scientific community and is
specific to scientists. Words are given very specialized meaning or new words are invented as
new phenomena are found. Gee (2004b) distinguishes between discourse (lower case ‘d’) that is
generic and used in informal settings and Discourse (upper case ‘D’) that is highly specific and
used by a sub-set of individuals. Scientists have developed a Discourse that is unique to science
and can be further divided into sub-areas such as ecology. To be successful in school, students
must learn the Discourse of science. Scientific Discourse can be acquired through interactions
with language that occur during an apprenticeship. The science classroom may be the site of the
apprenticeship.
Methods
The Peruvian Food Chain Jenga (PFCJ) was initially developed to engage upper elementary students in thinking about the disciplinary core idea of ecosystems (LS2 of NGSS);
the crosscutting concepts of cause/effect (#2) and stability/change (#7); and the scientific
practices of developing and using models (#2), constructing explanations (#6), and engaging in
argument from evidence (#7) (see NRC, 2012, Box S-1, p.3). As with the development of an
inventory or test, a process of determining content accuracy and playability was a necessary prerequisite to research concerning the effectiveness of using the tool for instructional purposes.
However, unlike an inventory or test, classical measures of reliability (Cronbach’s alpha) could
not be performed to determine internal consistency. Therefore, three cycles of development and
one cycle of student testing were used. In this article, we only report the development and
validation of the game.
Development of the PFCJ
The development of the PFCJ began with establishing the ecosystem to be represented and which elements of the food chain to include. We chose the Peruvian marine
ecosystem because the ocean off the coast of Peru is considered one of the most productive
fishing areas in the world¹, it is highly impacted by human activities, and aquatic and marine themes are often absent from K-12 curricula (Tran, Payne & Whitley, 2010). To illustrate
concepts such as keystone species and top predators, we selected the Peruvian anchovy and the
hammerhead shark, two species that are under intense pressure from contemporary practices such as overfishing and “finning”.
The PFCJ utilizes the Tower, Event Cards, Guide Sheet, and Placemat (Figure 1). The
classic Jenga© game set has 54 blocks that are stacked to form a tower. We divided the blocks into seven groups, assigning nine blocks each to the first three trophic levels (zooplankton, phytoplankton, and anchovies), six blocks each to the mackerel, squid, and mahi-mahi levels, and three blocks to the top predator, the hammerhead shark. The remaining six blocks were labeled as ‘wild cards’ to allow for situations in which a player did not have an appropriate block. This block distribution was established to illustrate the difference in abundance among organisms at different trophic levels. The resulting Tower represents the Peruvian marine food chain; however, it must be noted that the block counts are not proportional to the actual trophic levels.
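To make the distribution concrete, a minimal sketch (in Python, purely illustrative and not part of the published tool) expresses the block counts just described as a simple data structure and checks that they account for all 54 blocks:

```python
# Illustrative only: the PFCJ block distribution described above.
# The counts come from the text; this representation is hypothetical.
blocks_per_level = {
    "phytoplankton": 9,      # producer
    "zooplankton": 9,
    "anchovy": 9,            # keystone species
    "mackerel": 6,
    "squid": 6,
    "mahi-mahi": 6,
    "hammerhead shark": 3,   # top predator
    "wild card": 6,          # for players lacking an appropriate block
}

# Sanity check: the seven groups plus wild cards use every block in a standard set.
assert sum(blocks_per_level.values()) == 54
```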
Figure 1. Components of the Peruvian Food Chain Jenga©.
The single Placemat contains additional factual information that provides more in-depth material to be considered when making decisions about how to respond to the Event Cards. One side of the Placemat is dedicated to content information, models and the role of models in science, challenges to the ecosystem, vocabulary, and the Peruvian Sea. The other side provides rules for playing the game. Guide Sheets, one for each player, give clues for the placement of
the organisms from lowest to highest (producer to top predator) and provide a place for the
students to record the trophic levels prior to constructing the tower. They also provide the students
with a record that can be kept in their science notebook.
Event Cards provide situations for consideration by the students. The events or situations
on the Event Cards are based on current issues that affect marine ecosystems and that are found
in general-public media campaigns (e.g. Take Shark Fin Soup Off the Menu! campaign by
Oceana, 2015). Topics for the Event Cards were also drawn from websites of international non-
profit organizations and scientific literature. On the back side of each Event Card is a ‘move’ that tells the player to add or remove blocks from specific trophic levels. These moves parallel the actual consequences of the event presented on the front of the card. As events occur, the Tower’s stability changes and it eventually becomes so fragile that it collapses. The collapsed food chain is a very graphic representation of what can happen in an ecosystem.
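To illustrate this mechanic, the sketch below (Python, illustrative only) applies a hypothetical Event Card ‘move’ to the modeled trophic levels; the example event, the helper function, and the zero-block collapse rule are assumptions made for this sketch rather than rules from the PFCJ materials, since in the game itself the physical instability of the Tower determines when it falls:

```python
# Hypothetical sketch: applying an Event Card "move" to the modeled food chain.
# The example card and the collapse rule below are illustrative assumptions.
tower = {"phytoplankton": 9, "zooplankton": 9, "anchovy": 9,
         "mackerel": 6, "squid": 6, "mahi-mahi": 6, "hammerhead shark": 3}

def apply_event(tower, removals, additions):
    """Remove and add blocks for the trophic levels named on the back of a card."""
    for level, n in removals.items():
        tower[level] = max(tower[level] - n, 0)
    for level, n in additions.items():
        tower[level] += n
    # Levels left with no blocks signal broken links in the chain.
    collapsed = [level for level, count in tower.items() if count == 0]
    return tower, collapsed

# Example (invented) card: intensive anchovy overfishing.
tower, collapsed = apply_event(
    tower,
    removals={"anchovy": 3, "hammerhead shark": 1},
    additions={"zooplankton": 1},
)
print(tower)
print("Collapsed levels:", collapsed)
```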
Data Collection
Data were collected in three phases. The first phase, determination of the content
accuracy, utilized the help of college professors who taught biology, chemistry and science
education. During the second phase, science education graduate students and biology teachers
examined the tool for ease of use. The last phase, testing the tool, engaged 4th and 5th grade
students and their teacher in using the PFCJ and giving feedback.
Education Tool Phase I – Content Accuracy. The research team invited a small group of
college professors from biology, chemistry, and science education to use the PFCJ prototype in
the game format. The Placemat was examined for accuracy. Event Cards were drawn as directed by the instructions on the Placemat, with the professor adding or removing a Jenga© block as deemed appropriate based on the ‘event’ outlined on the card. At the completion of each
player’s turn, the event was discussed for correctness, density of language, confusing wording,
potential reinforcement of misconceptions, and alignment with upper elementary core ideas.
Suggestions were made for improving the accuracy and eliminating misconceptions. Appropriate
changes were made.
Education Tool Phase II – Ease of Use. The next stage of development was to establish the ease of use of the PFCJ for non-scientists and to determine the value of each component of the model. This development phase involved two populations: science education graduate students and high school biology teachers. Both groups signed human subjects consent forms.
Four science education graduate students, all of whom had public school experience and one of whom was certified to teach English Language Learners, were asked to review the tool. Their task was to use the instructional tool in its current form and offer suggestions to improve the ease of use and the format of the PFCJ components. They paid particular attention to the readability of the Event Cards, the appropriateness of the graphics on the Placemat and Guide Sheet, and the
clarity of the instructions. The graduate students built the Tower to model the correct placement
of trophic levels and played two rounds of the game. During the first round, players read the
scenario and discussed the consequences that could result. However, when they looked at the
moves offered on the reverse side of the Event Card, they could only remove a block based on
one of the two consequences outlined on the Event Card. During the second round of the game,
players removed blocks based on both consequences outlined on the Event Card. The merits of
using one or both consequences were discussed by the group and notes were recorded. The
decision was made to use both consequences in order to help model the complexity and fragility
of the food chain.
Twenty-three high school biology teachers were introduced to the PFCJ as a way to
review food chains with general biology students. They worked in groups of four, following the
instructions as if they were students. The Tower was built to show the correct order of the trophic
levels and Event Cards were drawn, read aloud, consequences predicted, and blocks removed or
added as predicted. After all teams had completed one round, the teachers were instructed to
think about and discuss how to improve the experience (notes were taken at each table). They
were asked to critique the instructions, the level of vocabulary, and usefulness as a way to
present/review food chains, cause and effect, systems, and models. Audio recorders were placed
at each table to capture the conversations. Table groups then shared with the whole room in a lively discussion of what was most valuable and least valuable. Notes were taken by two of the
authors and small changes were made.
Education Tool Phase III – Testing the tool. The third phase was conducted at a local
elementary school. The participants in this phase consisted of the students in one 4th grade class
and four 5th grade classes (N = 89). The research team helped monitor four of the classes but the
last class was conducted completely by the classroom teacher. All groups signed consent for
media and data use. In each class, students were divided into groups of three or four to complete
the PFCJ lesson (Figure 2).
Figure 2. Students engaged in PFCJ
Researchers collected both quantitative and qualitative data during this phase.
Quantitative data included student responses to a questionnaire and a time-stage chart. The questionnaire was a 28-question Likert-scale survey based on Gutierrez’s (2014) previous work. The questionnaire was divided into five categories: 1) goals and objectives/clarity of purpose, 2) design of the tool, 3) organization of the activity, 4) rules and playfulness, and 5) usefulness of the lesson. After each section, a place for comments was provided. Qualitative data included
researcher field notes, responses to open-ended questions, a ‘What I did/What I learned’ sheet
filled out by the students, audio recordings of students interacting with the PFCJ, and an informal
discussion with the classroom teacher.
Data Analysis
Notes from Phase I were reviewed for content accuracy. Audio and research notes from
Phase II were loosely analyzed for comments that occurred frequently or that resonated with the
research team about what was useful and what was not. Phase III required calculating mean scores for the responses on the students’ questionnaire (Table 1) to determine student opinion of the activity, along with a chart showing the time span for each phase. Qualitative analysis of the student responses used a modified constant comparative design (Glaser & Strauss, 1967). Each research team member read and coded the open-ended responses for emerging themes. The resulting themes were then compared and collapsed into two categories.
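For readers wishing to reproduce these simple descriptive statistics, a brief sketch of the mean-score calculation is given below; the data file name and column layout are assumptions made for illustration, while the section groupings follow Table 1:

```python
# Hypothetical sketch of the descriptive analysis: mean Likert score per item and
# per questionnaire section. The file name and layout (one row per student, items
# named q1..q28) are assumed; only the section groupings follow Table 1.
import pandas as pd

responses = pd.read_csv("pfcj_questionnaire.csv")  # hypothetical data file

sections = {
    "Clarity of purpose":        [f"q{i}" for i in range(1, 7)],
    "Appropriateness of design": [f"q{i}" for i in range(7, 15)],
    "Organization":              [f"q{i}" for i in range(15, 20)],
    "Rules and playfulness":     [f"q{i}" for i in range(20, 24)],
    "Usefulness of lesson":      [f"q{i}" for i in range(24, 29)],
}

item_means = responses.mean(numeric_only=True)  # per-item means; missing answers ignored
section_means = {name: item_means[items].mean() for name, items in sections.items()}

print(item_means.round(2))
print({name: round(m, 2) for name, m in section_means.items()})
```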
Results and Discussion
The results of each phase are presented below. The different phases take into account
recommendations from content experts, science education graduate students, classroom teachers,
and elementary students. Because each phase was designed to provide input from dissimilar
groups on different aspects of the tool, the outcomes vary. With each phase of the testing, results
were incorporated into the PFCJ.
Phase I content discussion
The college professors agreed that the model was appropriate even though the trophic
levels were not in proportion. They discussed various ways of wording some of the cards but did
not make any substantial content changes. The small changes in wording were made and
approved by the content experts. Thus, content validity was established and the tool moved to the
next level of testing.
Phase II educational use of the tool
The science education students were concerned with the alignment of the PFCJ to the
national standards and with ease of use for students of low reading and/or language ability. They
discussed the value of the Placemat as a source of information and the wording of the instructions for building the initial Tower. They suggested adding pictures of the species to the Guide Sheet (Figure 3) and a sentence starter to the Guide Sheet and Placemat.
In response to the suggestions, the research team inserted the prompt sentence “If ___
then I predict that ___ because ___”, on the Placemat and the Guide Sheet. Using sentence
prompts can help reinforce scientific language and encourage students to give a reason for the prediction/claim that they make. In addition, pictures of the species were added to the Guide
Sheet.
Figure 3. Guide Sheet and Placemat
We held a post-activity discussion with the teachers. Most of the teachers had positive
comments about the tool and stated that it could be used as an introduction to the unit or
reinforcement/review at the end of the unit. High school teachers said they could use the tool as a
refresher for students even though it was designed for upper elementary. Some teachers also indicated that with small additions the tool could be used to explain energy loss in a food chain, natural selection (easy-to-remove blocks can be considered “weak” individuals in the population), and invasive species (adding ‘wild cards’ to represent the introduction of exotic species), and that it could even serve as the starting point for developing their own activities.
Regarding the Placemat and Event Cards, the biology teachers did not suggest any modifications. However, some teachers suggested a vocabulary handout for teachers. Most of the teachers suggested a worksheet where students in groups could write their predictions. Conversely, other teachers felt that part of the value of the tool was the open discussion and argumentation that could occur as each student took his or her turn. We also added new vocabulary to the Teacher’s Handout.
Phase III post-activity student responses to the tool
Quantitative. Overall, students had a positive opinion of the PFCJ, as seen in their scores on the questionnaire. Section 4, rules and playfulness, had the highest average mean (4.67). The question with the highest rating was also in this section (#23: ‘participating in the activity was fun’ = 4.878). This indicates that the students found the tool to be fun and understandable. In contrast, Section 3 (organization) had the lowest average mean (4.23). The lowest-rated question in that section addressed the amount of time needed to finish, which seems to indicate that students wanted more time. Even so, the overall rating for organization was still very satisfactory. The question with the lowest rating overall was in Section 1, #6: ‘the activity helped me remember concepts
and vocabulary’ with 3.84. Almost as low was #27: ‘participating in the activity helped me
establish better relationships with the members of the group’ from Section 5. It should be noted
that students felt the activity helped them review the topic (4.58), was a productive use of their
time (4.63), and made them think about what they were doing (4.57).
Table 1. Student evaluation questionnaire. The scale was from 1 to 5: 1 = strongly disagree; 3 = neutral; 5 = strongly agree. N = number of students who answered each item.

Section 1 - Clarity of purpose
1. The purpose and reason for the activity were fully explained to me (N = 89, M = 4.31)
2. The goals and objectives of the activity were clearly stated (N = 89, M = 4.38)
3. The activity made me think about what I was doing (N = 89, M = 4.57)
4. The activity encouraged me to work with other students (N = 88, M = 4.30)
5. The activity allowed my group to discuss key concepts (N = 89, M = 4.25)
6. The activity helped me remember concepts and vocabulary (N = 89, M = 3.84)
Average mean: 4.28

Section 2 - Appropriateness of design
7. The placemat is the right size (N = 86, M = 4.56)
8. The Jenga tower size is appropriate (N = 86, M = 4.74)
9. Having a two-side placemat is helpful for the players (N = 87, M = 4.21)
10. Having the animals on both ends of the blocks is helpful for the players (N = 86, M = 4.76)
11. The pictures on the placemat and the Jenga blocks are representative of the topic (N = 86, M = 4.55)
12. The placemat does not rip or tear easily (N = 87, M = 4.68)
13. The placemat size is easy to move around (N = 86, M = 4.17)
14. The Jenga set size is easy to move around (N = 86, M = 4.07)
Average mean: 4.47

Section 3 - Organization
15. I easily understood the directions (N = 85, M = 4.15)
16. The activity emphasized key points of the topic (N = 84, M = 4.46)
17. The vocabulary used was just right to my level of knowledge (N = 84, M = 4.27)
18. The number of prediction/event cards was just right (N = 83, M = 4.24)
19. The amount of time needed to finish the activity was just right (N = 84, M = 4.04)
Average mean: 4.23

Section 4 - Rules and playfulness
20. The activity encouraged friendly competition and cooperation (N = 83, M = 4.68)
21. The activity allowed everyone to play fairly (N = 83, M = 4.60)
22. The rules of the activity allow me to make some choice (N = 82, M = 4.51)
23. Participating in the activity was fun (N = 82, M = 4.88)
Average mean: 4.67

Section 5 - Usefulness of lesson
24. The activity helped me review the material (N = 81, M = 4.58)
25. The activity encouraged me to dig deeper into the subject matter (N = 80, M = 4.45)
26. Participating in the activity is a productive use of time (N = 79, M = 4.63)
27. Participating in the activity helped me establish better relationships with the members of the group (N = 79, M = 3.87)
28. I would recommend the activity to my friends (N = 79, M = 4.54)
Average mean: 4.42
In the comments sections of the questionnaire, students expressed a positive attitude toward the game. One third of the students described it as a ‘fun’ activity. Another common comment concerned the number of cards in the game and the time needed: students thought there were too many Event Cards and that there was not enough time. However, we considered that this downside may be an opportunity for students to play the game several times, drawing new ‘questions’ and seeing different outcomes each time, and thereby gaining a more complete understanding of the lesson.
Qualitative. Additionally, each student completed a “What I did/What I learned” handout.
The coding separated the comments about learning into two themes: those too generic to give
any real indication of what was learned and those giving specific reference to what was learned.
Many of the comments about what they learned were very generic. For example:
I learned about the food chain.
We learned about producers and consumers.
I learned new vocabulary.
I learned about the aquatic chain in Peru.
However, 15 of 50 (30%) student responses were highly specific about what was learned
by using the model. For example, students wrote:
I learned that harming one species could bring down everything.
I learned that the food chain is not as sturdy as I thought.
I learned that if something at the bottom of the food chain is moved then everything
above is affected.
I learned that even if you remove something small it can affect the whole chain.
I learned that you should be careful of your environment.
I learned that the anchovy is a key specie.
I learned that animals could increase or decrease if one thing (animal) decreased or
increased.
These comments helped the research team determine that use of the PFCJ activity
appeared to help students with content knowledge and crosscutting concepts. While the generic comments do not indicate that the students learned any specific content, they do indicate that students were able to connect use of the model to the concepts being taught. The
highly specific comments indicate that the model may serve as a useful tool to help students
visualize how changes in one trophic level may affect the entire food chain.
Lastly, when given the opportunity to tell how they would improve the game, no student
offered improvements. Several students wrote that they loved it just as it was presented. One
student wrote, ‘NO improvement needed’. Even the length of time necessary to complete the tower building and the first round of play drew no criticism from the students.
Implications
The goal of this study was to field-test the educational tool for content accuracy, ease of use, and
student approval. Content accuracy is considered important because recent research reveals that students’ ideas about food chains are usually filled with misconceptions (Umphlett et al., 2009).
In addition, researchers have also found that textbooks do not always explain the complexity of
food chains and food webs (Barman & Mayer, 1994). Moreover, The Framework for K-12
Science Education (NRC, 2012) and Next Generation Science Standards (NGSS Lead States,
2013) stress that in addition to disciplinary core ideas (content), students need to engage in the
practices of science and develop an understanding of the crosscutting concepts such as cause and
effect, systems, and scientific modeling.
Based on student and teacher comments, we consider the PFCJ ready to be used as an instructional tool for upper elementary students. However, such a tool should not be adopted unless it can be demonstrated that the content learning of students taught with the tool is equal to or greater than the learning of students taught the same content through traditional methods. At this writing, the research team is examining how use of the PFCJ impacts 5th grade students’ conceptualization of food chains compared to traditional teaching methods, and has collected data using an intervention-group and control-group design within a 5th grade classroom.
Acknowledgements
The authors would like to thank Giri Akkaraju, Matt Chumchal, Aldo Pacheco, and Dean Williams for their help and valuable suggestions. The Peruvian Food Chain Jenga was developed with partial funding from the TCU Idea Factory (www.tcuideafactory.org)
and the Andrews Institute of Mathematics & Science Education in the College of Education at
TCU.
Notes
¹ The Peruvian marine ecosystem represents <0.1% of the total ocean surface but produces around 10% of the total fish catch (as cited in Chavez, Bertrand, Guevara-Carrasco, Soler & Csirke, 2008).
References
Anderson, J.L., & Barnett, M. (2013). Learning physics with digital game simulations in middle
school science. Journal of Science Education & Technology, 22, 914-926.
Bakhtin, M.M. (1981). The dialogic imagination: Four essays by M.M. Bakhtin (M. Holquist, Ed.; C. Emerson & M. Holquist, Trans.). Austin, TX: University of Texas Press.
Baek, H., Schwarz, C., Chen, J., Hokayem, H., & Zhan, L. (2011). Engaging elementary students in scientific modeling: The MoDeLS 5th grade approach and findings. Models and Modeling in Science Education, 6, 195-218.
Baines, L. A., & Slutsky, R. (2009). Developing the Sixth Sense: Play. Educational Horizons,
87, 97-101.
Barab, S.A., & Dede, C. (2007). Games and immersive participatory simulations for science
education: An emerging type of curriculum. Journal of Science Education and
Technology, 16(1), 1-3
Barab, S.A., Sadler, T.D., Heiselt, C., Hickey, D.T., & Zuiker, S. (2007). Relating narrative,
inquiry, and inscriptions: Supporting consequential play. Journal of Science Education
and Technology, 16, 59-82.
Barman, C. R., & Mayer, D. A. (1994). An analysis of high school students’ concepts &
textbooks presentations of food chains & food webs. The American Biology Teacher,
56(3), 160-163.
Chavez, F. P., Bertrand, A., Guevara-Carrasco, R., Soler, P., & Csirke, J. (2008). The northern
Humboldt Current System: Brief history, present status and a view towards the future.
Progress in Oceanography, 79, 95-105.
Clark, D.B., Nelson, B.C., Chang, H.Y., Martinez-Garza, M., Slack, K., & D’Angelo, C. M. (2011). Exploring Newtonian mechanics in a conceptually-integrated digital game: Comparison of learning and affective outcomes for students in Taiwan and the United States. Computers & Education, 57(3), 2178-2195.
Dempsey, J.V., Haynes, L.L., Lucassen, B.A., & Casey, M.S. (2002). Forty simple computer
games and what they could mean to education. Simulation & Gaming, 33(2), 157-168.
De Ruiter, P. C., Wolters, V., Moore, J. C., & Winemiller, K. O. (2005). Food Web Ecology:
Playing Jenga and Beyond. Science, 309, 68-71.
Franklin, S., Peat, M., & Lewis, A. (2003). Non-traditional interventions to stimulate discussion: The use of games and puzzles. Journal of Biological Education, 37, 79-84.
Gee, J. P. (2003). What video games have to teach us about learning and literacy. ACM
Computers in Entertainment, 1(1), 1-4.
Gee, J. P. (2004a). Situated language and learning: A critique of traditional schools. New York,
NY: Routledge.
Gee, J. P. (2004b). Language in the Science classroom. In E.W. Saul (Ed.), Crossing borders in
literacy and science instruction. (pp. 13-32). Arlington, VA: NSTA Press.
Gee, J. P. (2008). What is academic language? In A. S. Rosebery and B. Warren (Eds.),
Teaching science to English language learners: building on students’ strengths. (pp. 57-
69). Arlington, VA: NSTA Press.
Glaser, B.G., & Strauss, A. L. (1967). The Discovery of Grounded Theory: Strategies for
Qualitative Research. New York, NY: Aldine De Gruyter.
Grotzer, T. A., & Basca, B. B. (2003). How does grasping the underlying causal structures of
ecosystems impact students’ understanding? Journal of Biological Education, 38, 16-29.
Gutierrez, A. F. (2014). Development and effectiveness of an educational card game as
supplementary material in understanding selected topics in biology. Life Science
Education, 13, 76-82.
Jacobson, M., & Wilensky, U. (2006). Complex systems in education: Scientific and educational importance and implications for the learning sciences. Journal of the Learning Sciences, 15(1), 11-34.
Kuo, M.J. (2007). How does an online game based learning environment promote students’ intrinsic motivation for learning natural science and how does it affect their learning outcomes? Paper presented at the First IEEE International Workshop on Digital Game and Intelligent Toy Enhanced Learning (DIGITEL ’07), Jhongli, Taiwan.
Lehrer, R. & Schauble, L. (2007). Scientific Thinking and Science Literacy. Handbook of Child
Psychology, IV, 1-5.
Luria, A. R. (1976). Cognitive development: Its cultural and social foundations. Cambridge,
MA: Harvard University Press.
Manz, E. (2012). Understanding the codevelopment of modeling practice and ecological
knowledge. Science Education, 96(6), 1071-1105.
Mayer, R.E. (2011). Multimedia learning and games. In S. Tobias & J.D. Fletcher (Eds.),
Computer games and instruction (pp. 281-305). Charlotte, NC: Information Age.
Mayo, M.J. (2007). Games for science and engineering education. Communications of the ACM,
50(7), 31-35.
National Center for Education Statistics (NCES). (2011). Trends in International Mathematics
and Science Study. Retrieved from https://nces.ed.gov/TIMSS/
National Center for Education Statistics (NCES). (2014). National Assessment of Educational
Progress. Retrieved from https://nces.ed.gov/nationsreportcard/
National Research Council (NRC). (2012). A framework for K-12 science education: Practices,
crosscutting concepts, and core ideas. Washington, D.C.: The National Academies Press.
NGSS Lead States. (2013). Next generation science standards: For states, by states.
Washington, DC: The National Academies Press.
Oceana (2015). GrubHub, It’s Time to Take Shark Fin Soup off the Menu. Retrieved from
http://oceana.org/blog/ceo-note-grubhub-it%E2%80%99s-time-take-shark-fin-soup-menu
Odenweller, C.M., Hsu, C.T. & DiCarlo, S. E. (1998). Educational card games for understanding gastrointestinal physiology. Advanced Physiological Education, 20, S78-S84.
O’Loughlin, M. (1992). Rethinking science education: Beyond Piagetian constructivism toward a
sociocultural model of teaching and learning. Journal of Research in Science Teaching,
29(8), 791-820.
Pinder, P. J. (2008). Utilizing Instructional Games to Improve Students’ Conceptualization of
Science Concepts: Comparing K Students Results With Grade 1 Students, Are There
Differences? Regional Eastern Educational Research Association Conference. Hilton
Head Island.
Rastegapour, H., & Marashi, P. (2012). The effect of card games and computer games on
learning of chemistry concepts. Procedia-Social and Behavioral Science, 31, 597-601.
Rowe, J.P., Shores, L.R., Mott, B.W., & Lester, J.C. (2010). Individual differences in gameplay and learning: A narrative-centered learning perspective (pp. 171-178). Proceedings of the Fifth International Conference on the Foundations of Digital Games, Monterey, CA.
Ruzhitskaya, L., Speck, A., Ding, N., Baldridge, S., Witzig, S., & Laffey, J. (2013). Going virtual … or not: Development and testing of a 3D virtual astronomy environment. Communicating Science: A National Conference on Science Education and Public Outreach, 473, 255.
Schollum, B. (1983). Arrows in science diagrams: Help or hindrance for pupils? Journal of
Research in Science Education, 13, 45-59.
Tran L. U., Payne D. L., & Whitley, L. (2010). Research on Learning and Teaching Ocean and
Aquatic Sciences. Special Report #3. National Marine Educators Association.
Umphlett, N., Brosius, T., Laungani, R., Rousseau, J. & Leslie-Pelecky, D.L. (2009). Ecosystem
Jenga! Science Scope, 33, 57-60.
Vygotsky, L. (1978). Mind in society: The development of higher psychological processes.
Cambridge, MA: Harvard University Press.
Vygotsky, L. (1986). Thought and language. Cambridge, MA: MIT Press.