Interactive Design by Children:
A Construct Map for Programming
Alexandria K. Hansen1, Hilary A. Dwyer1, Charlotte Hill2, Ashley Iveland1, Timothy Martinez2,
Danielle Harlow1, Diana Franklin2
1Department of Education, Gevirtz Graduate School of Education, UC Santa Barbara, Santa Barbara, CA 93106-9490
2Computer Science Department, 2104 Harold Frank Hall, UC Santa Barbara, Santa Barbara, CA 93106-9490
{akillian, hdwyer, aockey, dharlow}@education.ucsb.edu, {tmartinez}@umail.ucsb.edu, {charlottehill, franklin}@cs.ucsb.edu
ABSTRACT
In this paper, we present our analysis of 92 fourth graders’ digital
story projects completed in LaPlaya, a Scratch-like programming
environment. Projects were analyzed for the way that students
programmed the start of the story, and if the program integrated
user-centered design by providing instruction to the user on how
to interact with the digital story. We found that fourth grade
students rarely used user-centered design while creating digital
stories in our block-based programming environment. Without
explicit instruction, simultaneously learning to program and
programming for an abstract user may be too cognitively
demanding for the average fourth grader.
Categories and Subject Descriptors
D.1.7 [Programming Techniques]: Visual Programming; K.3.2
[Computer and Information Science Education]: Computer
Science Education.
General Terms
Design, Human Factors, Theory
Keywords
Graphical programming; Computer science education; Elementary
school; Interactive; Design; Scratch
1. INTRODUCTION
Coding has taken K-12 education by storm. The increase in
popularity and prevalence of graphical, student-friendly
programming environments has greatly increased the number of
students who are coding. A 2014 New York Times article [1]
claimed that 20,000 K-12 teachers nationwide have introduced
coding in their classrooms, and 30 school districts plan to add
coding in the fall of 2015. Over 4.5 million 4th-6th grade students
participated in this fall’s Hour of Code, hosted by Code.org [2].
As coding becomes integrated into the traditional school day,
research-based findings need to inform the instructional design of
curriculum and accompanying resources for students and teachers.
The popularity of block-based programming environments is no
surprise; these interfaces reduce the cognitive load required of
novice student programmers. Block-based programming
environments such as Scratch [3] and LaPlaya [4] reduce typing
requirements, remove potential for syntax errors, and provide
visual cues such as block color and shape. In these environments,
students drag and drop blocks (representing commands) to create
code (scripts). Each script begins with an event (e.g., “when space
bar is pressed”) and follows with a sequence of action blocks
(e.g., “move 10 steps,” “turn right”). The scripts are organized by
sprite (a programmable agent which is often an image of a person,
animal, or object), and each sprite’s code is shown next to a stage
that displays the visual output of the program. Block-based
programming environments are designed for creating interactive
projects that engage users, providing an opportunity to explore
novice programming along with novice interaction design.
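For illustration, a script of this form might read as follows (a minimal sketch transcribed in standard Scratch block wording, not taken from any student project; LaPlaya's palette may differ slightly):

  Sprite "Cat"
    when space key pressed          (event block that triggers the script)
    move 10 steps                   (action blocks then run in sequence)
    turn right 15 degrees
    say "Hello!" for 2 seconds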
In this paper, we present our analysis of 92 fourth grade students'
digital story projects. These fourth graders (aged 9-10) had
recently completed a 16-hour curricular module designed to teach
computational thinking concepts and computer programming
skills. The digital story was the final, culminating project for the
first module. Digital storytelling “allows computer users to
become creative storytellers” [5] by using their knowledge of a
variety of forms of technology; in our case, students created their
stories by programming in the LaPlaya interface, a modified version of
Scratch. Digital stories consist of multiple characters (sprites) and
scenes. The characters can be programmed to interact by dialog or
actions triggered by motions of sprites or other actions on the
interface. Triggers include pressing keys, clicking on sprites,
timing, or messages passed between the characters.
We focus our analysis on two aspects of the interactive design.
First, we analyzed the sophistication of programming
implemented by children (e.g., key clicks, broadcasting
messages). Second, we analyzed children’s use of user-centered
design by examining how those control choices impacted the user.
User-centered design is the “design processes in which end-users
influence how a design takes place” [6]. In the context of digital
stories, we consider user-centered design as providing an
interactive experience with explicit instruction to the user about
how to progress through the story.
While extensive user-centered design has informed the
development of many block-based programming environments to
ensure the design and available tools are developmentally
appropriate for a target age group [3, 4, 7, 8], scant research exists
on how young students integrate user-centered design in their own
block-based programming. We found no existing elementary
school curricula that included explicit lessons or activities to teach
students to design for a user, rather than for themselves, when programming. In this paper, we
present an initial categorization of the ways that children
controlled their digital stories and the ways in which they
integrated user-centered design without explicit, prior instruction.
2. RELATED WORK
Seymour Papert’s learning theory of Constructionism [9]
motivates this work. Like Papert, we believe that individuals learn
best when they are actively constructing an entity for public
consumption; in our case, students created digital stories. Papert’s
theory shares roots with Piaget’s theory of constructivism. Piaget
believed that individuals construct knowledge through
experiences, and those experiences are then sorted into cognitive
schemes [10]. As more complex experiences occur, new
information is integrated into pre-existing cognitive schemes.
Some researchers have recently questioned when students should
begin programming [11]. Piaget’s developmental stages can help
answer this question. While many scholars no longer view
Piaget’s stages as fixed, they still provide a basic model for what
children are able to accomplish, and at what age. According to
Piaget’s model, children from approximately 7 to 11 years old are
in the concrete operational period and are acquiring increased
physical dexterity. We contend that students at this age are able to
interact with computers through actions such as clicking and
dragging, which have proven to be a limiting factor for younger
children [12]. Additionally, children at this age are starting to be
able to “represent transformations as well as static states” [13], an
important component of programming. However, children in
the concrete operational period still struggle with abstract
reasoning. It is not until the formal operational stage is reached at
age 11-12 that children are capable of using abstract and
systematic thinking, necessary skills for computer science.
Following Piaget’s model, our participants (aged 9-10) were
approaching the formal operational period. We believe this was an
ideal time to begin teaching programming, with age-appropriate
instruction. Similarly, Duncan, Bell and Tanimoto [11] concluded
that while “there might be a limit on the level of abstraction that
students of this age can naturally work with…a different
pedagogical approach” can still support their learning.
User-centered design involves “understanding the user and his or
her experiences,” an abstract idea. Some claim that this can be
achieved through engaging with empathetic design [14]. Cognitive
empathy is described as “intellectually taking the role or
perspective of another person” [15]. Integrating cognitive
empathy as a pedagogical tool may support children’s use of user-
centered design while programming.
3. RESEARCH METHODS
This study is part of a larger project in which we are developing a
computational thinking curriculum and researching how 4th-6th
grade students learn programming. As stated above, we use a
modified version of Scratch, called LaPlaya, which was developed
to be user-friendly and age-appropriate for upper elementary
school students in classroom settings. For more information about
our modifications to Scratch, see [4], and for more information
regarding our curriculum, see [16].
This larger study was informed by design-based research methods
and used both qualitative and quantitative data analysis. Design-
based research studies [17] simultaneously inform the
development of curriculum, research, and practice [18], allowing
improvements for curriculum and practice as more is learned
about student learning.
3.1 Data Collection
In the 2013-2014 school year, we piloted our curriculum in fifteen
4th-6th-grade classrooms at five schools across California. At the
two schools located furthest away (nine classrooms), we collected
only student projects. In the remaining three schools (six
classrooms), we filmed classroom instruction and interviewed
teachers and students to iteratively inform the curriculum and
programming environment. The schools had varying numbers of
classrooms, grades participating, start dates, and order of
curriculum. Across participating schools, 2% to 82% of students were
designated English language learners, and 4% to 100% qualified for
free or reduced-price lunch. Schools generally had equal
numbers of female and male students. For this study, we analyzed
only the final projects. Students were allowed to select the topic
for their final digital stories.
Overall we collected 103 digital stories from fourth grade
students. We omitted eleven stories because they either did not
have human subjects permission or children did not write any
code. Thus our data set consisted of 92 digital stories. We
reviewed a subset of these by watching videos of children
presenting their stories and inspecting the children’s projects to
create an initial construct map and related coding scheme. We
used this to create an automated analysis of all 92 projects [19].
This was augmented by visual inspection of each project to
determine if children provided directions to the user.
3.2 Developing the Construct Map
We identified two sub-constructs related to interactive design by
children: 1) User-Centered Design and 2) Sophistication of
Programming. We assigned levels using letters for user-centered
design and numbers for sophistication of programming.
We defined “user-centered” to mean that the student provided a
mechanism for a user to control the actions of the program (e.g.,
by clicking on sprites or pressing keys) and provided instruction for
doing so. Instruction could be provided by typing instructions
onto a background scene (e.g., “Click cat to begin program!”) or
by creating buttons or sprites that say, “Click me”. Our qualitative
analysis revealed three levels: (A) Programmer-controlled, (B)
Non-Interactive, and (C) Interactive. In programmer-controlled
programs, multiple sprites were controlled in different ways, but
the control was unintuitive and, as such, the programmer needed
to be present to run the program. Non-interactive programs had a
single, clear mechanism for running the program (e.g., clicking
the green flag) and included multiple events, but no input was
required from either the user or the programmer after the initial
event. Finally, interactive programs prompted the user to trigger
multiple events and provided clear instructions to do so.
The second construct, sophistication of programming, related to
the types of programs that students created: Level 1: Simple,
Level 2: When key pressed/sprite clicked, Level 3: Timing, and
Level 4: Message Passing. Level 1 programs had two attributes:
students used only one event (controlled by clicking on green
flag) and the scripts contained fewer than two blocks. These
projects were all non-interactive; events were triggered on the
green flag and required no further input from the user or
programmer. Level 2 programs included multiple sprites that were
controlled completely independently either by pressing keys or by
clicking on sprites. Level 3 programs included multiple sprites
that appeared to be coordinated because the timing had been
manipulated. These programs hard-coded the timing through wait
blocks. Lastly, in Level 4 programs sprites interacted by passing
messages using the broadcast and receive blocks. This mechanism
guaranteed the order in which sprites would run, representing the
highest sophistication of programming coordination.
4. FINDINGS
In this section, we describe the ways that students combined user-
centered design and programming sophistication to create their
digital stories. All combinations are shown in Table 1, with
percentages of student projects that fit each category.
4.1 Mechanisms of Control in Digital Stories
Although there were three user-centered design categories and
four programming sophistication levels in our construct map, we
could not distinguish all twelve possible combinations in this
study (these are marked with “n/a” in Table 1). Simple projects
(Level 1) ran when the green flag was clicked, the default in
LaPlaya. A user could click the green flag without the
programmer being present, but we could not determine if this was
intended with so few blocks present. Thus, we could not classify a
Level 1 project as programmer-controlled though this may have
been possible. Students who started their programs on a pressed
key or clicked sprite (Level 2) already engaged the user through
their event choices so we could not identify non-interactive
attributes. Lastly, students who implemented timing and
broadcast/receive blocks (Level 3 and 4) used coordinated action
without the input of the programmer. By definition, these
programs could not be programmer-controlled.
4.1.1 Level 1B: Simple, Non-Interactive
In this level, action among multiple sprites occurred when the
green flag was clicked. This was the standard way that programs
were run in LaPlaya, but our curriculum focused on teaching a
variety of ways that students could start programs. Level 1
projects used only one or two sprites with few completed scripts.
All observed Level 1 projects were characterized as Level 1B
(12%) because the projects had a clear mechanism for running the
program (the green flag), but no additional input was required
from either the user or programmer after the initial event.
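A representative Level 1B project might contain nothing more than the following (a hypothetical sketch in standard Scratch block wording, not an excerpt from a student project):

  Sprite "Girl"
    when green flag clicked
    say "Once upon a time..."

  Sprite "Dog"
    when green flag clicked
    say "Woof!"

Clicking the green flag runs both scripts, and no further input is expected from anyone.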
4.1.2 Level 2A: Multiple Independent, Programmer-
controlled Events
In this level, the digital story was programmed so that actions
occurred when specific keys or sprites were clicked, but such
actions were not made explicit to the user. For example, when the
letter “A” was pressed, the first character said “Hello”, and when
“B” was pressed, a second character responded. Unless explicit
instruction was provided, the user would not know to press “A”
and “B” on the keyboard. We found 50% of projects fell into this
category.
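The “A”/“B” exchange described above might be programmed as follows (a hypothetical sketch in standard Scratch block wording):

  Sprite "Character 1"
    when a key pressed
    say "Hello" for 2 seconds

  Sprite "Character 2"
    when b key pressed
    say "Hi! How are you?" for 2 seconds

Nothing on the stage tells the user which keys to press, so only the programmer can reliably run the story.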
4.1.3 Level 2C: Multiple Independent, User-
controlled Events
In this level, the digital story was programmed so that actions
occurred when specific keys or sprites were clicked, and
instructions or other visual cues were provided to the user so s/he
could control the program. For example, a student could program
a car to move in each direction when the corresponding arrow key
is pressed, and provide instruction to the user about how to
control the car. We did not see any student projects that fit this
description (0%).
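A hypothetical Level 2C version of the car example would pair the key-press scripts with an on-screen direction (again, a sketch in standard Scratch block wording, not an observed project):

  Stage backdrop (text typed by the student): "Use the arrow keys to drive the car!"

  Sprite "Car"
    when right arrow key pressed
    point in direction 90          (face right)
    move 10 steps

  Sprite "Car" (second script on the same sprite)
    when left arrow key pressed
    point in direction -90         (face left)
    move 10 steps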
4.1.4 Level 3B: Non-Interactive with Timed
Coordination
In this level, the story was triggered by a single event (such as
clicking the green flag, a sprite, or a key), but, unlike Level 1B,
multiple sprites performed actions. They were coordinated with a
set of “wait” blocks or “say __ for __ seconds” blocks. Students
may have programmed two characters to converse by creating
scripts for each character’s talk using “say” blocks and “wait” blocks
to control the timing of when words appeared. This level of
control required the student to pre-plan what would be said, but
the actual timing was a process of guess and check to see if the
story played correctly. Other than the initial command to start the
program, this level was non-interactive for the user. We found 5%
of projects fell into this category.
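A two-character conversation of this kind might be hard-coded as follows (a hypothetical sketch; the wait times are typically found by guess and check):

  Sprite "Knight"
    when green flag clicked
    say "Who goes there?" for 2 seconds
    wait 2 seconds                  (pause while the other character speaks)
    say "You may pass." for 2 seconds

  Sprite "Dragon"
    when green flag clicked
    wait 2 seconds                  (pause while the knight speaks)
    say "It is only me, the dragon." for 2 seconds

If either character's timing changes, the dialog falls out of sync, which is exactly the fragility students managed by trial and error.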
A program could be categorized as Level 3C if direction was
provided to the user about which sprites or keys to press to
progress through the story, but we did not observe any student
projects that did this (0%).
4.1.5 Level 4B: Non-Interactive with Message
Coordination
In this level, the story was controlled by sprites sending and
receiving messages. The visual effect of Level 3B and Level 4B
was the same; only the programming distinguished them.
Messages replaced the hard-coded timing blocks. When sprites
completed a designated action, they broadcasted an invisible
message, which triggered actions in other sprites. This required
the student to plan and coordinate actions among multiple sprites.
The story may have started by clicking the green flag, key, or
sprite. Other than beginning the story, this mode was non-
interactive for the user. We found 32% of projects fell into this
category.
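The same conversation in a Level 4B project replaces the hard-coded waits with messages (a hypothetical sketch in standard Scratch block wording):

  Sprite "Knight"
    when green flag clicked
    say "Who goes there?" for 2 seconds
    broadcast "knight done"         (signal the dragon to respond)

  Sprite "Dragon"
    when I receive "knight done"
    say "It is only me, the dragon." for 2 seconds
    broadcast "dragon done"

  Sprite "Knight" (second script)
    when I receive "dragon done"
    say "You may pass." for 2 seconds

Because each script waits for a message rather than a fixed time, the order of the dialog is guaranteed no matter how long each line is displayed.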
4.1.6 Level 4C: Interactive and Coordinated
In this level, the student included explicit instruction to the user
about how to interact with the program. The story could be
initiated in a variety of ways, such as clicking the green flag,
sprite, or key. Additional instruction was provided to the user
about how to progress through the story. The program may have
provided instructions such as “Click on the sprites 1, 2, and 3 to
move on,” or “Press keys 1, 2 and 3 to see each scene of the
story.” Without explicit instruction, only 1% of students fell into
this category (n=1). Figure 1 shows an example of a digital story
that involved the user. None of the other student projects analyzed
included explicit instructions to the user.

Table 1. Interactive Design Control Construct Map for Programming
(rows: sophistication of programming; columns: user-centered design)

Sophistication of Programming        (A) Programmer-Controlled   (B) Non-Interactive   (C) Interactive
Simple                               N/A                         Level 1B (12%)        Level 1C (0%)
When Key Pressed or Sprite Clicked   Level 2A (50%)              N/A                   Level 2C (0%)
Timing Blocks                        N/A                         Level 3B (5%)         Level 3C (0%)
Broadcast/Receive Messages           N/A                         Level 4B (32%)        Level 4C (1%)
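For reference, a hypothetical Level 4C project might combine message passing with explicit directions to the user; the sketch below uses standard Scratch block wording and is not a transcription of the single observed project:

  Stage backdrop (text typed by the student): "Click the wizard to start. Then press 1, 2, and 3 to see each scene."

  Sprite "Wizard"
    when this sprite clicked
    say "Welcome to my story!" for 2 seconds

  Sprite "Princess"
    when 1 key pressed
    broadcast "scene 1"             (coordinate the other sprites)
    say "Once upon a time..." for 2 seconds

  Sprite "Dragon"
    when I receive "scene 1"
    say "...a dragon guarded the castle." for 2 seconds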
5. DISCUSSION
Our analysis demonstrates that students used a variety of events
and coordination techniques to create their stories. However, we
also found that, perhaps not surprisingly, when not specifically
prompted, students tended to view their project from their own
perspective, omitting interactive features that could be understood
by an outside user. Very few employed user-centered design in
the absence of instruction.
We propose several possible explanations to explore in further
research. Fourth grade students were not prompted to choose a
particular user as their audience. Perhaps an abstract user was too
difficult to design for, and prompting students to think about a
particular person might have improved their designs. A second
explanation relates to the programming environment. In LaPlaya,
the development environment is exposed at the same time as the
runtime environment. The scripts were visible when students ran
their program. Students could have viewed the user as someone
with access to the scripts, making explicit instructions
unnecessary. This could further confuse students about the
relationship between development and deployment if/when they
transition to more traditional text-based languages.
The categorization presented here suggests a preliminary learning
progression for user-centered design that can be further articulated
and tested. We chose to focus on fourth grade students to identify
the lower anchor point of the learning progression. This analysis
led to concrete revisions to our curriculum for the 2014-2015
school year. We added engineering design lessons to teach
students the process of design, including thinking about the user
in the program they are creating. We also created demonstrations
that showed students examples of projects that did and did not
include instructions to the user.
6. ACKNOWLEDGEMENTS
This work is supported by the National Science Foundation CE21
Award CNS-1240985. We are grateful to the teachers and
children who participated in this project.
7. REFERENCES
[1] Richtel, M. 2014. Reading, writing, arithmetic, and lately,
coding. The NY Times.
[2] Alvarado, C. 2014. CS Ed Week 2013: The hour of code.
ACM SIGCSE Bull, 46, 1, 2-4.
[3] Resnick, M., Maloney, J., Monroy-Hernandez, A., Rusk, N.,
Eastmond, E., Brennan, K., Millner, A., Rosenbaum, E.,
Silver, J., Silverman, B., Kafai, Y. 2009. Scratch:
Programming for all. Commun ACM, 52, 11, 60-67.
[4] Hill, C., Dwyer, H. A., Martinez, T., Harlow, D., & Franklin,
D. 2015. Floors and flexibility: Designing a programming
environment for 4th-6th grade classrooms. In SIGCSE ’15.
ACM.
[5] Robin, B. 2008. Digital storytelling. Theor Pract, 47, 220-
228.
[6] Ko, A. J., Abraham, R., Beckwith, L., Blackwell, A.,
Burnett, M., Erwig, M. & Wiedenbeck, S. 2011. The state of
the art in end-user software engineering. ACM Comput Surv,
43, 3, 21.
[7] Flannery, L. P., Silverman, B., Kazakoff, E. R., Bers, M. U.,
Bontá, P, & Resnick, M. 2013. Designing ScratchJr: Support
for early childhood learning through computer programming.
In IDC ’13. ACM.
[8] Cooper, S. 2010. The design of Alice. ACM T Computing
Education, 10, 4, 15.
[9] Papert, S. & Harel, I. 1991. Situating constructionism.
Constructionism, 36, 1-11.
[10] Driver, R., Asoko, H., Leach, J., Mortimer, E., & Scott, P.
1994. Constructing scientific knowledge in the classroom.
Edu Researcher, 23, 7, 5-12.
[11] Duncan, C., Bell, T., & Tanimoto, S. 2014. Should your 8-
year-old learn coding? In WiPSCE ‘14. ACM.
[12] Donker, A., & Reitsma, P. 2007. Young children’s ability to
use a computer mouse. Comput & Edu, 48, 4, 602-617.
[13] Siegler, R. & Alibali, M.W. 2005. Children’s thinking.
Pearson Prentice Hall, New Jersey.
[14] Kouprie, M. & Visser, F.S. 2009. A framework for empathy
in design: Stepping into and out of the user’s life. J Eng
Design, 20, 5, 437-448.
[15] Mead, G.H., 1934. Mind, self and society. Chicago, IL:
University of Chicago Press.
[16] Franklin, D., Harlow, D., Dwyer, H., Henken, J., Hill, C.,
Iveland, A., Killian, A., & Development Staff. 2014. Kids
Enjoying Learning Programming (KELP-CS)- Module 1
Digital Storytelling. Available at
https://discover.cs.ucsb.edu/kelpcs/educators/KELPCSIntro.p
df
[17] Barab, S. & Squire, K. 2004. Design-based research: Putting
a stake in the ground. J Learn Sci, 13, 1, 1-14.
[18] Brown, A.L. 1992. Design experiments: Theoretical and
methodological challenges in creating complex interventions
in classroom settings. J Learn Sci, 2, 2, 141-178.
[19] Boe, B., Hill, C., Len, M., Dreschler, G., Franklin, D.,
Conrad, P. 2013. Hairball: Lint-inspired static analysis of
Scratch projects. In SIGCSE ’13. ACM.
Figure 1. A Level 4C student project.