Rafigh: A Living Media Interface for Speech Intervention
Foad Hamidi
Department of Computer Science and Engineering
Lassonde School of Engineering, York University,
Toronto, Canada M3J 1P3
fhamidi@cse.yorku.ca
Melanie Baljko
Department of Computer Science and Engineering
Lassonde School of Engineering, York University,
Toronto, Canada M3J 1P3
mb@cse.yorku.ca
ABSTRACT
Digital media can engage children in therapeutic and
learning activities. Incorporating living media in these
designs can create feelings of empathy and caring in users.
We present Rafigh, a living media interface designed to motivate children with speech disorders to use their speech to care for a living mushroom colony. The mushrooms' growth communicates how much speech is used during interaction. The main focus of the interface is to motivate children to use their speech as part of interaction.
Author Keywords
Living Media Interfaces; Speech Intervention; Embedded
Computing.
ACM Classification Keywords
H.5.2. User Interfaces
INTRODUCTION
Embedded electronics allow for new ways to interact with living beings. A new wave of hybrid biological interfaces, sometimes referred to as "moistmedia" [2], explores novel ways to engage users by combining digital and biological elements in design. Additionally, therapeutic digital activities have the potential to motivate users, especially children, to perform repetitive and otherwise boring tasks for long periods of time. We bring together these two ideas in an empathetic living media interface that focuses on caring as an interaction goal, to be used in the context of speech intervention and elicitation.
In the face of increasing urbanization and lack of contact
with nature, it is important to design systems that facilitate
a re-connection or at least dialogue around our interaction
with living beings. Many children are naturally fascinated
by animals and plants. By developing technology that encourages and builds on this fascination, we can support children's relationship with nature and the environment around them. Additionally, this area provides opportunities to build caring and empathy into interaction, elements whose absence has been a cause of concern for critics of computer games [1].
The need for regularity in the care of living beings corresponds well with therapeutic and intervention applications that call for repetitive, regular use [17]. In this work, we present a design that asks children to take responsibility for the regular care of living beings by performing tasks that benefit them therapeutically and aid with intervention.
Rafigh is an embedded tangible interface that uses a living
mushroom colony as part of its display, where the growth
rate of the mushrooms corresponds with the amount of
speech practiced through the use of the interface. It is
designed to motivate small children (ages 4-7) with speech
disorders to use their speech to perform a series of digital
activities that results in the irrigation of the mushrooms. We
have chosen to use mushrooms as the living interface
because of their relatively rapid growth (a full cycle typically takes 6-10 days), which is well suited to engaging children. Children who undergo speech intervention typically work with trained Speech-Language Pathologists (SLPs), who assign speech exercises and provide corrective feedback [9]. To inform our design, we interviewed five SLPs who work with
children and also performed a review of extant systems [9].
The results were incorporated into our design. We have
used a holistic design method that emphasizes not only the
therapeutic use of speech but also the promotion of
knowledge about the environment and encouragement of
family and community involvement.
In the next section, we review similar research projects that
examine human-nature interaction through computational
material. We then describe Rafigh, including its design rationale and the results of the SLP interviews that informed it.
BACKGROUND
Our fascination with nature is as old as humanity. Recently,
digital designs have emerged that explore intimate digital-
biological interaction scenarios. While there are relatively
few research projects in this area, they cover a wide
spectrum of interactions with nature. At one end of the
spectrum are biohacking projects that directly manipulate
plant and animal biology, turning them into cyborgs and
allowing control or monitoring of their activities (e.g.,
Cyborg Beetle [18]). On the other end of the research
spectrum are projects that aim to foster nurturing and less
hierarchical relationships with living beings through
technology. For example, in the Botanicalls project, embedded sensors communicated information about a plant's well-being (e.g., moisture level, soil quality) to plant owners via phone calls and Twitter messages [10].
Previous studies have shown that interacting with living
pets can have positive therapeutic and educational benefits.
Caring for pets has been shown to increase children’s self-
esteem [3]. Many parents report that interacting with pets gives children valuable lessons about life events [15]. Other positive outcomes include reduced levels of loneliness [5], stress, and anxiety [19].
The Time to Eat project has examined the effectiveness of
using virtual pets to promote positive behavior change,
specifically healthy eating, in children [17]. During the
period of use, virtual pets request to be fed regularly and
encourage the children to eat breakfast with them. In a user
study with 53 middle school children, it was observed that
children who used the game ate breakfast more frequently.
Children also expressed interest in and attachment to their
pets in interviews. While having virtual pets can be
beneficial in some contexts, we believe there is a qualitative
difference between caring for a living being and a virtual
one. A review of hybrid biological-digital games has
identified several potential benefits, such as enabling care,
education and interspecies awareness [14]. While the
review focuses on games that involve animals, these implications are relevant to our project.
A small number of projects have used living media to
communicate information. PlantDisplay uses plant growth to display the owner's amount of daily communication with friends, collected by monitoring phone call logs [13]. Plant growth is correlated with the amount of communication: the more the owner communicates, the more the plant grows.
Babbage Cabbage uses live red cabbage as an empathetic biological feedback display [8]. Each head of cabbage is viewed as a single organic pixel that can change color based on the pH level of an administered solution. Social and ecological information is communicated to viewers of the system through a range of colors that the cabbage head
displays. The same research group has developed an
ambient empathic interface that uses DNA-transformed E.
coli to communicate information through glowing
microorganisms [6].
Spore consisted of a self-sustaining rubber tree plant that was watered according to the stock price of a corporation [7]. Information about the rising or falling price of the company's stock controlled the amount of water given to the plant, tying the plant's health to the company's financial activity. The project ended when the plant died of overwatering.
Meet Eater consists of a garden of plants that is watered based on activity on its Facebook page [12]: receiving "likes" triggers a watering mechanism. The garden is used as an ambient display that communicates the plants' online social life via their health and growth. Similar to Spore, the plants were overwatered due to increased activity on the Facebook page.
Our system differs from previous living media interfaces in several important ways: a) the growth and wellbeing of the mushrooms are mapped to the actions of the child (as opposed to an abstract form of data, as in the cases of Spore, Babbage Cabbage, and Meet Eater); b) the child is responsible for the well-being of the mushrooms and is thereby empowered; and c) the growth cycle of the mushrooms is short and they are harvested (and potentially eaten) after about a week of care, giving the child a concrete, tangible outcome in the short term, unlike caring for a plant over a long period of time.
RAFIGH: A LIVING MEDIA INTERFACE FOR SPEECH
INTERVENTION
We present Rafigh (which translates to “companion” in
Farsi), an empathetic living media interface for speech
elicitation and intervention for children with speech
disorders. Figure 1 shows Rafigh. The interface consists of a box designed to house a mushroom colony (with its growing side exposed), an irrigation system controlled by a wireless microcontroller, and a housing for an iPad.
Figure 1. Rafigh interface (left): the iPad is mounted on the left
and the mushroom colony peers out of an opening on the right.
The irrigation system (right) is a water pump activated by a
microcontroller.
The child can interact with the digital activity on the iPad at any time, but typically we expect them to use it at least once a day. The SLP can specify the number of repetitions and words needed for each child, determining the length of the interaction. The SLP can also input new images and audio for a word or phrase. Other parameters, such as how often the activity should be done each day and how many repetitions are requested for a word, can also be specified. After a period of use, the child's recorded activities can be reviewed (via segmented and categorized video logs).
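To make these customization options concrete, the sketch below shows one hypothetical way such SLP-set parameters could be represented. The field names and structure (e.g., repetitions_per_word, sessions_per_day) are illustrative assumptions and are not taken from the actual Rafigh software.

```python
# Hypothetical sketch of the per-child settings an SLP might configure.
# Field names and structure are assumptions for illustration only; they do
# not reflect the actual Rafigh implementation.

from dataclasses import dataclass
from typing import List

@dataclass
class WordPrompt:
    word: str          # target word or phrase to be repeated
    image_path: str    # picture shown on the iPad
    audio_path: str    # recorded model pronunciation

@dataclass
class SessionConfig:
    prompts: List[WordPrompt]        # customizable word list (default set: 10 words)
    repetitions_per_word: int = 3    # repetitions the SLP requests per word
    sessions_per_day: int = 1        # how often the activity should be done each day

# Example: an SLP sets up a short animal-themed exercise for one child.
config = SessionConfig(
    prompts=[
        WordPrompt("cat", "images/cat.png", "audio/cat.wav"),
        WordPrompt("dog", "images/dog.png", "audio/dog.wav"),
    ],
    repetitions_per_word=5,
    sessions_per_day=2,
)
```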
The software interface consists of a series of audio and
image prompts that require the user to repeat names of
familiar animals, fruits and vegetables. The current setup
uses a set of words that contain common English phonetic
sounds, but the prompts and images are customizable and
can be changed by the SLP as needed for each client. Once
the user finishes a set of exercises (which currently consists of 10 words but can also be customized), he or she is informed that the mushrooms will be "fed" (i.e., watered). The
irrigation mechanism (shown in Figure 1) consists of an
Arduino microcontroller and a small water pump originally
designed for use in fish tanks. The mushrooms are irrigated for durations calculated based on the amount of the child's input speech. Once irrigation starts, the
mushrooms grow considerably every day (Figure 2).
Figure 2. Rafigh grows every day; after the second or third day,
mushroom growth is clearly visible.
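The paper does not detail how the digital activity communicates with the microcontroller. As a rough host-side illustration only, the sketch below shows one possible way to ask the controller to run the pump for a given duration over a serial link (using pyserial); the port name, baud rate, and the "PUMP <seconds>" command format are assumptions, not the actual Rafigh protocol.

```python
# Hypothetical sketch of triggering the irrigation pump from a host computer.
# The serial port, baud rate, and command format are illustrative assumptions;
# the actual Rafigh system uses a wireless microcontroller whose protocol is
# not described in the paper.

import serial  # pyserial

def run_pump(seconds: int, port: str = "/dev/ttyUSB0", baud: int = 9600) -> None:
    """Ask the microcontroller to run the aquarium pump for `seconds` seconds."""
    with serial.Serial(port, baud, timeout=1) as link:
        link.write(f"PUMP {seconds}\n".encode())  # assumed command protocol

# Example: water the mushrooms for 20 seconds after a completed session.
run_pump(20)
```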
We have set three levels for the amount of water to be
administered to the mushrooms based on how regularly and
thoroughly the child repeats the set of exercise words and
phrases: High, Medium, and Low. High is activated if all words are repeated; Medium, if half of the words are repeated; and Low, if fewer than half are repeated. Feedback is
provided to the child after each repeated word.
The thresholds are set such that the mushroom colony will
always live and grow no matter how little speech is used.
While this might seem counterintuitive, since it provides positive feedback to the child even if they do not practice their speech, we made this decision based on the ethical principle of avoiding the death of the mushrooms at any cost due to over- or under-watering (which can easily happen, as is apparent from the projects reviewed earlier). Thus, we provide the child with positive feedback through the growth and size of the mushrooms, not their life and death.
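As a minimal sketch of this logic, the function below maps the fraction of repeated words to the three watering levels. The per-level durations are assumed values chosen only to illustrate the key property described above: even the Low level delivers some water, so the colony keeps living and growing regardless of how much speech is used.

```python
# Minimal sketch of the three-level watering policy described above.
# The per-level durations (in seconds) are assumptions for illustration;
# the key property is that even "Low" is non-zero, so the mushrooms are
# always watered and never die because exercises were skipped.

WATERING_SECONDS = {"High": 30, "Medium": 20, "Low": 10}  # assumed values

def watering_level(words_repeated: int, words_assigned: int) -> str:
    """Map the proportion of repeated exercise words to a watering level."""
    fraction = words_repeated / words_assigned if words_assigned else 0.0
    if fraction >= 1.0:
        return "High"
    if fraction >= 0.5:
        return "Medium"
    return "Low"

# Example: a child repeated 4 of 10 assigned words -> "Low", but still watered.
level = watering_level(4, 10)
seconds = WATERING_SECONDS[level]
```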
We use a mushroom colony developed by the Back to the Roots Company and designed for educational purposes [3]. The mushrooms are edible and can be consumed by the child's family and friends after growth. Therefore, another aspect of the design is that it empowers the child through food production. The idea of incorporating a living being as part of the interface was inspired by our observations of several children among our family and friends, and by informal conversations with their parents, who told us that the children are generally interested in living beings and nature. While there is merit in including rewards such as badges and scores in games, we wanted to experiment with caring as a mode of interaction, and with the health and growth of the living component as an alternative (and possibly more meaningful) form of reward.
Rafigh’s design is informed by interviews with five SLPs
who work with children [9]. All of the SLPs agreed that
developing a digital media toy for speech elicitation would
be useful. Three SLPs regularly use props such as dolls and physical toys, as well as images and flash cards, to engage children. They emphasized that for young children (ages 4 to 7), toys that can be grasped and touched, and that are durable, are recommended. Tangible language intervention games have proven promising for children [11].
A key design idea was to focus on engagement rather than
the generation and presentation of automatic feedback. All
SLPs indicated that having no feedback, or a small amount of feedback that is consistent and accurate, is preferable to having inconsistent or incorrect feedback, especially in the absence of the SLP, who can mediate between the technology and the child. However, they recommended having a measure of progress so that not all speech is rewarded equally. This confirmed the results of a review of extant computational intervention systems, which showed that using automatic speech recognition to provide corrective feedback to children is technically challenging and might backfire by providing incorrect and inconsistent feedback, owing to current systems' limited ability to analyze non-standard speech [9]. On the other hand, projects that focus on motivating and engaging speech rather than analyzing it automatically have shown promising results [16]. Thus, we decided to use
the interface to motivate and engage children to use their
speech and record it for future analysis by a qualified SLP.
The recording of speech samples was recommended for
other reasons as well: One SLP noted that capturing the
child’s natural speech (i.e., speech spoken in the absence of
the SLP) would be helpful in assessing intervention needs.
Our system is designed such that, once set, it can be used in
the absence of the SLP. Another SLP records samples of
her clients’ speech during some of her sessions. She uses
these samples for future comparison of intervention
outcomes and analysis of speech in the absence of the
client. In addition to recording speech, all the SLPs suggested that automatic tracking and record keeping of exercises are useful functions that a computational toy could provide; we have built these functions into the design.
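As a rough illustration of what such record keeping could look like, the sketch below logs each exercise attempt with a timestamp and the path of the associated recording so an SLP can review progress later. The fields, file names, and storage format are assumptions for illustration, not the paper's implementation.

```python
# Hypothetical sketch of automatic exercise tracking and record keeping.
# The fields and the JSON-lines storage format are illustrative assumptions.

import json
from datetime import datetime, timezone

LOG_PATH = "rafigh_exercise_log.jsonl"  # assumed log location

def log_attempt(word: str, repetitions: int, recording_path: str) -> None:
    """Append one exercise attempt so the SLP can review progress later."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "word": word,
        "repetitions": repetitions,
        "recording": recording_path,  # speech sample kept for later SLP analysis
    }
    with open(LOG_PATH, "a") as f:
        f.write(json.dumps(entry) + "\n")

# Example: the child repeated "apple" three times in today's session.
log_attempt("apple", 3, "recordings/apple_session1.mov")
```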
Another important recommendation from the SLPs was to make the application customizable, as each client is unique. Given the multicultural context in which they work (Toronto, Canada), three of the interviewed SLPs recommended support for multilingual contexts. In addition to the customizations mentioned above, support for multiple languages will be incorporated into future versions of the software.
The use of an iPad as part of the interface was initially
motivated by the observation that two of the SLPs already
use iPads to play games that engage speech. They believe
the use of tablets will become more prevalent among SLPs
and it is important to incorporate these tools into future
designs. Surprisingly, they preferred activities that engage speech through play but are not specifically developed for speech intervention and have simple interfaces (e.g., My PlayHome). One SLP commented that she prefers to use non-computational material during intervention because too much technology can be distracting for the children. We used a simple digital activity on the iPad and decoupled it as much as possible from the living media through a modular design, so that in the future SLPs can use a variety of games and activities of their choice to trigger the irrigation mechanism.
Another factor that led to the inclusion of the iPad was an
unsuccessful experience with an earlier prototype that relied
on LED lights and a bubble blower that were activated by
input speech to engage the user. When one of the SLPs
invited us to show it to one of her clients, a 4-year-old boy
with speech delays, the prototype failed to engage the child.
We realized that we needed engaging video and audio
components that a tablet such as the iPad can provide.
CONCLUSION AND FUTURE WORK
We have presented Rafigh, a living media interface for
children with speech disorders that encourages them to use
their speech to care for a living mushroom colony.
In the future, we plan to conduct user studies with the interface to examine its usefulness and impact, not only in terms of speech intervention, but also in terms of user satisfaction and experience. Additionally, we plan to explore its use in other applications, such as second language learning, speech intervention for adult populations, and speech banking.
REFERENCES
1. Anderson, C. A., Shibuya, A., Ihori, N., Swing, E. L.,
Bushman, B. J., Sakamoto, A., and Saleem, M. (2010).
Violent video game effects on aggression, empathy, and
prosocial behavior in eastern and western countries: a
meta-analytic review. Psychological Bulletin 136, 2,
151.
2. Ascott, R. (2007). Telematic Embrace: Visionary Theories of Art, Technology, and Consciousness. University of California Press, Berkeley, CA.
3. Back to the Roots. http://www.backtotheroots.com
4. Bloch, L. R., and Lemish, D. (1999). Disposable love: The rise and fall of a virtual pet. New Media & Society 1, 3, 283-303.
5. Calvert, M. M. (1989). Human-pet interaction and loneliness: a test of concepts from Roy's adaptation model. Nurs Sci Q 2, 4, 194-202.
6. Cheok, A. D., Kok, R. T., Tan, C., Newton Fernando, O.
N., Merritt, T., and Sen, J. Y. P. (2008). Empathetic
living media. In Proc. DIS 2008, ACM Press, 465-473.
7. Easterly, D. (2004). Bio-Fi: inverse biotelemetry projects. In Proc. MM 2004, ACM Press, 182-183.
8. Fernando, O. N., Cheok, A. D., Merritt, T., Peiris, R. L.,
Fernando, C. L., Ranasinghe, N., and Karunanayaka, K.
(2009). Babbage cabbage: Biological empathetic media.
VRIC Laval Virtual Proceedings, 363-366.
9. Hamidi, F., and Baljko, M. (2013). Automatic Speech Recognition: A Shifted Role in Early Speech Intervention. In Proc. SLPAT 2013, 55-61.
10. Hartman, K. (2006). Botanicalls: The plants have your number. http://www.botanicalls.com/
11. Hengeveld, B., Hummels, C., van Balkom, H., Voort,
R., and de Moor, J. (2013). Wrapping up LinguaBytes,
for now. In Proc. TEI 2013. ACM Press, 237-244.
12. Isai, B., and Viller, S. (2010). Meet Eater: affectionate
computing, social networks and human-plant
interaction. In Proc. OzCHI 2010, ACM Press, 414-415.
13. Kuribayashi, S., and Wakita, A. (2006). PlantDisplay:
turning houseplants into ambient display. In Proc.
SIGCHI 2006, ACM Press, 40.
14. Lamers, M. H., and van Eck, W. (2012). Why simulate?
hybrid biological-digital games. In Proc. Applications of
Evolutionary Computation 2012, Springer Berlin
Heidelberg, 214-223.
15. Levinson, B. (2001). The child and his pet: A world of non-verbal communication. Ethnology and non-verbal communication in mental health 12, 4, 63-83.
16. Mitra, S., Tooley, J., Inamdar, P. and Dixon, P. (2003).
Improving English pronunciation - an automated
instructional approach. Information Technologies and
International Development 1, 1, 75-84.
17. Pollak, J., Gay, G., Byrne, S., Wagner, E., Retelny, D.,
and Humphreys, L. (2010). It's time to eat! Using
mobile games to promote healthy eating. Pervasive
Computing 9, 3, 21-27.
18. Sato, H., Berry, C. W., Casey, B. E., Lavella, G., Yao,
Y., VandenBrooks, J. M., and Maharbiz, M. M. (2008).
A cyborg beetle: insect flight control through an
implantable, tetherless microsystem. In Proc. MEMS
2008, 164-167.
19. Wilson, C. C. (1991). The pet as an anxiolytic intervention. J Nerv Ment Dis 179, 8, 482-489.