Emotional Intelligence: Affective Computing in Architecture and Design

To read the full-text of this research, you can request a copy directly from the author.


What if material interfaces could adapt physically to the user's emotional state in order to develop a new affective interaction? By using affective computing technologies to track facial expressions, material interfaces can help to regulate emotions. They can serve either as a tool for intelligence augmentation or as a means of building an empathic relationship by developing an affective loop with users. This paper explores how color and shape change can be used as an interactive design tool to convey emotional information, illustrated by two projects: one at the intimate scale of fashion, and one at a more architectural scale. By engaging with design, art, psychology, computer science, and material science, this paper envisions a world where materials can detect the emotional responses of a user and reconfigure themselves in order to enter into a feedback loop with the user's affective state and influence social interaction.
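The affective loop described above can be sketched as a simple sense-decide-actuate cycle: a recognized emotion drives the material's color and shape parameters, which in turn aim to influence the user's state. The emotion labels, color values, and curvature parameter below are illustrative assumptions, not details taken from the paper:

```python
# Sketch of an affective loop for a color- and shape-changing material.
# All emotion labels and actuation values here are hypothetical.

# Hypothetical mapping from a recognized emotion to actuation targets:
# an RGB color and a curvature parameter (0 = angular, 1 = rounded).
EMOTION_TO_ACTUATION = {
    "joy":     {"color": (255, 200, 60),  "curvature": 0.9},  # warm, rounded
    "anger":   {"color": (200, 40, 40),   "curvature": 0.1},  # hot, angular
    "sadness": {"color": (60, 80, 180),   "curvature": 0.6},  # cool, soft
    "neutral": {"color": (180, 180, 180), "curvature": 0.5},
}

def actuate(emotion: str) -> dict:
    """Return the color/shape command for a detected emotion.

    Unknown labels fall back to the neutral configuration.
    """
    return EMOTION_TO_ACTUATION.get(emotion, EMOTION_TO_ACTUATION["neutral"])

def affective_loop(detected_emotions) -> list:
    """One pass of the feedback loop over a stream of detections."""
    return [actuate(e) for e in detected_emotions]
```

In a real system the detection side would come from a facial-expression tracker and the actuation side from the material's controllers; the dictionary above only stands in for that mapping.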


... Shape-change is increasingly relevant in affective interaction research [2,66], with potential application areas for affective aspects of shape-change including therapeutics and social robotics where it has been demonstrated that shape-change functions can broaden communication capabilities [11,20,33]. As shape-change research expands and as applications proliferate, there is thus a real need to increase our understanding of affective responses to shape-change. ...
Conference Paper
Full-text available
With the proliferation of shape-change research in affective computing, there is a need to deepen understanding of affective responses to shape-change displays. Little research has focused on affective reactions to tactile experiences in shape-change, particularly in the absence of visual information. It is also rare to study responses to shape-change as it unfolds, isolated from a final shape-change outcome. We report on two studies on touch-affect associations, using the crossmodal "Bouba-Kiki" paradigm, to understand affective responses to shape-change as it unfolds. We investigate experiences with a shape-change gadget as it moves between rounded ("Bouba") and spiky ("Kiki") forms. We capture affective responses via the circumplex model, and use a motion analysis approach to understand the certainty of these responses. We find that touch-affect associations are influenced by both the size and the frequency of the shape-change and may be modality-dependent, and that certainty in affective associations is influenced by association-consistency.
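A continuous rounded-to-spiky ("Bouba" to "Kiki") morph of the kind the gadget moves through is often parameterized as a star-shaped radial profile. The formula below is an assumption for illustration, not the device's actual mechanism; the spikiness amplitude and lobe count loosely correspond to the size and frequency factors the study examines:

```python
import math

def radial_profile(theta: float, spikiness: float, lobes: int = 8) -> float:
    """Radius of the outline at angle theta.

    spikiness = 0.0 gives a circle ("Bouba"); larger values add
    sharp lobes ("Kiki"). `lobes` sets how many lobes appear around
    the outline.
    """
    return 1.0 + spikiness * math.sin(lobes * theta)

def outline(spikiness: float, lobes: int = 8, n: int = 64) -> list:
    """Sample the closed outline as n (x, y) points."""
    pts = []
    for i in range(n):
        theta = 2 * math.pi * i / n
        r = radial_profile(theta, spikiness, lobes)
        pts.append((r * math.cos(theta), r * math.sin(theta)))
    return pts
```

Animating `spikiness` over time yields the unfolding shape-change to which affective responses could be recorded.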
Full-text available
Background: Brain networks can be used in emotion analysis to characterize the brain state of subjects. A novel dynamic brain network in arousal is proposed to analyze brain states and emotion with electroencephalography (EEG) signals.

New Method: Time factors are integrated to construct a dynamic brain network under high- and low-arousal conditions. Transfer entropy is adopted in the dynamic brain network. To ensure the authenticity of dynamics and connections, surrogate data are used for testing and analysis. Channel norm information features are proposed to optimize the data and evaluate the brain's level of activity.

Results: The frontal, temporal, and parietal lobes provide the most information about emotion arousal. The corresponding stimulation state is not maintained at all times. The number of active brain networks under high-arousal conditions is generally higher than under low-arousal conditions, and among these active networks, more consecutive networks show high activity under high arousal. Significance analysis of the features indicates a significant difference between high and low arousal.

Comparison with Existing Methods: Compared with traditional methods, the proposed method can analyze changes in subjects' brain state over time in more detail, and the proposed features can quantify the brain network for accurate analysis.

Conclusions: The proposed dynamic brain network bridges research gaps in time resolution and arousal conditions in emotion analysis, revealing the dynamic changes of overall and local brain details under high- and low-arousal conditions. Furthermore, the active segments and brain regions of the subjects were quantified and evaluated by channel norm information. This method enables feature extraction and dynamic analysis along the arousal dimension of emotional EEG, supports further exploration of the emotional dimension model, and can play an auxiliary role in broader emotion analysis.
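Transfer entropy, which the method adopts for directed connectivity between channels, can be estimated for discretized signals by counting joint symbol frequencies. The plug-in estimator below is a generic sketch, not the paper's exact pipeline (which additionally validates connections against surrogate data):

```python
import math
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of TE(X -> Y) in bits for symbolic sequences.

    TE = sum over (y1, y0, x0) of
         p(y1, y0, x0) * log2( p(y1 | y0, x0) / p(y1 | y0) )
    where y1 = y[t+1], y0 = y[t], x0 = x[t].
    """
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y1, y0, x0) counts
    pairs_yx = Counter(zip(y[:-1], x[:-1]))          # (y0, x0) counts
    pairs_yy = Counter(zip(y[1:], y[:-1]))           # (y1, y0) counts
    singles_y = Counter(y[:-1])                      # y0 counts
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_y1_given_y0x0 = c / pairs_yx[(y0, x0)]
        p_y1_given_y0 = pairs_yy[(y1, y0)] / singles_y[y0]
        te += p_joint * math.log2(p_y1_given_y0x0 / p_y1_given_y0)
    return te
```

For a one-step coupling y[t+1] = x[t], the estimate in the driving direction substantially exceeds the reverse direction, which is the asymmetry a directed brain network exploits. Real EEG use would first require discretizing the continuous signals (e.g. by binning amplitudes).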
This research aims to develop a cyber-physical adaptive architectural space capable of real-time responses to people’s emotions, based on biological and neurological data. To achieve this goal, we integrated artificial intelligence (AI), wearable technology, sensory environments, and adaptive architecture to create an emotional bond between a space and its occupants and encourage affective emotional interactions between the two. The project’s objectives were to (1) measure and analyze biological and neurological data to detect emotions, (2) map and illustrate that emotional data, and (3) link occupants’ emotions and cognition to a built environment through a real-time emotive feedback loop. Using an interactive installation as a case study, this work examines the cognition-emotion-space interaction through changes in volume, color, and light as a means of emotional expression. It contributes to the current theory and practice of cyber-physical design and the role AI plays, as well as the interaction of technology and empathy.
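The real-time emotive feedback loop can be sketched as a mapping from a detected valence-arousal point to the installation's expressive channels (light color, intensity, and spatial volume). The specific conventions below, cool hues for negative valence, warm hues for positive, arousal driving brightness and volume, are illustrative assumptions rather than the project's actual mappings:

```python
import colorsys

def emotion_to_environment(valence: float, arousal: float) -> dict:
    """Map a circumplex point (valence, arousal, each in [-1, 1])
    to environment controls.

    Illustrative conventions: negative valence -> cool blue hues,
    positive valence -> warm hues; arousal drives light intensity
    and how far the space's volume expands or contracts.
    """
    v = max(-1.0, min(1.0, valence))
    a = max(-1.0, min(1.0, arousal))
    hue = 0.60 - 0.45 * (v + 1) / 2          # 0.60 (blue) .. 0.15 (warm)
    brightness = 0.3 + 0.7 * (a + 1) / 2     # calm = dim, excited = bright
    r, g, b = colorsys.hsv_to_rgb(hue, 0.8, brightness)
    return {
        "light_rgb": (round(r * 255), round(g * 255), round(b * 255)),
        "volume_scale": 1.0 + 0.3 * a,       # expand with arousal
    }
```

In the installation itself, the valence-arousal input would come from the biological and neurological sensing pipeline rather than being passed in directly.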
Conference Paper
Full-text available
In this paper, we explore how shape-changing interfaces might be used to communicate emotions. We present two studies: one that investigates which shapes users might create with a 2D flexible surface, and one that studies the efficacy of the resulting shapes in conveying a set of basic emotions. Results suggest that shape parameters are correlated with the positive or negative character of an emotion, while parameters related to movement are correlated with arousal level. In several cases, symbolic shape expressions based on clear visual metaphors were used. Results from our second experiment suggest participants were able to recognize emotions from a shape with good accuracy, within 28% of the dimensions of the Circumplex Model. We conclude that shape and shape changes of a 2D flexible surface indeed appear able to convey emotions in a way that is worthy of future exploration.
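Scoring recognition "within 28% of the dimensions of the Circumplex Model" suggests a distance criterion in the valence-arousal plane. The sketch below shows one way such a tolerance check could work; the emotion coordinates are assumed for illustration and are not the study's values:

```python
# Hypothetical circumplex coordinates (valence, arousal), each in [-1, 1].
CIRCUMPLEX = {
    "happy":   (0.8, 0.5),
    "excited": (0.6, 0.8),
    "calm":    (0.6, -0.6),
    "sad":     (-0.7, -0.4),
    "angry":   (-0.6, 0.8),
}

def within_tolerance(perceived: str, target: str, tolerance: float = 0.28) -> bool:
    """True if the perceived emotion lies within `tolerance` of the
    target on both dimensions, where each offset is normalized by the
    span of its axis (2.0 for a [-1, 1] axis)."""
    pv, pa = CIRCUMPLEX[perceived]
    tv, ta = CIRCUMPLEX[target]
    return (abs(pv - tv) / 2.0 <= tolerance
            and abs(pa - ta) / 2.0 <= tolerance)
```

Under this criterion, a response naming a nearby emotion still counts as a recognition, while one on the opposite side of the circumplex does not.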
Full-text available
We propose that the tendency to anthropomorphize nonhuman agents is determined primarily by three factors (Epley, Waytz, & Cacioppo, 2007), two of which we test here: sociality motivation and effectance motivation. This theory makes unique predictions about dispositional, situational, cultural, and developmental variability in anthropomorphism, and we test two predictions about dispositional and situational influences stemming from both of these motivations. In particular, we test whether those who are dispositionally lonely (sociality motivation) are more likely to anthropomorphize well-known pets (Study 1), and whether those who have a stable need for control (effectance motivation) are more likely to anthropomorphize apparently unpredictable animals (Study 2). Both studies are consistent with our predictions. We suggest that this theory of anthropomorphism can help to explain when people are likely to attribute humanlike traits to nonhuman agents, and provides insight into the inverse process of dehumanization in which people fail to attribute human characteristics to other humans.
Was love invented by European poets in the Middle Ages or is it part of human nature? Will winning the lottery really make you happy? Is it possible to build robots that have feelings? Emotion: A Very Short Introduction explores the latest thinking about the emotions, drawing upon a wide range of scientific research, from anthropology and psychology to neuroscience and artificial intelligence. Anthropologists have begun to question their previous views on the cultural relativity of emotional experience; cognitive psychologists have abandoned their exclusive focus on reasoning, perception, and memory, and are rediscovering the importance of affective processes; and neuroscientists and researchers in artificial intelligence have also joined the debate.
It is now easy to find examples of interactive software agents and animated creatures that have the ability to express emotion; this paper describes research on giving them the ability to recognize emotion. The ability to recognize a person's emotions is a key aspect of human "emotional intelligence," which has been described by a number of scientists as being more important to success in life than the traditional forms of mathematical and verbal intelligence. This paper describes research underway in emotion recognition at the MIT Media Lab, especially research involving new wearable interfaces.

1 Introduction

People often laugh or express delight at something presented by a computer, such as a funny animation, a virtual pet, or a piece of humor mail, even though computers, to date, have been unaware of these human reactions. It is perhaps even more frequent to see a person expressing frustration or irritation at a computer, especially when they feel that the system is hind...
  • Nieburg, O. Smile for candy: Hershey eyes in-store excitement with facial recognition sampler.
  • Picard, R. W. Affective computing. MIT Media Laboratory, Perceptual Computing.
  • Popova, M. Sentics: Emotional healing through music and touch.