Emotion Capture among Real Couples
in Everyday Life
George Boateng
ETH Zürich
Zürich, Switzerland
gboateng@ethz.ch

Urte Scholz
University of Zürich
Zürich, Switzerland
urte.scholz@psychologie.uzh.ch

Janina Lüscher
University of Zürich
Zürich, Switzerland
janina.luescher@psychologie.uzh.ch

Tobias Kowatsch
ETH Zürich, University of St. Gallen
Zürich, St. Gallen, Switzerland
tobias.kowatsch@unisg.ch

Paper presented at the 1st Momentary Emotion Elicitation & Capture (MEEC) workshop, co-located with the ACM CHI Conference on Human Factors in Computing Systems, Honolulu, Hawaii, USA, April 25th, 2020. This is an open-access paper distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract
Illness management among married adults is mainly shared with their spouses and involves social support. Social support among couples has been shown to affect emotional well-being positively or negatively and to result in healthier habits among diabetes patients. Hence, through automatic emotion recognition, we could assess the emotional well-being of couples, which could inform the development and triggering of interventions to help couples better manage chronic diseases. We are developing an emotion recognition system to recognize the emotions of real couples in everyday life, and in this paper, we describe our approach to collecting sensor and self-report emotion data from Swiss-based German-speaking couples in everyday life. We also discuss various aspects of the study, such as our novel approach of triggering data collection upon detecting that the partners are close and speaking, the self-reports and multimodal data, and privacy concerns with our method.
Author Keywords
Emotion; Couples; Multimodal Sensor Data; Smartwatch;
Smartphone; Wearable Computing; Mobile Computing
CCS Concepts
• Human-centered computing → Ubiquitous and mobile computing systems and tools; • Applied computing → Psychology;
Introduction
Evidence suggests that for married adults, illness manage-
ment is mainly shared with their spouses and it involves
social support [16, 12]. Social support among spouses is
associated with healthier habits among diabetes patients
[9] and has been shown to have positive or negative effects
on emotional well-being [11, 6, 4]. Hence, through automatic emotion recognition, we could assess the emotional well-being of couples, which could inform the development and triggering of interventions to help couples better manage chronic diseases. In effect, a system for automatically recognizing couples' emotions could help social psychology researchers understand various dynamics of couples' relationships and their impact on well-being.
Currently, psychologists measure emotions through various self-reports such as the PANAS [18]. These self-reports, however, are not practical for continuous emotion measurement in everyday life because completing them frequently would be obtrusive. Several works in emotion recognition use data from actors reading texts in a specific emotional tone [10] or acting out dyadic interactions like couples [5]. It is not clear whether algorithms developed on such data will work well for the naturalistic interactions of real couples.
We are developing an emotion recognition system to recognize the emotions of real couples in everyday life, and in this paper, we describe our approach to collecting sensor and self-report emotion data from Swiss-based German-speaking couples in everyday life. We then discuss various aspects of the study, such as our novel approach of triggering data collection upon detecting that the partners are close and speaking, the self-reports and multimodal data, and privacy concerns with our method.
Data Collection
We are running a field study in which we collect sensor and self-report emotion data in the context of chronic disease management among couples. Specifically, we collect data for seven days from German-speaking couples in Switzerland in which one partner has type 2 diabetes [7]. We have collected data from eight couples so far.
Each partner is given a smartwatch and smartphone run-
ning the DyMand system, a novel open-source mobile and
wearable system that we developed for ambulatory assess-
ment of couples’ chronic disease management [2]. The
DyMand system triggers the collection of sensor and self-
report data for 5 minutes each hour during the hours that
subjects pick. We collect the following sensor data from
the smartwatch: audio, heart rate, accelerometer, gyro-
scope, Bluetooth low energy (BLE) signal strength between
the watches, and ambient light. After the sensor data collection, a self-report is triggered on the smartphone asking about emotions over the last 5 minutes using the Affective Slider [1], which assesses the valence and arousal dimensions of emotion. We also record a 3-second video of each partner's facial expression while they complete the self-report on the smartphone.
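To make the structure of one trigger cycle concrete, the captured data can be sketched as a pair of records; the field names below are illustrative and not the DyMand system's actual schema:

```python
from dataclasses import dataclass


@dataclass
class SensorRecording:
    """One 5-minute multimodal recording from a partner's smartwatch.

    Field names are illustrative; the actual DyMand data schema may differ.
    """
    partner_id: str
    start_timestamp: float                            # Unix time, seconds
    audio_path: str                                   # 5-minute audio clip
    heart_rate_bpm: list[float]                       # sampled heart rate
    accelerometer: list[tuple[float, float, float]]   # x, y, z samples
    gyroscope: list[tuple[float, float, float]]       # x, y, z samples
    ble_rssi_dbm: list[float]                         # signal strength between watches
    ambient_light_lux: list[float]


@dataclass
class SelfReport:
    """Affective Slider self-report completed on the smartphone."""
    partner_id: str
    timestamp: float
    valence: float          # 0.0 (negative) .. 1.0 (positive)
    arousal: float          # 0.0 (calm) .. 1.0 (excited)
    face_video_path: str    # 3-second facial-expression clip
```

Pairing each sensor recording with the self-report that immediately follows it yields one labeled training example for the emotion recognition system.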
We trigger sensor data collection when the partners are close and speaking, which we determine in two steps. First, we determine closeness using the BLE signal strength between the smartwatches: we check whether the signal strength is within a threshold that corresponds to a distance estimate [2]. Then, we determine whether the partners are speaking using a voice activity detection (VAD) machine learning model that classifies speech versus non-speech, which we developed and implemented to run in real time on the smartwatch [3].
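As a rough illustration of this two-step trigger, the decision logic can be sketched as follows; the RSSI threshold here is a hypothetical placeholder, since the actual distance calibration is described in [2]:

```python
# Hypothetical threshold: an RSSI above roughly -70 dBm often corresponds
# to BLE devices being within a few metres of each other. The value used
# in the actual study is calibrated separately.
RSSI_THRESHOLD_DBM = -70.0


def partners_close(rssi_dbm: float, threshold: float = RSSI_THRESHOLD_DBM) -> bool:
    """Closeness proxy: BLE signal strength above a distance-calibrated threshold."""
    return rssi_dbm >= threshold


def should_trigger_recording(rssi_dbm: float, speech_detected: bool) -> bool:
    """Trigger sensor data collection only when the partners are both
    close (per BLE signal strength) and speaking (per the VAD model)."""
    return partners_close(rssi_dbm) and speech_detected
```

In the deployed system, the VAD model supplies `speech_detected` in real time on the smartwatch, so this check can run continuously without streaming audio off the device.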
Discussion
Our hypothesis is that we are most likely to collect high-quality sensor and self-report emotion data during times when the partners are interacting. Hence, rather than triggering data collection at random times in the hour, which is the standard approach [8, 14], we use a novel method: triggering data collection after we detect that the partners are close and speaking. If these conditions are not met in the hour, we make a backup recording by triggering data collection in the last 15 minutes of the hour. This approach has the potential to capture many conversation moments, which would provide ample data for developing the emotion recognition system. Other researchers can use our DyMand system for their data collection, as the code is open source [2]. Additionally, the methods we use could help other researchers optimize the collection of sensor and self-report data among couples or other dyads in daily life.
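The triggering policy with its backup recording can be sketched as a per-minute decision function; this is a simplified illustration of the policy described above, not the DyMand implementation:

```python
def recording_action(minute_of_hour: int,
                     triggered_this_hour: bool,
                     close_and_speaking: bool) -> str:
    """Decide what to do at a given minute within the hour.

    Returns one of 'record', 'backup_record', or 'wait'. Simplified sketch:
    prefer recording during partner interaction, and fall back to a backup
    recording in the last 15 minutes if no interaction was detected.
    """
    if triggered_this_hour:
        return "wait"               # at most one recording per hour
    if close_and_speaking:
        return "record"             # preferred: during interaction
    if minute_of_hour >= 45:
        return "backup_record"      # fallback in the last 15 minutes
    return "wait"
```

The backup branch guarantees one recording per waking hour even on hours with no detected conversation, which keeps the sampling rate comparable to random-trigger designs.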
We use the Affective Slider rather than other self-reports such as the PANAS because it can be completed easily and quickly. Additionally, the valence and arousal dimensions, based on Russell's circumplex model of emotions [15], can be used to place various discrete emotions. Currently, we collect self-report emotion data on the smartphones given to the couples. The Affective Slider could also be implemented on the smartwatch to ease the burden of completing the self-report and make the process quicker.
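As an illustration of how the two slider values can place emotions, a minimal mapping of normalized valence and arousal onto the four quadrants of Russell's circumplex might look as follows; the quadrant labels are illustrative summaries, not clinical categories:

```python
def circumplex_quadrant(valence: float, arousal: float) -> str:
    """Map normalized valence/arousal (each in 0..1, as reported via the
    Affective Slider) to a quadrant of Russell's circumplex model.
    Labels are illustrative examples of emotions in each quadrant.
    """
    high_valence = valence >= 0.5
    high_arousal = arousal >= 0.5
    if high_valence and high_arousal:
        return "excited/happy"      # positive valence, high arousal
    if high_valence:
        return "calm/content"       # positive valence, low arousal
    if high_arousal:
        return "angry/stressed"     # negative valence, high arousal
    return "sad/depressed"          # negative valence, low arousal
```

Such a coarse quadrant label can serve as a classification target when the continuous valence and arousal values are too noisy to regress directly.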
We collect multimodal sensor data using a smartwatch because previous works have shown that multimodal approaches to emotion recognition perform better than unimodal approaches [10]. Additionally, in an everyday life context, certain data modalities might not be available; hence, emotion recognition systems that perform well with subsets of these modalities need to be developed. Also, in the future, other sensor data about behavioral patterns, such as phone unlock frequency, frequency of phone calls, and messages sent, could be collected to aid the recognition task [17].
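One simple way a recognition system can cope with missing modalities is late fusion over whichever modality-specific predictions are available; the sketch below averages class probabilities and is a minimal illustration, not our system's method:

```python
from typing import Mapping, Optional, Sequence


def late_fusion(predictions: Mapping[str, Optional[Sequence[float]]]) -> list[float]:
    """Average class probabilities over the modalities that produced a
    prediction, so recognition degrades gracefully when some sensor
    streams are missing. Real systems often use learned, weighted
    fusion instead of a plain average.
    """
    available = [p for p in predictions.values() if p is not None]
    if not available:
        raise ValueError("no modality produced a prediction")
    n_classes = len(available[0])
    return [sum(p[i] for p in available) / len(available)
            for i in range(n_classes)]
```

For example, if the audio stream is unavailable for an hour, the fusion simply averages over the remaining physiological and motion modalities rather than failing.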
There are significant privacy concerns and ethical implications because sensitive data such as audio are collected frequently. We address these concerns in several ways. First, our study protocol was reviewed and approved by the ethics committee of the canton of Zurich. Also, we collect at most 5 minutes of audio data per hour so as not to record a significant fraction of the couples' everyday life. Additionally, to protect the privacy of people not taking part in the study, we ask subjects to wear a tag we give them, indicating to others around them that they may be recorded. Finally, when the couples return the devices after the study, we give them the option to listen to the recorded audio and to request the deletion of any recording, without having to give an explanation. This approach has been used in other studies [13, 14].
Conclusion
In this work, we described our approach to collecting sen-
sor and self-report emotion data from Swiss-based German-
speaking couples in everyday life. We discussed various
aspects of the study. First, we discussed the use of a novel
approach of triggering data collection based on detecting
that the partners are close and speaking rather than just
randomly in the hour. Next, we discussed using a smartphone-
based Affective Slider self-report because it is quick to com-
plete. Then, we discussed collecting multimodal sensor
data with a smartwatch because it could produce more ac-
curate emotion recognition models. Finally, we discussed
our approach to addressing privacy concerns such as giv-
ing subjects the option to request the deletion of any of their
audio upon returning the devices.
Acknowledgements
We are grateful to Prabakaran Santhanam and Dominik
Rügger for helping with the development of the mobile
software tools that we are using in running the study. This
work is funded by the Swiss National Science Foundation
(CR12I1_166348).
REFERENCES
[1] Alberto Betella and Paul FMJ Verschure. 2016. The
affective slider: A digital self-assessment scale for the
measurement of human emotions. PloS one 11, 2
(2016), e0148037.
[2] George Boateng, Prabhakaran Santhanam, Janina
Lüscher, Urte Scholz, and Tobias Kowatsch. 2019a.
Poster: DyMand – An Open-Source Mobile and
Wearable System for Assessing Couples’ Dyadic
Management of Chronic Diseases. In The 25th Annual
International Conference on Mobile Computing and
Networking. 13.
[3] George Boateng, Prabhakaran Santhanam, Janina
Lüscher, Urte Scholz, and Tobias Kowatsch. 2019b.
VADLite: an open-source lightweight system for
real-time voice activity detection on smartwatches. In
Adjunct Proceedings of the 2019 ACM International
Joint Conference on Pervasive and Ubiquitous
Computing and Proceedings of the 2019 ACM
International Symposium on Wearable Computers.
902–906.
[4] Niall Bolger and David Amarel. 2007. Effects of social
support visibility on adjustment to stress: Experimental
evidence. Journal of personality and social psychology
92, 3 (2007), 458.
[5] Carlos Busso, Murtaza Bulut, Chi-Chun Lee, Abe
Kazemzadeh, Emily Mower, Samuel Kim, Jeannette N
Chang, Sungbok Lee, and Shrikanth S Narayanan.
2008. IEMOCAP: Interactive emotional dyadic motion
capture database. Language resources and evaluation
42, 4 (2008), 335.
[6] Masumi Iida, Mary Ann Parris Stephens, Karen S
Rook, Melissa M Franks, and James K Salem. 2010.
When the going gets tough, does support get going?
Determinants of spousal support provision to type 2
diabetic patients. Personality and Social Psychology
Bulletin 36, 6 (2010), 780–791.
[7] Janina Lüscher, Tobias Kowatsch, George Boateng,
Prabhakaran Santhanam, Guy Bodenmann, and Urte
Scholz. 2019. Social Support and Common Dyadic
Coping in Couples’ Dyadic Management of Type II
Diabetes: Protocol for an Ambulatory Assessment
Application. JMIR research protocols 8, 10 (2019),
e13685.
[8] Matthias R Mehl, Megan L Robbins, and Fenne große
Deters. 2012. Naturalistic observation of
health-relevant social processes: The Electronically
Activated Recorder (EAR) methodology in
psychosomatics. Psychosomatic Medicine 74, 4
(2012), 410.
[9] Daisy Miller and J Lynne Brown. 2005. Marital
interactions in the process of dietary change for type 2
diabetes. Journal of Nutrition Education and Behavior
37, 5 (2005), 226–234.
[10] Soujanya Poria, Erik Cambria, Rajiv Bajpai, and Amir
Hussain. 2017. A review of affective computing: From
unimodal analysis to multimodal fusion. Information
Fusion 37 (2017), 98–125.
[11] Gabriele Prati and Luca Pietrantoni. 2010. The relation
of perceived and received social support to mental
health among first responders: a meta-analytic review.
Journal of Community Psychology 38, 3 (2010), 403–417.
[12] Tuula-Maria Rintala, Pia Jaatinen, Eija Paavilainen,
and Päivi Åstedt-Kurki. 2013. Interrelation between
adult persons with diabetes and their family: a
systematic review of the literature. Journal of family
nursing 19, 1 (2013), 3–28.
[13] Megan L Robbins, Elizabeth S Focella, Shelley Kasle,
Ana María López, Karen L Weihs, and Matthias R
Mehl. 2011. Naturalistically observed swearing,
emotional support, and depressive symptoms in
women coping with illness. Health Psychology 30, 6
(2011), 789.
[14] Megan L Robbins, Ana María López, Karen L Weihs,
and Matthias R Mehl. 2014. Cancer conversations in
context: naturalistic observation of couples coping with
breast cancer. Journal of Family Psychology 28, 3
(2014), 380.
[15] James A Russell. 1980. A circumplex model of affect.
Journal of personality and social psychology 39, 6
(1980), 1161.
[16] Amber J Seidel, Melissa M Franks, Mary Ann Parris
Stephens, and Karen S Rook. 2012. Spouse control
and type 2 diabetes management: moderating effects
of dyadic expectations for spouse involvement. Family
relations 61, 4 (2012), 698–709.
[17] Mirjam Stieger, Marcia Nißen, Dominik Rüegger,
Tobias Kowatsch, Christoph Flückiger, and Mathias
Allemand. 2018. PEACH, a smartphone-and
conversational agent-based coaching intervention for
intentional personality change: study protocol of a
randomized, wait-list controlled trial. BMC psychology
6, 1 (2018), 43.
[18] David Watson, Lee Anna Clark, and Auke Tellegen.
1988. Development and validation of brief measures of
positive and negative affect: the PANAS scales.
Journal of personality and social psychology 54, 6
(1988), 1063.