Article

Emotional Sensitivity in Human-Computer Interaction (Emotionale Sensitivität in der Mensch-Maschine Interaktion).

it - Information Technology 01/2009; 51:325-328. DOI: 10.1524/itit.2009.0557
Source: DBLP
  • ABSTRACT: Affect sensing by machines has been argued to be an essential part of next-generation human-computer interaction (HCI). To this end, a large number of studies have been conducted in recent years, which report automatic recognition of emotion as a difficult but feasible task. However, most effort has been put towards offline analysis, whereas to date only a few applications exist that are able to react to a user's emotion in real time. In response to this deficit we introduce a framework we call Smart Sensor Integration (SSI), which considerably jump-starts the development of multimodal online emotion recognition (OER) systems. In particular, SSI supports the pattern recognition pipeline by offering tailored tools for data segmentation, feature extraction, and pattern recognition, as well as tools to apply them offline (training phase) and online (real-time recognition). Furthermore, it has been designed to handle input from various modalities and to suit the fusion of multimodal information.
    3rd International Conference on Affective Computing and Intelligent Interaction and Workshops (ACII 2009), 10/2009. (A schematic sketch of such an online recognition pipeline is given after this list.)
  • ABSTRACT: Little attention has been paid so far to physiological signals for emotion recognition compared to audiovisual emotion channels such as facial expression or speech. This paper investigates the potential of physiological signals as reliable channels for emotion recognition. All essential stages of an automatic recognition system are discussed, from the recording of a physiological dataset to a feature-based multiclass classification. In order to collect a physiological dataset from multiple subjects over many weeks, we used a musical induction method which spontaneously leads subjects to real emotional states, without any deliberate lab setting. Four-channel biosensors were used to measure electromyogram, electrocardiogram, skin conductivity and respiration changes. A wide range of physiological features from various analysis domains, including time/frequency, entropy, geometric analysis, subband spectra, multiscale entropy, etc., is proposed in order to find the best emotion-relevant features and to correlate them with emotional states. The best features extracted are specified in detail and their effectiveness is proven by classification results. Classification of four musical emotions (positive/high arousal, negative/high arousal, negative/low arousal, positive/low arousal) is performed by using an extended linear discriminant analysis (pLDA). Furthermore, by exploiting a dichotomic property of the 2D emotion model, we develop a novel scheme of emotion-specific multilevel dichotomous classification (EMDC) and compare its performance with direct multiclass classification using the pLDA. Improved recognition accuracy of 95% and 70% for subject-dependent and subject-independent classification, respectively, is achieved by using the EMDC scheme.
    IEEE Transactions on Pattern Analysis and Machine Intelligence, 01/2009; 30(12):2067-2083. (A sketch of the two-stage dichotomous classification idea is given after this list.)
  • ABSTRACT: For on-line classification of user states such as emotions or stress levels, we present a new, generic, and efficient physiological feature set. In contrast to common approaches using features specifically tailored to each physiological signal, we break up feature extraction into a simple, signal-specific pre-processing step and the calculation of a comprehensive set of signal-independent features. This systematizes feature design for each physiological signal and facilitates the transfer to other signals. The time complexity of the approach is independent of the size of the analysis window and of the frequency with which feature vectors are computed for classification. We also provide a variant of the feature set that has low memory requirements. Thus, our approach is well suited for implementing real-time applications. We evaluate the proposed features with an emotion and a stress classification task, showing that they are competitive with respect to the performance of classifications using signal-tuned state-of-the-art features. (A sketch of one way to obtain window-size-independent feature updates is given below.)
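
The first abstract above describes a pattern recognition pipeline with segmentation, feature extraction and classification stages that is trained offline and then applied online to several input modalities. The following is a minimal sketch of that general structure; it does not use the actual SSI API, and the class names (PipelineStage, OnlineEmotionRecognizer) as well as the choice of an SVM classifier are illustrative assumptions.

# Minimal sketch of a multimodal online recognition pipeline: per-modality
# segmentation and feature extraction, feature-level fusion, and a shared
# classifier trained offline and queried online. Not the SSI API; all names
# here are hypothetical.
from dataclasses import dataclass
from typing import Callable, List, Sequence

import numpy as np
from sklearn.svm import SVC


@dataclass
class PipelineStage:
    """Segmentation and feature extraction for one input modality."""
    segment: Callable[[np.ndarray], List[np.ndarray]]  # raw stream -> list of windows
    extract: Callable[[np.ndarray], np.ndarray]         # one window -> feature vector


class OnlineEmotionRecognizer:
    """Fuses per-modality features and classifies them with one model."""

    def __init__(self, stages: Sequence[PipelineStage]):
        self.stages = stages
        self.clf = SVC()  # any classifier with fit/predict would do

    def _features(self, streams: Sequence[np.ndarray]) -> np.ndarray:
        # Take the most recent window of each modality and concatenate features.
        parts = [s.extract(s.segment(x)[-1]) for s, x in zip(self.stages, streams)]
        return np.concatenate(parts)

    def train(self, recordings: Sequence[Sequence[np.ndarray]], labels: Sequence[str]) -> None:
        # Offline phase: build the feature matrix from labelled recordings.
        X = np.vstack([self._features(streams) for streams in recordings])
        self.clf.fit(X, list(labels))

    def recognize(self, streams: Sequence[np.ndarray]) -> str:
        # Online phase: classify the latest window of the incoming streams.
        return self.clf.predict(self._features(streams).reshape(1, -1))[0]

In a real-time setting, recognize() would be called repeatedly on sliding buffers of the incoming sensor streams, while train() runs once on labelled recordings during the offline phase.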
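
The second abstract describes EMDC, which exploits the arousal/valence dichotomies of the 2D emotion model: high versus low arousal is decided first, and a valence classifier specific to that arousal half is applied second, instead of a single direct four-class model. The sketch below illustrates only this two-stage idea; plain LDA from scikit-learn stands in for the paper's extended pLDA, and the emotion-specific feature selection of the actual scheme is omitted.

# Sketch of two-stage dichotomous classification over the 2D emotion model:
# stage 1 decides high vs. low arousal, stage 2 applies an arousal-specific
# positive/negative valence classifier. Plain LDA replaces the paper's pLDA.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis as LDA


class DichotomousEmotionClassifier:
    def __init__(self):
        self.arousal_clf = LDA()  # high (1) vs. low (0) arousal
        self.valence_hi = LDA()   # positive (1) vs. negative (0), high-arousal samples
        self.valence_lo = LDA()   # positive (1) vs. negative (0), low-arousal samples

    def fit(self, X, arousal, valence):
        X, arousal, valence = map(np.asarray, (X, arousal, valence))
        self.arousal_clf.fit(X, arousal)
        hi = arousal == 1
        self.valence_hi.fit(X[hi], valence[hi])
        self.valence_lo.fit(X[~hi], valence[~hi])
        return self

    def predict(self, X):
        X = np.asarray(X)
        a = self.arousal_clf.predict(X)
        v = np.where(a == 1, self.valence_hi.predict(X), self.valence_lo.predict(X))
        return np.stack([a, v], axis=1)  # one (arousal, valence) pair per sample

The intuition is that each stage only has to solve a two-class problem, which the abstract reports to yield higher accuracy than direct four-class classification.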
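
The third abstract stresses that the cost of computing the feature vector is independent of the analysis-window size. One common way to obtain such behaviour is to update running sums incrementally, so that simple statistics cost O(1) per incoming sample; the sketch below shows only this idea and is not the paper's actual signal-independent feature set.

# Sketch of O(1)-per-sample updates for simple statistical features over a
# fixed-length sliding window, via running sums. Illustrative only; the
# paper's concrete feature set and pre-processing are not reproduced here.
from collections import deque
import math


class RunningWindowFeatures:
    """Maintains mean and standard deviation over the last `window_len` samples."""

    def __init__(self, window_len: int):
        self.window = deque(maxlen=window_len)
        self.total = 0.0
        self.total_sq = 0.0

    def push(self, x: float) -> None:
        # Remove the contribution of the sample about to fall out of the window.
        if len(self.window) == self.window.maxlen:
            old = self.window[0]
            self.total -= old
            self.total_sq -= old * old
        self.window.append(x)  # deque with maxlen evicts the oldest element itself
        self.total += x
        self.total_sq += x * x

    def features(self) -> dict:
        n = len(self.window)
        if n == 0:
            return {}
        mean = self.total / n
        var = max(self.total_sq / n - mean * mean, 0.0)  # guard against rounding error
        return {"mean": mean, "std": math.sqrt(var)}

Features such as minimum or maximum would need an additional structure (for example a monotonic deque) to keep the per-sample cost constant, and the low-memory variant mentioned in the abstract would presumably replace the explicit window with summaries such as exponentially decaying statistics.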