Publications (6) · 0 Total impact

  • Affective Computing and Intelligent Interaction - 4th International Conference, ACII 2011, Memphis, TN, USA, October 9-12, 2011, Proceedings, Part I; 01/2011
  • Affective Computing and Intelligent Interaction - Fourth International Conference, ACII 2011, Memphis, TN, USA, October 9-12, 2011, Proceedings, Part II; 01/2011
  • ABSTRACT: Participatory user interface design with adolescent users on the autism spectrum presents a number of unique challenges and opportunities. Through our work developing a system to help autistic adolescents learn to recognize facial expressions, we have learned valuable lessons about software and hardware design issues for this population. These lessons may also be helpful in assimilating iterative user input to customize technology for other populations with special needs.
    Proceedings of the 27th International Conference on Human Factors in Computing Systems, CHI 2009, Extended Abstracts Volume, Boston, MA, USA, April 4-9, 2009; 01/2009
  • Mohammed E. Hoque, Rana El Kaliouby, Rosalind W. Picard
    ABSTRACT: This paper describes the challenges of getting ground truth affective labels for spontaneous video, and presents implications for systems such as virtual agents that have automated facial analysis capabilities. We first present a dataset from an intelligent tutoring application and describe the most prevalent approach to labeling such data. We then present an alternative labeling approach, which closely models how the majority of automated facial analysis systems are designed. We show that while participants, peers and trained judges report high inter-rater agreement on expressions of delight, confusion, flow, frustration, boredom, surprise, and neutral when shown the entire 30 minutes of video for each participant, inter-rater agreement drops below chance when human coders are asked to watch and label short 8-second clips for the same set of labels. We also perform discriminative analysis for facial action units for each affective state represented in the clips. The results emphasize that human coders heavily rely on factors such as familiarity of the person and context of the interaction to correctly infer a person's affective state; without this information, the reliability of humans as well as machines attributing affective labels to spontaneous facial-head movements drops significantly.
    Intelligent Virtual Agents, 9th International Conference, IVA 2009, Amsterdam, The Netherlands, September 14-16, 2009, Proceedings; 01/2009
  • ABSTRACT: Individuals on the autism spectrum often have difficulties producing intelligible speech with either high or low speech rate, and atypical pitch and/or amplitude affect. In this study, we present a novel intervention towards customizing speech enabled games to help them produce intelligible speech. In this approach, we clinically and computationally identify the areas of speech production difficulties of our participants. We provide an interactive and customized interface for the participants to meaningfully manipulate the prosodic aspects of their speech. Over the course of 12 months, we have conducted several pilots to set up the experimental design, developed a suite of games and audio processing algorithms for prosodic analysis of speech. Preliminary results demonstrate our intervention being engaging and effective for our participants.
    INTERSPEECH 2009, 10th Annual Conference of the International Speech Communication Association, Brighton, United Kingdom, September 6-10, 2009; 01/2009
  • ABSTRACT: Social communication in autism is significantly hindered by difficulties processing affective cues in realtime face-to-face interaction. The interactive Social-Emotional Toolkit (iSET) allows its users to record and annotate video with emotion labels in real time, then review and edit the labels later to bolster understanding of affective information present in interpersonal interactions. The iSET demo will let the ACII audience experience the augmentation of interpersonal interactions by using the iSET system.
    01/2009