Conference Paper

Sensors Model Student Self Concept in the Classroom

DOI: 10.1007/978-3-642-02247-0_6 Conference: User Modeling, Adaptation, and Personalization, 17th International Conference, UMAP 2009, formerly UM and AH, Trento, Italy, June 22-26, 2009. Proceedings
Source: DBLP

ABSTRACT In this paper we explore findings from three experiments that use minimally invasive sensors with a web-based geometry tutor to create a user model. Minimally invasive sensor technology is mature enough to equip classrooms of up to 25 students with four sensors each at the same time while they use a computer-based intelligent tutoring system. The sensors, which are on each student's chair, mouse, monitor, and wrist, provide data about posture, movement, grip tension, arousal, and facially expressed mental states. This data may provide adaptive feedback to an intelligent tutoring system based on an individual student's affective states. The experiments show that when sensor data supplements a user model based on tutor logs, the model reflects a larger percentage of the students' self-concept than a user model based on the tutor logs alone. The models are further expanded to classify four ranges of emotional self-concept, including frustration, interest, confidence, and excitement, with over 78% accuracy. The emotional predictions are a first step toward intelligent tutoring systems that create sensor-based personalized feedback for each student in a classroom environment. Bringing sensors to our children's schools addresses real problems of students' relationship to mathematics as they are learning the subject.
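The core idea of the abstract — supplementing tutor-log features with sensor features before classification — can be sketched as follows. The feature names, normalization, and the toy linear scorer are purely illustrative assumptions; the paper's actual trained models and weights are not reproduced here.

```python
# Hypothetical fusion of tutor-log and sensor features for classifying a
# student's self-reported emotion (e.g. confidence). Feature names and
# weights are illustrative assumptions, not the paper's model.

def fuse_features(tutor_log, sensors):
    """Concatenate tutor-log and sensor feature vectors into one model input."""
    return tutor_log + sensors

def classify_confidence(features, threshold=0.5):
    """Toy linear scorer standing in for a trained classifier."""
    weights = [0.2, 0.1, 0.3, 0.25, 0.15]  # illustrative only
    score = sum(w * f for w, f in zip(weights, features))
    return "high" if score >= threshold else "low"

tutor_log = [0.8, 0.6]     # e.g. normalized hint use, time per problem
sensors = [0.4, 0.7, 0.5]  # e.g. normalized posture, grip tension, arousal
label = classify_confidence(fuse_features(tutor_log, sensors))
```

The point of the sketch is structural: the same classifier can be run with `sensors = []` (tutor logs alone) or with the fused vector, which is how the paper's comparison between log-only and sensor-supplemented models is framed.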

  • ABSTRACT: One of the primary goals of affective computing is enabling computers to recognize human emotion. To do this we need accurately labeled affective data. This is challenging to obtain in real situations where affective events are not scripted and occur simultaneously with other activities and feelings. Affective labels also rely heavily on subject self-report, which can be problematic. This paper reports on methods for obtaining high-quality emotion labels with reduced bias and variance, and also shows that better training sets for machine learning algorithms can be created by combining multiple sources of evidence. During a 7-day, 13-participant field study we found that recognition accuracy for physiological activation improved from 63% to 79% with two sources of evidence; in an additional pilot study this improved to 100% accuracy for one subject over 10 days when context evidence was also included.
    Affective Computing and Intelligent Interaction - 4th International Conference, ACII 2011, Memphis, TN, USA, October 9-12, 2011, Proceedings, Part I; 01/2011
  • ABSTRACT: If computers are to interact naturally with humans, they must express social competencies and recognize human emotion. This talk describes the role of technology in responding to both affect and cognition and examines research to identify student emotions (frustration, boredom, and interest) with around 80% accuracy using hardware sensors and student self-reports. We also discuss “caring” computers that use animated learning companions to talk about the malleability of intelligence and the importance of effort and perseverance. Gender differences were noted in the impact of these companions on student affect, as were differences for students with learning disabilities. In both cases, students who used companions showed improved math attitudes, increased motivation, and reduced frustration and anxiety over the long term. We also describe social tutors that scaffold collaborative problem solving in ill-defined domains. These tutors use deep domain understanding of students' dialogue to recognize (with over 85% accuracy) students who are engaged in useful learning activities. Finally, we describe tutors that help online participants engaged in situations involving differing opinions, e.g., in online dispute mediation, bargaining, and civic deliberation processes.
    Proceedings of the 10th international conference on Intelligent Tutoring Systems - Volume Part I; 06/2010
  • ABSTRACT: Touch is a new and significantly different method of interacting with a computer, and it is being adopted at a rapidly increasing rate with the introduction of the tablet computer. We log the characteristics of a student's touch interaction while solving math problems on a tablet. By correlating this data with high- and low-effort problem-solving conditions, we demonstrate the ability to predict student effort level. The technique is context-free and thus can potentially be applied to any computer tablet application.
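The context-free approach described in the last abstract — summarizing raw touch events into features and mapping them to an effort level — can be sketched as below. The feature set (duration, path length, movement rate) and the threshold rule are assumptions for illustration; they are not the paper's actual features or model.

```python
# Illustrative sketch: reduce a stream of touch events to simple features,
# then apply a toy rule to predict effort. Feature names and thresholds
# are assumptions, not the published model.

def touch_features(events):
    """events: list of (timestamp_seconds, x, y) touch samples for one problem."""
    if len(events) < 2:
        return {"duration": 0.0, "path_length": 0.0, "rate": 0.0}
    duration = events[-1][0] - events[0][0]
    path = sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
               for (_, x1, y1), (_, x2, y2) in zip(events, events[1:]))
    return {"duration": duration, "path_length": path,
            "rate": path / duration if duration else 0.0}

def predict_effort(feats, duration_cutoff=30.0, path_cutoff=500.0):
    """Toy rule: long interactions with much movement -> high effort."""
    high = feats["duration"] > duration_cutoff and feats["path_length"] > path_cutoff
    return "high" if high else "low"

events = [(0.0, 0, 0), (10.0, 300, 400), (40.0, 600, 800)]
feats = touch_features(events)
```

Because the features depend only on the touch stream, not on the math content of the problem, the same pipeline could in principle be attached to any tablet application, which is the sense in which the technique is context-free.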


David G. Cooper