Conference Paper

Sensors Model Student Self Concept in the Classroom

DOI: 10.1007/978-3-642-02247-0_6 Conference: User Modeling, Adaptation, and Personalization, 17th International Conference, UMAP 2009, formerly UM and AH, Trento, Italy, June 22-26, 2009. Proceedings
Source: DBLP

ABSTRACT

In this paper we explore findings from three experiments that use minimally invasive sensors with a web-based geometry tutor to create a user model. Minimally invasive sensor technology is mature enough to equip classrooms of up to 25 students with four sensors each at the same time while using a computer-based intelligent tutoring system. The sensors, which are on each student's chair, mouse, monitor, and wrist, provide data about posture, movement, grip tension, arousal, and facially expressed mental states. This data may be used to provide adaptive feedback in an intelligent tutoring system based on an individual student's affective states. The experiments show that when sensor data supplements a user model based on tutor logs, the model reflects a larger percentage of the students' self-concept than a user model based on the tutor logs alone. The models are further expanded to classify four ranges of emotional self-concept, including frustration, interest, confidence, and excitement, with over 78% accuracy. The emotional predictions are a first step for intelligent tutoring systems to create sensor-based personalized feedback for each student in a classroom environment. Bringing sensors to our children's schools addresses real problems of students' relationship to mathematics as they are learning the subject.
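As a rough illustration of the abstract's central claim, the sketch below contrasts a model built from tutor-log features alone with one that also receives sensor features, using synthetic data. The feature names, the logistic-regression classifier, and the data are assumptions for illustration only, not the authors' actual pipeline.

# Minimal sketch (synthetic data; feature names and classifier are illustrative,
# not the paper's pipeline): compare a tutor-log-only model with a model that
# also uses sensor features when predicting a binned emotional self-report.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_students = 100

# Hypothetical tutor-log features: hints requested, time per problem, errors.
log_features = rng.normal(size=(n_students, 3))
# Hypothetical sensor features: seat-posture variance, grip tension,
# skin conductance (arousal), probability of a facially expressed mental state.
sensor_features = rng.normal(size=(n_students, 4))
# Binned self-report label (e.g., "high" vs. "low" frustration) from the survey.
labels = rng.integers(0, 2, size=n_students)

log_only = log_features
log_plus_sensors = np.hstack([log_features, sensor_features])

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
for name, X in [("tutor logs only", log_only),
                ("tutor logs + sensors", log_plus_sensors)]:
    acc = cross_val_score(model, X, labels, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.2f}")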

  • Source
    • "Detection techniques have overlooked posture as a serious contender to facial expressions [7] and acoustic characteristics. It is surprising how the advantages of using posture for diagnosing the affective states of a user have been overlooked, given that human bodies are relatively large and are capable of multiple degrees of movement and thus can assume a wealth of unique configurations, which can be combined simultaneously and temporarily, potentially making posture an ideal channel for communication [8]. "
    ABSTRACT: The aim of this work is the development of a network of wireless devices to determine, along with a time-stamp, postural changes of users that are to be used in personalized learning environments. For this purpose, we have designed a basic low-cost pressure sensor that can be built from easily available components. Several of these basic sensors (of sizes and shapes chosen specifically for the task) are integrated into a posture sensor cushion, which is electronically controlled by an Arduino microcontroller board. This supports experiments involving either a single cushion used in an individual end-user setting or classroom approaches in which several of these cushions make up a sensor network via ZigBee wireless connections. The system thus formed is an excellent alternative to other, more expensive commercial systems and provides a low-cost, easy-to-use, portable, scalable, autonomous, flexible solution with free hardware and software, which can be integrated with other sensing devices into a larger affect detection system, customizable to cope with postural changes at required time intervals, and able to support both single-user and collective experimentation approaches.
    Full-text · Article · Aug 2015 · International Journal of Distributed Sensor Networks
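As a rough sketch of how the cushion described in the citation above could be used on the host side, the following Python function timestamps postural changes from a stream of pressure readings. The frame format, the change threshold, and the simulated data are assumptions; the actual system streams readings from the Arduino board over ZigBee.

# Minimal sketch (assumed frame format and threshold): log a timestamp whenever
# the pressure pattern on the cushion shifts enough to count as a posture change.
import time
from typing import Iterable, List, Sequence, Tuple

def detect_posture_changes(
    frames: Iterable[Tuple[float, Sequence[float]]],
    threshold: float = 0.3,
) -> List[Tuple[float, List[float]]]:
    """Return (timestamp, reading) pairs where the pressure pattern shifted.

    `frames` yields (unix_timestamp, per-sensor pressure values in [0, 1]).
    A change is logged when any sensor moves by more than `threshold`
    relative to the last logged posture.
    """
    changes = []
    last = None
    for ts, reading in frames:
        if last is None or max(abs(a - b) for a, b in zip(reading, last)) > threshold:
            changes.append((ts, list(reading)))
            last = list(reading)
    return changes

# Simulated frames: the sitter leans forward at the third sample.
now = time.time()
frames = [
    (now + 0, [0.50, 0.50, 0.50, 0.50]),
    (now + 1, [0.52, 0.49, 0.50, 0.51]),   # minor fidgeting, ignored
    (now + 2, [0.90, 0.20, 0.85, 0.25]),   # leaning forward, logged
]
for ts, reading in detect_posture_changes(frames):
    print(ts, reading)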
  • Source
    • "Predicting the students' affective states, that is, attempting to determine these states while students interact with the system, is a challenging problem in education research, and is the focus of several current research efforts [6], [7]. Methods that have been implemented in ITS to predict the affective state include human observation [5], [8], [9], learners' self-reported data of their affective state [10], [11], mining the system's log file [12], [13], modeling affective states [11], [14], facebased emotion recognition systems [4], [3], analyzing the data from physical sensors [15], [16], [10], and more recently, sensing devices such as physiological sensors [17], [18]. Advances in these methods look promising in a lab setting. "
    ABSTRACT: The importance of affect in learning has led many intelligent tutoring systems (ITS) to include learners' affective states in their student models. The approaches used to identify affective states include human observation, self-reporting, data from physical sensors, modeling affective states, and mining students' data in log files. Among these, data mining and modeling affective states offer the most feasible approach in real-world settings, which may involve a huge number of students. Systems using data mining approaches to predict frustration have reported high accuracy, while systems that predict frustration by modeling affective states predict not only a student's affective state but also the reason for that state. In our approach, we combine these two approaches. We begin with the theoretical definition of frustration and operationalize it as a linear regression model by selecting and appropriately combining features from log file data. We illustrate our approach by modeling learners' frustration in Mindspark, a mathematics ITS with large-scale deployment. We validate our model by independent human observation. Our approach shows results comparable to existing data mining approaches as well as a clear interpretation of the reasons for the learners' frustration.
    Full-text · Article · Oct 2013 · IEEE Transactions on Learning Technologies
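The citation above operationalizes frustration as a linear regression over log-file features. The minimal sketch below shows that idea on synthetic data; the feature names, coefficients, and target values are illustrative assumptions, not Mindspark's actual log schema or model.

# Minimal sketch (synthetic data; hypothetical log-file features): fit a linear
# regression that maps per-session log features to an observed frustration score.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n_sessions = 200

# Hypothetical per-session features derived from the log file:
# consecutive wrong answers, time stuck on a blocked goal, hint refusals.
X = rng.poisson(lam=[3.0, 5.0, 1.0], size=(n_sessions, 3)).astype(float)
# Synthetic target standing in for human-observed frustration ratings.
y = 0.4 * X[:, 0] + 0.2 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 0.5, n_sessions)

model = LinearRegression().fit(X, y)
print("coefficients:", model.coef_)
print("predicted frustration for a new session:", model.predict([[4, 6, 2]])[0])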
  • Source
    • "Several approaches have been used for affect recognition and researchers have explored how these approaches complement or supplement each other in learning scenarios [1]. Affect recognition using facial expression as input has been regarded as the most accurate measurement [2]. However, brain-computer interfaces (BCI) have not yet been incorporated. "
    ABSTRACT: The ability of a learning system to infer a student's affects has become highly relevant for adjusting its pedagogical strategies. Several methods have been used to infer affects. One of the most recognized for its reliability is face-based affect recognition. Another emerging one involves the use of brain-computer interfaces. In this paper we compare those strategies and explore whether, to a great extent, it is possible to infer the values of one source from the other.
    Full-text · Conference Paper · Jul 2013
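The cross-source question posed in the citation above, whether values from one affect source can be inferred from the other, can be sketched as a regression from one modality onto the other. The channel names and synthetic data below are assumptions for illustration, not the authors' recognizer or BCI hardware.

# Minimal sketch (synthetic data; hypothetical channels): predict a face-based
# affect value from BCI-derived signals and report the cross-validated correlation.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(2)
n_samples = 300

# Hypothetical BCI-derived signals (e.g., engagement/meditation-style channels).
bci = rng.normal(size=(n_samples, 4))
# Synthetic face-based affect value, partially explained by the BCI channels.
face_affect = bci @ np.array([0.6, -0.3, 0.2, 0.0]) + rng.normal(0, 0.8, n_samples)

predicted = cross_val_predict(Ridge(), bci, face_affect, cv=5)
corr = np.corrcoef(predicted, face_affect)[0, 1]
print(f"cross-validated correlation between the two sources: {corr:.2f}")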