Conference Paper

MEMS based gesture recognition


Abstract

Gesture recognition has been extensively investigated and has traditionally been accomplished by methods based on computer vision. In this paper we present a general-purpose hand gesture recognition system that can be used for different applications. The system is based on a MEMS accelerometer and is able to recognise several gestures that have been previously recorded into a suitable vocabulary.
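The abstract leaves the matching algorithm unspecified beyond a pre-recorded vocabulary; the sketch below shows one common approach, comparing an incoming accelerometer trace against stored templates with dynamic time warping (DTW). The function names and the Euclidean sample distance are illustrative assumptions, not the authors' design.

```python
# Hypothetical sketch: match an accelerometer trace against a recorded
# gesture vocabulary using dynamic time warping (DTW).
import math

def dtw_distance(a, b):
    """DTW distance between two sequences of (x, y, z) acceleration samples."""
    n, m = len(a), len(b)
    INF = float("inf")
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(a[i - 1], b[j - 1])  # Euclidean sample distance
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

def recognize(trace, vocabulary):
    """Return the name of the vocabulary gesture closest to the input trace."""
    return min(vocabulary, key=lambda name: dtw_distance(trace, vocabulary[name]))
```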


... Our work fulfils both criteria. Several works [1][2][3][4][5] on gesture-controlled automated wheelchairs have been reported previously, but so far none has used the Arduino LilyPad [6]. The previously proposed systems use a bulky, heavy transmitter unit that is not easy to carry, which makes them difficult to use. ...
Article
Full-text available
This paper presents a medical assistance system for people with disabilities that can be controlled wirelessly via gestures. A sensor (accelerometer) detects the gesture, or a change in the gesture, through which the patient controls the assistive robot, and a microcontroller wirelessly commands the robot to move in the desired direction depending on the sensor's values. The system consists of two parts, a transmitting circuit and a receiving circuit. For any medical assistant the most important part is the one attached to the patient, in this case the transmitting circuit, so it has to be easy to use and, above all, easy to carry. In our project we have used the Arduino LilyPad as the main governing microcontroller, which makes the transmitting circuit wearable: the patient only needs to wear it on the body part with which he or she intends to make the gesture. An RF module makes the data transmission wireless, and the programming has been done in the Arduino IDE.
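As an illustration of the transmit-side logic described above (the actual firmware runs in C on the Arduino LilyPad), the sketch below reads the accelerometer, maps the tilt to a movement command, and hands it to the RF link. read_accelerometer and rf_send are hypothetical stand-ins for the sensor and radio drivers, and the 0.5 g thresholds are assumptions.

```python
# Illustrative transmit-side logic, not the authors' firmware.

def tilt_to_command(x, y):
    """Map tilt along the two horizontal axes to a movement command."""
    if y > 0.5:
        return "FORWARD"
    if y < -0.5:
        return "BACKWARD"
    if x > 0.5:
        return "RIGHT"
    if x < -0.5:
        return "LEFT"
    return "STOP"

def transmitter_loop(read_accelerometer, rf_send):
    """read_accelerometer and rf_send are hypothetical driver callbacks."""
    while True:
        x, y, _z = read_accelerometer()   # readings normalized to units of g
        rf_send(tilt_to_command(x, y))    # receiver drives the motors
```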
... Mirabella et al. created a gesture recognition system that allowed users to navigate digital photos or home TVs, or to provide special services to people with physical disabilities, through pre-defined sets of gestures. The system uses an acceleration sensor to read the gesture input data, trains hidden Markov models (HMMs) to recognize user-defined gestures, and lets the user add new gestures to the gesture list according to application needs [45]. Similarly, Kela et al. used an accelerometer-based controller and HMMs to collect and recognize users' input gestures and to study the effects of gesture modality on user interaction. ...
Article
Full-text available
With the development of virtual reality (VR) and human-computer interaction technology, how to interact naturally and efficiently in virtual environments has become a hot research topic. Gesture is one of the most important human communication methods and can effectively express users' demands. In the past few decades, gesture-based interaction has made significant progress. This article focuses on gesture interaction technology and discusses the definition and classification of gestures, input devices for gesture interaction, and gesture recognition technology. The application of gesture interaction technology in virtual reality is studied, the existing problems in current gesture interaction are summarized, and future developments are outlined.
Article
This paper is about a car controlled through gestures over a wireless link. It aids physically challenged people and can additionally carry out certain tasks taught by a human. The aim of this project is to control the robot using hand gestures: an accelerometer in a glove containing the transmitter reads the hand gestures of the wearer, and an Arduino Uno controls the resulting action. Four main hand gesture movements, FORWARD, BACKWARD, LEFT and RIGHT, are detected and implemented.
Conference Paper
Interaction with appliances present in an environment can be difficult for people with mobility problems (such as the elderly or wheelchair users). This paper presents an approach based on an Inertial Measurement Unit which identifies human movements from two points of view: on the one hand, the user can specify the object with which he wants to interact by simply pointing at it with his arm; on the other, he can issue a command to the indicated object with an appropriate arm gesture. The paper describes the operating modes of the approach, the HW/SW architecture used and the underlying mathematical analysis.
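A minimal sketch of the pointing step, assuming the IMU delivers its orientation as a unit quaternion and the appliance positions are known: rotate the arm's forward axis into the world frame and pick the appliance whose bearing best matches it. The (w, x, y, z) quaternion convention and the appliance map are assumptions, not the paper's stated design.

```python
# Hypothetical pointing-target selection from an IMU orientation quaternion.
import numpy as np

def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

def pointed_appliance(q, user_pos, appliances):
    """appliances: dict name -> world position (np.array of shape (3,))."""
    direction = rotate(q, np.array([1.0, 0.0, 0.0]))  # arm's forward axis
    def alignment(name):
        bearing = appliances[name] - user_pos
        return np.dot(direction, bearing / np.linalg.norm(bearing))
    return max(appliances, key=alignment)  # best-aligned appliance wins
```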
Conference Paper
This paper presents a motion capture system designed to detect and manage body motion, measuring some motion parameters and providing a graphical reconstruction of the movement. The system can be used by a doctor to record some target body movements, which a patient can then execute remotely at home. Such movements are later analyzed by the doctor, who evaluates the report using effective metrics or by viewing the patient's performance stored in the exchanged file. The main goal is to produce a simple index/vote for each exercise, which is used by the patient to improve his performance and by the doctor to check the rehabilitation status.
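The abstract does not spell out the metric behind the per-exercise index/vote; one hedged possibility is to compare the patient's trace against the doctor's reference and map the RMS deviation onto a 0-10 scale, as sketched below. The tolerance parameter is an invented placeholder.

```python
# Hypothetical per-exercise scoring metric, not the paper's actual formula.
import math

def exercise_score(patient, reference, tolerance=15.0):
    """patient, reference: equal-length lists of joint angles (degrees)."""
    rms = math.sqrt(sum((p - r) ** 2 for p, r in zip(patient, reference))
                    / len(reference))
    return max(0.0, 10.0 * (1.0 - rms / tolerance))  # 10 = perfect execution
```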
Conference Paper
Activity recognition has become one of the most popular research topics. In this paper, efforts on activity recognition based on WIMU (Wireless Inertial Measurement Unit) sensors are reported. We use a triaxial accelerometer to capture the acceleration of the human body and train models for different activities, and then apply these models to recognize human activities. Different sensor settings are compared and the importance of different features is discussed. In addition, a real-time detection system is designed and realized, which can be used for real-time recognition of human activities and is of great importance for applying WIMUs in healthcare for the elderly.
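A sketch of the kind of feature-extraction stage such systems use before training: split the triaxial acceleration stream into fixed-size windows and compute simple per-axis statistics. The window length, step, and feature set here are assumptions rather than the authors' exact configuration.

```python
# Hypothetical sliding-window feature extraction for activity recognition.
import statistics

def window_features(window):
    """window: list of (x, y, z) samples -> flat feature vector."""
    feats = []
    for axis in range(3):
        values = [s[axis] for s in window]
        feats += [statistics.fmean(values), statistics.pstdev(values)]
    # signal magnitude area, a common activity-recognition feature
    feats.append(sum(abs(x) + abs(y) + abs(z) for x, y, z in window)
                 / len(window))
    return feats

def windows(stream, size=64, step=32):
    """Yield one feature vector per (possibly overlapping) window."""
    for start in range(0, len(stream) - size + 1, step):
        yield window_features(stream[start:start + size])
```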
Article
This paper presents a motion capture system which provides both an accurate measurement of some motion parameters of a human arm and a graphical reconstruction of the movement on a synthetic model. The hardware uses several MEMS inertial sensors whose data are processed using mathematical tools based on quaternions. Measurements on a prototype have proven the system to be very accurate: through the sensors placed on the joints of the arm it was possible to extract a video of the motion very faithful to the real one. Monitoring the movements of the human body is of great importance in various application fields, including medicine and sport. In medicine, rehabilitation after an accident or illness may require the execution of a number of exercises involving the movement of a limb or a specific body part. The purpose may be to restore function of the limb or to carry out clinical examinations under controlled stress conditions. Similarly, in sport it may be useful to undertake appropriate exercises to increase muscle mass or muscle tone. In both cases, these activities are usually carried out under the supervision of a specialist (a doctor or a coach) who monitors the exercises and assesses their proper execution and the results. Traditional procedures, however, show several issues that severely limit the effectiveness of the results:
• First of all, the continuous presence of a supervisor (doctor or coach) who monitors the activities carried out and corrects any error is required. Since the cost of skilled personnel is high, the supervisor is usually shared between several patients, thus reducing the effectiveness of the supervision.
• The execution of these exercises is evaluated only qualitatively. It is not possible to accurately measure the parameters involved in the execution of an exercise, such as the speed of movement of an arm or the angle of rotation of a joint. Even counting the number of ...
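To illustrate the quaternion-based processing, the sketch below chains per-segment orientations into joint positions for a two-segment arm model. The segment lengths and the choice of the x-axis as the segment direction are illustrative assumptions, not the paper's parameters.

```python
# Hypothetical forward kinematics from per-segment orientation quaternions.
import numpy as np

def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

def arm_positions(shoulder, q_upper, q_fore, l_upper=0.30, l_fore=0.25):
    """Return (elbow, wrist) world positions from segment orientations."""
    elbow = shoulder + rotate(q_upper, np.array([l_upper, 0.0, 0.0]))
    wrist = elbow + rotate(q_fore, np.array([l_fore, 0.0, 0.0]))
    return elbow, wrist
```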
Article
Full-text available
This paper describes the Magic Pen, a modified pen that allows natural cable-less interaction with AR environments. The pen is a normal pen with bright Light Emitting Diodes (LEDs) mounted at each end. Using computer vision techniques, these LEDs are tracked in the image captured by a head-mounted camera, and the pen's position and orientation are computed. This permits the use of image plane interaction techniques for interacting with virtual images. We describe the computer vision algorithms used, the tracking results and the interaction supported.
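An illustrative version of the detection step only: find a bright LED in a grayscale frame by thresholding and taking the centroid of the bright pixels. The paper's full pipeline (two LEDs tracked from a head-mounted camera, with pose recovery) is more involved than this sketch, and the threshold value is an assumption.

```python
# Hypothetical bright-blob detection for LED tracking in a grayscale frame.
import numpy as np

def led_centroid(frame, threshold=240):
    """frame: 2D uint8 array. Returns (row, col) of the bright blob, or None."""
    rows, cols = np.nonzero(frame >= threshold)
    if rows.size == 0:
        return None           # no pixel bright enough: LED not visible
    return rows.mean(), cols.mean()
```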
Article
Full-text available
This paper presents an application that uses hand gesture input to control a computer while giving a presentation. In order to develop a prototype of this application, we have defined an interaction model, a notation for gestures, and a set of guidelines for designing gestural command sets. This work aims to define interaction styles that work in computerized reality environments. In our application, gestures are used for interacting with the computer as well as for communicating with other people or operating other devices.
Article
Full-text available
In many applications today, user interaction is moving away from mice and pens and is becoming pervasive and much more physical and tangible. New emerging interaction technologies allow developing and experimenting with new interaction methods on the long way to providing intuitive human-computer interaction. In this paper, we aim at recognizing gestures to interact with an application and present the design and evaluation of our sensor-based gesture recognition. As input device we employ the Wii controller (Wiimote), which recently gained much attention worldwide. We use the Wiimote's acceleration sensor, independent of the gaming console, for gesture recognition. The system allows users to train arbitrary gestures which can then be recalled for interacting with systems such as photo browsing on a home TV. The developed library exploits Wii sensor data and employs a hidden Markov model for training and recognizing user-chosen gestures. Our evaluation shows that we can already recognize gestures with a small number of training samples. In addition to the gesture recognition, we also present our experiences with the Wii controller and the implementation of the gesture recognition. The system forms the basis for our ongoing work on multimodal intuitive media browsing and is available to other researchers in the field.
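Before a discrete HMM can be trained, libraries of this kind quantize the raw acceleration vectors into a finite symbol alphabet; the sketch below maps each 3D sample to its nearest codebook vector. The codebook contents are placeholders, not Wiigee's actual centroids.

```python
# Hypothetical vector quantization stage preceding discrete HMM training.
import math

def quantize(trace, codebook):
    """trace: list of (x, y, z) samples; codebook: list of (x, y, z) centroids.
    Returns the discrete observation sequence for HMM training/recognition."""
    return [min(range(len(codebook)),
                key=lambda k: math.dist(sample, codebook[k]))
            for sample in trace]
```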
Conference Paper
Full-text available
A glove with 2-axis accelerometers on the fingertips and the back of the hand has been built using commercial off-the-shelf components. Taking advantage of gravity-induced acceleration offsets, we have been able to identify pseudo-static gestures. We have also developed software that allows the glove to be used as a mouse pointing device for a Windows 95 or NT machine.
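A minimal sketch of reading a pseudo-static posture from the gravity offsets: with the hand at rest, each 2-axis reading (in units of g) gives the tilt of the corresponding sensor directly. The 45-degree threshold and the bent-finger interpretation are assumptions, not the glove's documented logic.

```python
# Hypothetical tilt readout from a 2-axis accelerometer at rest.
import math

def tilt_angles(ax, ay):
    """Pitch/roll estimates (degrees) from axis readings in units of g."""
    pitch = math.degrees(math.asin(max(-1.0, min(1.0, ax))))
    roll = math.degrees(math.asin(max(-1.0, min(1.0, ay))))
    return pitch, roll

def finger_bent(ax, ay, threshold_deg=45.0):
    """Classify a finger as bent when its tip sensor tilts past a threshold."""
    pitch, _ = tilt_angles(ax, ay)
    return abs(pitch) > threshold_deg
```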
Article
Full-text available
In this paper we present an average-case analysis of the Bayesian classifier, a simple probabilistic induction algorithm that fares remarkably well on many learning tasks. Our analysis assumes a monotone conjunctive target concept, Boolean attributes that are independent of each other and that follow a single distribution, and the absence of attribute noise. We first calculate the probability that the algorithm will induce an arbitrary pair of concept descriptions; we then use this expression to compute the probability of correct classification over the space of instances. The analysis takes into account the number of training instances, the number of relevant and irrelevant attributes, the distribution of these attributes, and the level of class noise. In addition, we explore the behavioral implications of the analysis by presenting predicted learning curves for a number of artificial domains. We also give experimental results on these domains as a check on our reasoning. Finally, we ...
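For concreteness, a standard naive Bayesian classifier over Boolean attributes is sketched below; this is the algorithm the analysis studies, not the paper's analytical machinery, and the Laplace smoothing is an added assumption.

```python
# Minimal naive Bayes over Boolean attributes with Laplace smoothing.
from collections import defaultdict

def train(examples):
    """examples: list of (attributes: tuple of bools, label: bool)."""
    counts = defaultdict(lambda: [1, 1])   # [count_false, count_true], smoothed
    class_counts = {True: 1, False: 1}     # Laplace-smoothed class counts
    for attrs, label in examples:
        class_counts[label] += 1
        for i, value in enumerate(attrs):
            counts[(i, label)][value] += 1
    return counts, class_counts

def classify(attrs, model):
    """Return the label maximizing P(label) * prod P(attr_i | label)."""
    counts, class_counts = model
    total = sum(class_counts.values())
    best, best_p = None, 0.0
    for label in (True, False):
        p = class_counts[label] / total
        for i, value in enumerate(attrs):
            pair = counts[(i, label)]
            p *= pair[value] / (pair[0] + pair[1])
        if p > best_p:
            best, best_p = label, p
    return best
```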
Conference Paper
The use of gesture as a natural interface serves as a motivating force for research in modeling, analyzing and recognition of gestures. In particular, human computer intelligent interaction needs vision-based gesture recognition, which involves many interdisciplinary studies. A survey on recent vision-based gesture recognition approaches is given in this paper. We shall review methods of static hand posture and temporal gesture recognition. Several application systems of gesture recognition are also described in this paper. We conclude with some thoughts about future research directions.
Article
In k-means clustering, we are given a set of n data points in d-dimensional space R^d and an integer k, and the problem is to determine a set of k points in R^d, called centers, so as to minimize the mean squared distance from each data point to its nearest center. A popular heuristic for k-means clustering is Lloyd's algorithm. In this paper, we present a simple and efficient implementation of Lloyd's k-means clustering algorithm, which we call the filtering algorithm. This algorithm is easy to implement, requiring a kd-tree as the only major data structure. We establish the practical efficiency of the filtering algorithm in two ways. First, we present a data-sensitive analysis of the algorithm's running time, which shows that the algorithm runs faster as the separation between clusters increases. Second, we present a number of empirical studies, both on synthetically generated data and on real data sets from applications in color quantization, data compression, and image segmentation.
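A plain-Lloyd reference implementation is sketched below; the paper's filtering algorithm computes the same assignment/update iteration but accelerates the assignment step with a kd-tree, which this sketch omits. Random initialization is an assumption, as the abstract does not prescribe one.

```python
# Plain Lloyd's algorithm for k-means (no kd-tree acceleration).
import math
import random

def lloyd_kmeans(points, k, iters=100):
    """points: list of equal-length tuples in R^d. Returns k centers."""
    centers = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                  # assignment step: nearest center
            j = min(range(k), key=lambda i: math.dist(p, centers[i]))
            clusters[j].append(p)
        new_centers = [                   # update step: cluster means
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
        if new_centers == centers:        # converged: assignments are stable
            break
        centers = new_centers
    return centers
```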
Article
Ambient intelligence applications pose the challenge of enabling real-time interaction with virtual environments in a natural manner. Gesture recognition based on body-mounted accelerometers has been proposed as a viable solution to translate patterns of movements that are associated with user commands, thus substituting point-and-click methods or other cumbersome input devices. These wearable systems are a promising approach to gesture recognition because of their low cost and size, high robustness and minimal constraints on user location and freedom of movements. In this work we present a gesture interface system based on low-cost integrated accelerometers for navigation in virtual spaces. Qualitative and quantitative assessments are presented using a 3D game application as a test-bed to evaluate the effectiveness of the interface.
Article
This tutorial provides an overview of the basic theory of hidden Markov models (HMMs) as originated by L.E. Baum and T. Petrie (1966) and gives practical details on methods of implementation of the theory along with a description of selected applications of the theory to distinct problems in speech recognition. Results from a number of original sources are combined to provide a single source of acquiring the background required to pursue further this area of research. The author first reviews the theory of discrete Markov chains and shows how the concept of hidden states, where the observation is a probabilistic function of the state, can be used effectively. The theory is illustrated with two simple examples, namely coin-tossing, and the classic balls-in-urns system. Three fundamental problems of HMMs are noted and several practical techniques for solving these problems are given. The various types of HMMs that have been studied, including ergodic as well as left-right models, are described.
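As a worked example of the first of the three fundamental problems (evaluating the probability of an observation sequence given a model), here is the forward algorithm in the tutorial's A/B/pi notation; the toy dimensions are illustrative.

```python
# Forward algorithm: compute P(obs | model) for a discrete HMM.

def forward(obs, A, B, pi):
    """obs: list of observation symbol indices; A[i][j]: transition i -> j;
    B[i][o]: probability of emitting symbol o in state i; pi[i]: initial prob.
    Returns P(obs | model)."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]   # initialization
    for o in obs[1:]:                                  # induction
        alpha = [B[j][o] * sum(alpha[i] * A[i][j] for i in range(n))
                 for j in range(n)]
    return sum(alpha)                                  # termination
```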
Tilt and Feel: Scrolling with Vibrotactile Display
  • I Oakley
  • J Ängeslevä
  • S Hughes
  • S O'Modhrain
Java-based gesture recognition library for the Wii remote
  • Wiigee
Interface Board with PIC18F2520
  • Muin-Multi