Fusion of Eye Gaze Point and Speech in Real Time
With a growing number of computer devices around us, and the increasing time we spend interacting with them, we are strongly interested in finding new interaction methods that ease the use of computers or increase interaction efficiency. Eye tracking is a promising technology for achieving this goal. Because users with disabilities may be unable to handle traditional input devices, alternatives for this group of users must be available. Speech is another promising technology toward the same goal. The first research approach treats eye gaze as a pointing device, the second treats speech as an input modality, and the third deals with the combination of both. The aim of the ongoing research is to develop an application that replaces the computer mouse for people with physical impairments. The application is based on an eye tracking algorithm and assumes that the camera and head positions are fixed. After successful development, the system should allow users to interact at least at the level of simple applications.
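The combination of the two modalities can be pictured as binding a spoken command to the gaze point observed at the moment the command was uttered. The following is a minimal sketch of such a fusion step; the data types, the `fuse` function, and the 0.3 s tolerance window are illustrative assumptions, not part of the described system.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class GazeSample:
    t: float      # timestamp in seconds
    x: float      # screen x coordinate in pixels
    y: float      # screen y coordinate in pixels

@dataclass
class SpeechEvent:
    t: float      # timestamp when the command was recognized
    command: str  # e.g. "click", "double click"

def fuse(gaze: List[GazeSample], event: SpeechEvent,
         max_lag: float = 0.3) -> Optional[GazeSample]:
    """Return the gaze sample closest in time to the speech event,
    provided it lies within max_lag seconds; otherwise None."""
    if not gaze:
        return None
    nearest = min(gaze, key=lambda s: abs(s.t - event.t))
    return nearest if abs(nearest.t - event.t) <= max_lag else None

# A spoken "click" at t = 1.02 s is bound to the gaze point
# recorded closest to that moment.
samples = [GazeSample(0.9, 400, 300), GazeSample(1.0, 410, 305)]
target = fuse(samples, SpeechEvent(1.02, "click"))
```

In a real-time system the gaze list would be a short rolling buffer, and the tolerance window would need tuning against recognizer latency; both choices are left open here.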