Article

Computer Input System Based on Gaze Vector Estimation with Iris Center Detection from Face Images Acquired with a Web Camera Allowing User Movement

Abstract

A system which allows computer input without a keyboard is proposed. The system utilizes a display-mounted Web camera to acquire images of the user's face and a display-mounted lamp to illuminate the user. It is found that the proposed system achieves a key input success rate of about 90% if the distance between the user and the display is within 30 cm and if the keyboard image is displayed on a 19-inch computer display (8-cm key spacing) under normal illumination by 40-W fluorescent lights on both sides of the display. The proposed system thus requires roughly one retry per ten key entries. The system allows user movement, because a moving picture of the user's face is acquired in real time. The relation of the allowable user movement to the success rate, and the relations of the signal-to-noise ratio and the contrast of the acquired user image to the success rate, are determined. © 2009 Wiley Periodicals, Inc. Electron Comm Jpn, 92(5): 31–40, 2009; Published online in Wiley InterScience (www.interscience.wiley.com). DOI 10.1002/ecj.10015
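The abstract gives no implementation details, but the overall pipeline (acquire face images from the display-mounted webcam, locate the iris center in each eye region, and map the estimated gaze to a key of the on-screen keyboard) can be illustrated with a short sketch. The code below is a minimal illustration assuming OpenCV Haar cascades and a dark-pixel centroid for the iris center; the paper's actual gaze-vector estimation, calibration, and lamp-based illumination handling are more elaborate.

```python
# Minimal sketch of webcam-based iris-center detection and gaze-to-key mapping,
# assuming OpenCV Haar cascades and a simple dark-pixel centroid; not the paper's method.
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def iris_center(eye_gray):
    """Estimate the iris center as the centroid of the darkest pixels in the eye ROI."""
    blurred = cv2.GaussianBlur(eye_gray, (7, 7), 0)
    _, dark = cv2.threshold(blurred, 50, 255, cv2.THRESH_BINARY_INV)  # iris/pupil are dark
    m = cv2.moments(dark)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # (x, y) inside the eye ROI

def gaze_to_key(cx_rel, cy_rel, rows=4, cols=10):
    """Map the normalized iris position (0..1) to a cell of an on-screen keyboard grid."""
    col = min(int(cx_rel * cols), cols - 1)
    row = min(int(cy_rel * rows), rows - 1)
    return row, col

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, 1.3, 5):
        face = gray[fy:fy + fh, fx:fx + fw]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(face):
            center = iris_center(face[ey:ey + eh, ex:ex + ew])
            if center is not None:
                row, col = gaze_to_key(center[0] / ew, center[1] / eh)
                cv2.putText(frame, f"key ({row},{col})", (10, 30),
                            cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("gaze", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```

A practical system would calibrate the mapping from iris displacement to screen coordinates for each user; the sketch simply divides the normalized iris position into a fixed grid.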

Chapter
An intelligent volume rendering method for 3D image display with semitransparent object representation is proposed. The proposed method is also effective for reducing the frame rate required for 3D representation by exploiting a characteristic of human vision (the afterimage effect). In the proposed method, pixel transparency is changed depending on the depth of the pixel. Through a preliminary experiment with structured objects and an experiment with a CT scan image from the dataset provided by ULB, Belgium, it is found that the proposed method reduces the required frame rate by a factor of about 1/6. It can also convey 3D depth perceptually on an NTSC image display.
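The depth-dependent transparency described above amounts to front-to-back alpha compositing in which the per-slice opacity varies with depth. The following sketch, assuming a NumPy volume viewed along its first axis and illustrative near/far opacity values, shows that idea only; it does not reproduce the chapter's transfer function or its afterimage-based frame-rate reduction.

```python
# Minimal sketch of depth-dependent semitransparent volume compositing,
# assuming a NumPy volume with axis 0 as the viewing (depth) axis.
import numpy as np

def render_depth_weighted(volume, alpha_near=0.02, alpha_far=0.2):
    """Front-to-back alpha compositing where opacity varies with voxel depth."""
    depth, h, w = volume.shape
    image = np.zeros((h, w))
    remaining = np.ones((h, w))          # transmittance accumulated so far
    for z in range(depth):
        # deeper slices are composited with a different (here larger) opacity
        alpha = alpha_near + (alpha_far - alpha_near) * z / max(depth - 1, 1)
        slice_ = volume[z].astype(float)
        image += remaining * alpha * slice_
        remaining *= (1.0 - alpha)
    return image

# toy example: a random 64^3 volume
vol = np.random.rand(64, 64, 64)
img = render_depth_weighted(vol)
print(img.shape, img.min(), img.max())
```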
Chapter
Sport-vision-based tennis player training is proposed to accelerate players' skill development with instructions provided by the proposed sport-vision-based system. In sports, gaze, dynamic visual acuity, eye movement, and viewing position are important. Sports vision has to treat static eyesight, dynamic visual acuity, contrast sensitivity, eye movement, depth vision, instant vision, coordination of eye, hand, and foot, and the peripheral visual field. In tennis in particular, all of these items are very important. Therefore, a sports-vision-based tennis player training system is proposed. Through experiments, it is found that the proposed system works well for improving tennis players' skills.
Chapter
A robot arm controlled by Eye-Based Human-Computer Interaction (EBHCI) is proposed. As an example, the proposed system allows a disabled person to select the desired food from a meal tray with the eyes only. The robot arm used to retrieve the desired food is controlled by the human eye, and a tiny camera is mounted at its tip. The disabled person wears glasses on which a single Head-Mounted Display (HMD) and a tiny camera are mounted, so that the person can look at the desired food and retrieve it by gazing at the food displayed on the HMD. This is just one example; plenty of other services are possible with such a "magic arm." Experimental results show that a disabled person can retrieve the desired food successfully. It is also confirmed that robot arm control by EBHCI is much faster than control by hand.
Chapter
Operation of mobile phones, such as smartphones including the iPhone, by the human eyes only is proposed, together with example applications. The smartphone operation success rate is evaluated as a function of the distance between the user and the smartphone and of the ambient illumination. Through experiments, it is found that smartphone operations can be performed with a high success rate. Furthermore, one application, referring to a cooking recipe through the proposed smartphone operation without touching anything, is realized successfully.
Article
Eye-gaze input interfaces have been proposed, mainly as communication aids for the severely disabled. We have previously developed a text input system using eye-gaze based on image analysis under natural light. That input system employed a dwell-time-based method to select indicators. By employing a voluntary (conscious) blink rather than dwell time for indicator selection, the efficiency of text input is expected to increase. We propose a new text input system based on eye-gaze and voluntary blinks. The newly added eye blink measurement function must achieve high time resolution. Furthermore, both the eye-gaze and the eye blink measurements are very time consuming; if they are performed sequentially, it is difficult to finish the calculations within the time required for real-time processing. To cope with this problem, we realize concurrent processing of the two measurements with a multithread structure. We conducted an experiment in which nine subjects input Japanese text using the new system. All nine subjects were able to complete the Japanese text input task. Based on the results of this experiment, we evaluated the new system.
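The multithread structure mentioned in the abstract can be sketched as two worker threads that each process the most recent camera frame, so the slower gaze analysis never blocks blink detection. In the sketch below, estimate_gaze and detect_blink are hypothetical placeholders that only simulate processing time; they are not the authors' image-analysis algorithms.

```python
# Minimal sketch of running eye-gaze and blink measurements concurrently.
import threading
import queue
import time

frame_queue_gaze = queue.Queue(maxsize=1)
frame_queue_blink = queue.Queue(maxsize=1)
results = {"gaze": None, "blink": False}
lock = threading.Lock()

def estimate_gaze(frame):
    time.sleep(0.02)              # placeholder for the gaze image analysis
    return ("x", "y")

def detect_blink(frame):
    time.sleep(0.005)             # placeholder for the blink image analysis
    return False

def gaze_worker():
    while True:
        frame = frame_queue_gaze.get()
        if frame is None:
            break
        g = estimate_gaze(frame)
        with lock:
            results["gaze"] = g

def blink_worker():
    while True:
        frame = frame_queue_blink.get()
        if frame is None:
            break
        b = detect_blink(frame)
        with lock:
            results["blink"] = b

threads = [threading.Thread(target=gaze_worker), threading.Thread(target=blink_worker)]
for t in threads:
    t.start()

for _ in range(100):              # pretend 100 camera frames arrive in real time
    frame = object()
    for q in (frame_queue_gaze, frame_queue_blink):
        try:
            q.put_nowait(frame)   # drop the frame if the worker is still busy
        except queue.Full:
            pass
    time.sleep(0.01)

for q in (frame_queue_gaze, frame_queue_blink):
    q.put(None)                   # signal the workers to stop
for t in threads:
    t.join()
print(results)
```

Bounded single-slot queues let each worker skip frames it cannot keep up with, which is one simple way to preserve real-time behavior.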
Conference Paper
Wearable computing with computer input by sight only for health care is proposed. The wearable system, with input/output devices based on Eye-Based Human-Computer Interaction (EBHCI), allows location-based web services, including navigation and monitoring of location, attitude, and health condition. Through implementation of the proposed wearable computing system, all of its functionality is confirmed. The system works well, is easy to use, and is inexpensive. Experimental results for EBHCI show excellent performance in terms of key-in accuracy as well as input speed. The system has Internet access and search engine capability. One of the health care applications, monitoring of physical and psychological status, is also discussed.
Chapter
Computer input by sight only for disabled persons is proposed together with its applications. It is confirmed that communication aids, phoning aids, service robot control, information collection aids, and so on become available by using the proposed sight-only computer input.
Article
Full-text available
Wearable computing with input/output devices based on Eye-Based Human-Computer Interaction (EBHCI), which allows location-based web services including navigation and monitoring of location, attitude, and health condition, is proposed. Through implementation of the proposed wearable computing system, all of its functionality is confirmed. The system works well, is easy to use, and is inexpensive. Experimental results for EBHCI show excellent performance in terms of key-in accuracy as well as input speed. The system has Internet access and search engine capability.
Article
A robot arm control and meal-assistance system with eye-based HCI is proposed. The proposed system allows a disabled person to select the desired food from a meal tray with the eyes only. The robot arm used to retrieve the desired food is controlled by the human eye, and a tiny camera is mounted at its tip. The disabled person wears glasses on which a single Head-Mounted Display (HMD) and a tiny camera are mounted, so that the person can look at the desired food and retrieve it by gazing at the food displayed on the HMD. Experimental results show that a disabled person can retrieve the desired food successfully. It is also confirmed that robot arm control by eye-based HCI is much faster than control by hand.
Conference Paper
Full-text available
The problem of assisting people with special needs is assuming a central role in our society, and information and communication technologies are expected to play a key role in aiding people with both physical and cognitive disabilities. This paper describes an eye tracking system, whose strong points are its simplicity and consequent low cost, designed and implemented to allow people with severe motor disabilities to use gaze as an input device for selecting areas on a computer screen. The motivation for this kind of input device, together with the communication impairments it may help to overcome, is reported in the paper, which then describes the adopted technical solution, compares it with existing approaches, and reports the results obtained in its experimental evaluation.
Many systems to aid communication have been developed for seriously physically handicapped people such as ALS patients. The eye-gaze input system is currently being evaluated as a novel interface; it can be used to operate a computer with eye movement only. Most conventional eye-gaze input systems use infrared rays directed at the eyes to detect eye-gaze. However, prolonged irradiation could damage the eyes. This paper proposes a new eye-gaze input system with multiple indicators, utilizing a personal computer and a home video camera to detect eye-gaze under natural light. It detects both vertical and horizontal eye-gaze through simple image analysis and does not require special image processing units or sensors. It also compensates for measurement errors caused by head movement, i.e., it can detect eye-gaze with a high degree of accuracy. Therefore, it is capable of distinguishing many indicators, and its applications are expected to expand.
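The head-movement compensation mentioned above rests on the idea of measuring the iris center relative to landmarks that move together with the head, such as the eye corners, rather than relative to the camera frame. The following minimal sketch, using hypothetical pixel coordinates, shows why such a relative measure is insensitive to head translation.

```python
# Minimal sketch: gaze measured as the iris position relative to the eye corners,
# so a pure head translation leaves the measurement unchanged.
import numpy as np

def relative_gaze(iris_center, inner_corner, outer_corner):
    """Return the iris position normalized along the eye axis (0 = inner corner, 1 = outer corner)."""
    iris = np.asarray(iris_center, dtype=float)
    inner = np.asarray(inner_corner, dtype=float)
    outer = np.asarray(outer_corner, dtype=float)
    axis = outer - inner
    t = np.dot(iris - inner, axis) / np.dot(axis, axis)   # projection onto the eye axis
    return float(t)

# the same gaze with and without a 15-pixel head shift gives the same ratio
print(relative_gaze((120, 80), (100, 82), (150, 78)))
print(relative_gaze((135, 80), (115, 82), (165, 78)))
```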
Article
By integrating eye- and head-position monitoring devices, the present authors developed an eye-controlled human/computer interface based on the line of sight and an intentional blink to invoke commands. An existing calibration method was also modified to reduce the visual angle between the target center and the intersection point of the derived line of sight. The reduced visual angle allowed 108 or more command blocks to be displayed on a 14-inch monitor with a target acquisition probability (hit rate) of 98% when viewed at a distance of 500 mm. This is sufficient to represent all keys of a standard keyboard on the monitor. An active triggering method using an intentional blink was a feasible and efficient alternative for invoking commands, with a total triggering time of 0.8 s or less. The system could be used by able-bodied people as well as handicapped individuals as a new human/computer interface.
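The intentional-blink trigger can be approximated by a simple duration test that separates voluntary blinks from involuntary ones. The thresholds in the sketch below are illustrative assumptions, not the values used by the authors.

```python
# Minimal sketch of blink-based command triggering: a blink longer than a natural
# blink but still short is treated as an intentional trigger. Thresholds are illustrative.
import time

NATURAL_BLINK_MAX = 0.15      # seconds; typical involuntary blink duration (assumed)
INTENTIONAL_BLINK_MAX = 0.80  # seconds; upper bound so long closures are not commands (assumed)

class BlinkTrigger:
    def __init__(self):
        self._closed_since = None

    def update(self, eye_is_closed, now=None):
        """Feed one per-frame eye state; returns True when a trigger fires."""
        now = time.monotonic() if now is None else now
        if eye_is_closed:
            if self._closed_since is None:
                self._closed_since = now
            return False
        if self._closed_since is None:
            return False
        duration = now - self._closed_since
        self._closed_since = None
        return NATURAL_BLINK_MAX < duration <= INTENTIONAL_BLINK_MAX

# feed a simulated 0.4 s blink at 50 frames/s
trigger = BlinkTrigger()
t, fired = 0.0, False
for frame in range(60):
    closed = 10 <= frame < 30          # eyes closed for frames 10..29 (0.4 s)
    fired = trigger.update(closed, now=t) or fired
    t += 0.02
print("command fired:", fired)
```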
Article
With the subject exposed to an alternating magnetic field, eye position may be accurately recorded from the voltage generated in a coil of wire embedded in a scleral contact lens worn by the subject. Using two magnetic fields in quadrature phase and two coils on the lens, one may measure horizontal, vertical, and torsional eye movements simultaneously. The instrument described has an accuracy and linearity of about 2 per cent of full scale, a resolution of 15 seconds of arc, and a bandwidth of 1000 cycles per second.
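The measurement principle behind the scleral search coil is ordinary electromagnetic induction; the relation below uses generic symbols (N turns, coil area A, field amplitude B_0, angular frequency ω, and θ the angle between the coil axis and the field) rather than values from the article.

```latex
% Flux through the coil and the induced voltage (Faraday's law):
\Phi(t) = N A B_0 \cos\theta \, \sin(\omega t), \qquad
e(t) = -\frac{d\Phi}{dt} = -N A B_0 \omega \cos\theta \, \cos(\omega t)
```

Because the two driving fields are in quadrature phase, the signals produced by the two eye-rotation components ride on carriers 90° apart in phase and can be separated by phase-sensitive detection; the second coil on the lens supplies the torsional component, as stated in the abstract.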
Abe, Daisen, Oi. A multi-index gaze input system using image analysis under available light. Trans IITE 2004;58:1656–1664.
Kuno, Yagi, Fujii, Koga, Uchikawa. Development of a look input interface using EOG. Trans Inf Process Soc Japan 1998;39:1455–1462.
Ito, Sudo, Ifuku. Gaze input communication equipment for seriously physically handicapped persons. Trans IEICE 2000;J83-D-I(5):495–503.
Yamada, Fukuda. A device for text creation and peripheral equipment control by eye movement. Trans IEICE 1986;J69-D(7):1103–1107.
Yamada Mitsunori. Recent trends in research on eye movement. J IEICE MBE95-132, NC95-90, p 145–152, 1995.
Kishimoto, Yonemura, Hirose, Nagae. Development of a gaze input system based on cursor movement system. Trans IITE 2001;55:917–919.
Ito, Nara. Eye movement measurement by image acquisition and processing with a video capture card. Tech Rep IEICE 2002;102:31–36.
Abe, Ochi, Oi, Daisen. A gaze input system using image analysis by the sclera reflection method. Trans IITE 2003;57:1354–1360.
Daisen, Abe. A gaze input platform for seriously physically handicapped persons.
Abe K, Ohyama S, Ohyama M. An eye-gaze input system based on limbus tracking by image analysis for seriously handicapped persons. 7th ERCIM Workshop "User Interface for All".