Oleg Špakov

Tampere University | UTA · School of Information Sciences

About

74 Publications · 30,562 Reads · 1,378 Citations

Publications (74)
Article
Full-text available
We examined two vision-based interfaces (VBIs) for performance and user experience during character-based text entry using an on-screen virtual keyboard. Head-based VBI uses head motion to steer the computer pointer and mouth-opening gestures to select the keyboard keys. Gaze-based VBI utilizes gaze for pointing at the keys and an adjustable dwell...
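The adjustable dwell selection that both interfaces rely on can be illustrated with a minimal sketch (hypothetical names, not the authors' implementation): a key is selected once gaze has rested on it continuously for the dwell duration.

```python
class DwellSelector:
    """Minimal dwell-time selector: a key fires once gaze has stayed
    on it continuously for `dwell_ms` milliseconds."""

    def __init__(self, dwell_ms=500):
        self.dwell_ms = dwell_ms
        self.current_key = None
        self.enter_time = None

    def update(self, key, t_ms):
        """Feed one gaze sample (key under gaze, timestamp in ms);
        return the key when the dwell completes, else None."""
        if key != self.current_key:
            # Gaze moved to a new key (or off the keyboard): restart timing.
            self.current_key = key
            self.enter_time = t_ms
            return None
        if key is not None and t_ms - self.enter_time >= self.dwell_ms:
            self.enter_time = t_ms  # re-arm so the key can repeat
            return key
        return None
```

In a gaze-based VBI the dwell duration would be the user-adjustable parameter; in a head-based VBI the same loop could be driven by a mouth-opening gesture instead of a timer.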
Conference Paper
This study examined the potential of mid-air gestural interfaces specifically designed for children, in the context of full-body motion control for confirmation actions while interacting with large conventional displays. The study aimed to investigate gestures for confirmation actions that were feasible for children, including children with challen...
Chapter
We studied two interaction techniques to perform secondary tasks in a driving simulator environment with the focus on driving safety. In both techniques, the participants (N = 20) used gaze pointing to select virtual task buttons. Toggling the controls was achieved by either mid-air gestures with haptic feedback or physical buttons located on the s...
Chapter
The aim was to utilize ultrasound skin stimulation (UH) on a palm to inform the system state when interacting with mid-air hand gestures in the automotive context. Participants navigated a horizontal menu using touch, buttons, and hand gestures during simulated driving. Mid-air interaction and UH feedback design was tested in two studies. The first...
Article
Full-text available
Projected walk-through fogscreens have been created, but there is little research on the evaluation of the interaction performance with fogscreens. The present study investigated mid-air hand gestures for interaction with a large fogscreen. Participants (N = 20) selected objects from a fogscreen using tapping and dwell-based gestural techniques, wi...
Conference Paper
The automotive evolution in virtual controls for touchscreen interaction provides the opportunity to manage and manipulate an In-Vehicle Infotainment (IVI) system without the need for large physical controls. However, as most of these virtual controls are designed for visual feedback on PCs and mobile devices, their implementation can have usability an...
Article
Full-text available
Head mounted displays provide a good platform for viewing of immersive 360° or hemispheric images. A person can observe an image all around, just by turning his/her head and looking in different directions. The device also provides a highly useful tool for studying the observer’s gaze directions and head turns. We aimed to explore the interplay bet...
Conference Paper
Mid-air ultrasound skin stimulation is a known technology used to provide haptic feedback for touchless communication. Even though some in-car interaction already takes place with mid-air gestures, the utilization of this type of feedback is still uncommon. In this study we evaluated how environmental vibration affects the recognition of ultrasound...
Chapter
Gaze provides an attractive input channel for human-computer interaction because of its capability to convey the focus of interest. Gaze input allows people with severe disabilities to communicate with eyes alone. The advances in eye tracking technology and its reduced cost make it an increasingly interesting option to be added to the conventional...
Chapter
Pie charts can be regularly found both in the popular media and research publications. There is evidence that other forms of visualizations make it easier to evaluate the relative order of the data items. Doughnut charts have been suggested as a variation that has advantages over pie charts. We investigated how pie charts and doughnut charts are af...
Chapter
On-line teaching situations where a tutor and their students are remote from each other mean that contact between them is reduced compared with teaching in a classroom. We report an initial study of two-way gaze sharing between a tutor and a group of students, who were in different locations. A 45-min class consisted of an introductory lecture foll...
Conference Paper
Full-text available
We present an investigation of sharing the focus of visual attention between two players in a collaborative game, so that where one player was looking was visible to the other. The difference between using head-gaze and eye-gaze to estimate the point of regard was studied, the motive being that recording head-gaze is easier and cheaper than eye-gaz...
Article
Full-text available
Collaboration improves efficiency, avoids duplication of efforts, improves goal-awareness, and makes working generally more pleasurable. While collaboration is desirable, it introduces additional costs because of the required coordination. In this article, we study how visual search is affected by gaze-sharing collaboration. There is evidence that...
Article
Full-text available
The recent growth in low-cost eye-tracking systems makes it feasible to incorporate real-time measurement and analysis of eye position data into activities such as learning to read. It also enables field studies of reading behavior in the classroom and other learning environments. We present a study of the data quality provided by two remote eye tr...
Conference Paper
Full-text available
To use eye trackers in a school classroom, children need to be able to calibrate their own tracker unsupervised and on repeated occasions. A game designed specifically around the need to maintain their gaze in fixed locations was used to collect calibration and verification data. The data quality obtained was compared with a standard calibration pr...
Conference Paper
Full-text available
With the proliferation of small-screen computing devices, there has been a continuous trend in reducing the size of interface elements. In virtual keyboards, this allows for more characters in a layout and additional function widgets. However, vision-based interfaces (VBIs) have only been investigated with large (e.g., full-screen) keyboards. To un...
Poster
Full-text available
Poster presented as part of the MUM 2017 conference, based on the paper in the conference proceedings (DOI: https://doi.org/10.1145/3152832.3156628).
Conference Paper
Full-text available
The aim was to introduce and initially evaluate the performance of a system called Extended Cognition, which was developed for enhancing the filtering of emotionally meaningful information from visual lifelogging data. The system is proposed to add facial expression markers to the stream of visual lifelogging data in order to later find special...
Article
Full-text available
Several popular visualizations of gaze data, such as scanpaths and heatmaps, can be used independently of the viewing task. For a specific task, such as reading, more informative visualizations can be created. We have developed several such techniques, some dynamic and some static, to communicate the reading activity of children to primary school t...
Article
Full-text available
Vibrotactile feedback is widely used in mobile devices because it provides a discreet and private feedback channel. Gaze based interaction, on the other hand, is useful in various applications due to its unique capability to convey the focus of interest. Gaze input is naturally available as people typically look at things they operate, but feedback...
Conference Paper
Full-text available
The best way to construct user interfaces for smart glasses is not yet known. We investigated the use of eye tracking in this context in two experiments. The eye and head movements were combined so that one can select the object to interact by looking at it and then change a setting in that object by turning the head horizontally. We compared three...
Conference Paper
Full-text available
Smart glasses equipped with eye tracking technology could be utilized to develop natural interaction techniques. They could be used to conveniently interact with an electronic appliance in the environment from a distance. We describe a technique, HeadTurn, that allows a user to look at a device and then control it by turning the head to the left or...
Article
Full-text available
Emotional reactions to basic, artificial, yet carefully controllable point-light displays (PLDs) were investigated with ratings of valence, arousal, approachability and dominance. PLDs were varied by movement location (upper and lower) and intensity (10°, 20° and 30° angular change) for angular upward and downward move...
Conference Paper
Full-text available
Presentation aids, such as the laser pointer, are commonly used in lectures and public speeches. Their effect on the audience has not been properly studied. We present an experiment that compares several pointer alternatives. One of them is GazeLaser, a new solution that does not need a manually operated pointer, but is based on the lecturers' gaze...
Conference Paper
Full-text available
In a study with 12 participants we compared two smooth pursuit based widgets and one dwell time based widget in adjusting a continuous value. The circular smooth pursuit widget was found to be about equally efficient as the dwell based widget in our color matching task. The scroll bar shaped smooth pursuit widget exhibited lower performance and low...
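Smooth pursuit widgets are typically triggered by correlating the gaze trajectory with the trajectory of a moving on-screen target. A minimal sketch of that idea (hypothetical names and threshold; not the study's implementation):

```python
def pearson(xs, ys):
    """Pearson correlation between two equal-length trajectories."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def is_pursued(gaze_x, target_x, threshold=0.8):
    """The widget activates when gaze follows its moving target closely
    enough, i.e. the trajectories are strongly correlated."""
    return pearson(gaze_x, target_x) > threshold
```

In practice both x and y components would be correlated over a sliding window of recent samples.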
Conference Paper
Full-text available
Smart glasses, like Google Glass or Microsoft HoloLens, can be used as interfaces that expand human perceptual, cognitive, and actuation capabilities in many everyday situations. Conventional manual interaction techniques, however, are not convenient with smart glasses whereas eye trackers can be built into the frames. This makes gaze tracking a na...
Article
Full-text available
Proper feedback is essential in gaze based interfaces, where the same modality is used for both perception and control. We measured how vibrotactile feedback, a form of haptic feedback, compares with the commonly used visual and auditory feedback in eye typing. Haptic feedback was found to produce results that are close to those of auditory feedbac...
Article
Full-text available
The aim of this paper was to evaluate the use of three facial actions (i.e. frowning, raising the eyebrows, and smiling) in selecting objects on a computer screen when gaze was used for pointing. Dwell time is the most commonly used selection technique in gaze-based interaction, and thus, a dwell time of 400 ms was used as a reference selection tec...
Conference Paper
Full-text available
Wearable devices including smart eyewear require new interaction methods between the device and the user. In this paper, we describe our work on the combined use of eye tracking for input and haptic (touch) stimulation for output with eyewear. Input with eyes can be achieved by utilizing gaze gestures which are predefined patterns of gaze movements...
Conference Paper
Full-text available
Interacting with the environment using mobile eye-tracking is accompanied with challenges in providing non-visual feedback related to gaze events and monitoring the gaze vector estimation quality. Recent studies point to haptic stimulation as a promising feedback channel in this context. In this work we focused on applying haptic stimulation to inf...
Conference Paper
Full-text available
Smartwatches are widely available and increasingly adopted by consumers. The most common way of interacting with smartwatches is either touching a screen or pressing buttons on the sides. However, such techniques require using both hands. We propose glance awareness and active gaze interaction as alternative techniques to interact with smartwatches...
Article
Full-text available
Print interpreting supports people with a hearing disability by giving them access to spoken language. In print interpreting, the interpreter types the spoken text in real time for the hard-of-hearing client to read. This results in dynamic text presentation. An eye movement study was conducted to compare two types of dynamic text presentation form...
Conference Paper
Full-text available
Eye tracking enables automatic scrolling based on natural viewing behavior. We were interested in the effects of haptic feedback on gaze behavior and user experience. We conducted an experiment where haptic feedback was used to forewarn the reader that their gaze had entered an active scrolling area. Results show no statistical differences between...
Conference Paper
Full-text available
The accuracy of gaze point estimation is one of the main limiting factors in developing applications that utilize gaze input. The existing gaze point correction methods either do not support real-time interaction or imply restrictions on gaze-controlled tasks and object screen locations. We hypothesize that when gaze points can be reliably correlat...
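The core idea of correcting gaze points against objects the user is known to have attended can be sketched as a running offset estimate (a simplified illustration with hypothetical helpers, not the paper's method):

```python
def estimate_offset(gaze_points, object_points):
    """Average displacement between raw gaze samples and the objects
    the user was demonstrably attending at those moments."""
    n = len(gaze_points)
    dx = sum(g[0] - o[0] for g, o in zip(gaze_points, object_points)) / n
    dy = sum(g[1] - o[1] for g, o in zip(gaze_points, object_points)) / n
    return dx, dy

def correct(gaze, offset):
    """Apply the estimated systematic offset to a new gaze sample."""
    return gaze[0] - offset[0], gaze[1] - offset[1]
```

A real-time system would update the offset incrementally whenever a gaze-to-object correlation is reliable enough.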
Conference Paper
Full-text available
Compared to the mouse, eye pointing is inaccurate. As a consequence, small objects are difficult to point by gaze alone. We suggest using a combination of eye pointing and subtle head movements to achieve accurate hands-free pointing in a conventional desktop computing environment. For tracking the head movements, we exploited information of the ey...
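The coarse-to-fine combination described here, gaze for the rough location and small head movements for refinement, can be sketched in one line (hypothetical gain value, not the study's parameters):

```python
def refined_point(gaze, head_delta, gain=3.0):
    """Hands-free pointing sketch: gaze supplies the coarse cursor
    position; a small head movement nudges it with a low gain, so the
    final accuracy is limited by head tracking rather than gaze noise."""
    return (gaze[0] + gain * head_delta[0],
            gaze[1] + gain * head_delta[1])
```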
Article
Full-text available
A wide range of applications can benefit from the measurement of facial activity. The current study presents a method that can be used to detect and classify the movements of different parts of the face and the expressions the movements form. The method is based on capacitive measurement of facial movements. It uses principal component analysis on...
Article
Full-text available
The intensity measurement of facial muscle activity can be used in several applications such as human-computer interaction and behavioral science. A new method for the intensity measurement is presented. It is based on a contactless, capacitive measurement of the movements that the facial activity produces. The muscles responsible for raising the e...
Conference Paper
Full-text available
Visual information on eye movements can be used to facilitate scrolling while one is reading on-screen text. We carried out an experiment to find preferred reading regions on the screen and implemented an automatic scrolling technique based on the preferred regions of each individual reader. We then examined whether manual and automatic scrolling h...
Conference Paper
Full-text available
In this position paper we argue of the need for a more complete definition of gaze interaction events. The need is especially clear in non-desktop use scenario. We ask on one hand what the needed gaze events are and on the other hand what can eye tracking technology possibly deliver in dynamic everyday settings. Current understanding on gaze events...
Conference Paper
Full-text available
We propose a combination of gaze pointing and head gestures for enhanced hands-free interaction. Instead of the traditional dwell-time selection method, we experimented with five simple head gestures: nodding, turning left/right, and tilting left/right. The gestures were detected from the eye-tracking data by a range-based algorithm, which was foun...
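A range-based gesture detector of the kind mentioned here can be illustrated very simply: report a turn once the head-angle signal leaves a band around its starting value (hypothetical threshold; not the paper's algorithm).

```python
def detect_turn(yaw_deg, threshold=15.0):
    """Range-based detection over a window of yaw samples (degrees):
    a left/right turn is reported as soon as yaw deviates from the
    window's starting value by more than `threshold`."""
    baseline = yaw_deg[0]
    for yaw in yaw_deg[1:]:
        if yaw - baseline > threshold:
            return "right"
        if yaw - baseline < -threshold:
            return "left"
    return None
```

Nods and tilts would be detected the same way on the pitch and roll signals.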
Conference Paper
Full-text available
A wide range of applications can benefit from the measurement of facial activity. The current study presents a method that can be used to detect the movements of different parts of the face and expressions that they form. The method is based on capacitive measurement of facial movements and utilisation of principal component analysis on the measure...
Article
Full-text available
Eye movement data were collected and analyzed from 16 participants while they read text from a computer screen. Several text presentation formats were compared, including sentences as part of a full paragraph, sentences presented one by one, sentences presented in chunks of at most 30 characters at a predefined rate, and line-by-line presentation f...
Article
Full-text available
Video-based human-computer interaction has received increasing interest over the years. However, earlier research has been mainly focusing on technical characteristics of different methods rather than on user performance and experiences in using computer vision technology. This study aims to investigate performance characteristics of novice users a...
Article
Full-text available
We compared various real-time filters designed to denoise eye movements from low-sampling devices. Most of the filters found in literature were implemented and tested on data gathered in a previous study. An improvement was proposed for one of the filters. Parameters of each filter were adjusted to ensure their best performance. Four estimation par...
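A representative real-time gaze filter of the family compared here is a one-pole smoother that resets on large jumps so saccades are not smeared. This is a generic sketch (hypothetical parameters), not the improved filter proposed in the article.

```python
class EmaFilter:
    """Exponential smoothing for noisy gaze samples from a low-rate
    tracker; a jump larger than `jump_px` is treated as a saccade and
    restarts the filter instead of being smoothed away."""

    def __init__(self, alpha=0.3, jump_px=80.0):
        self.alpha = alpha
        self.jump_px = jump_px
        self.state = None

    def step(self, x, y):
        if self.state is None:
            self.state = (x, y)
        else:
            sx, sy = self.state
            if ((x - sx) ** 2 + (y - sy) ** 2) ** 0.5 > self.jump_px:
                self.state = (x, y)  # saccade: restart smoothing
            else:
                self.state = (sx + self.alpha * (x - sx),
                              sy + self.alpha * (y - sy))
        return self.state
```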
Article
Full-text available
We present a novel vision-based perceptual user interface for hands-free text entry that utilizes face detection and visual gesture detection to manipulate a scrollable virtual keyboard. A thorough experimentation was undertaken to quantitatively define a performance of the interface in hands-free pointing, selection and scrolling tasks. The experi...
Conference Paper
Full-text available
Gaze data processing is an important and necessary step in gaze-based applications. This study focuses on the comparison of several gaze-to-object mapping algorithms using various dwell times for selection and presenting targets of several types and sizes. Seven algorithms found in literature were compared against two newly designed algorithms. The...
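The simplest of the gaze-to-object mapping algorithms compared in such studies is nearest-object mapping within a tolerance radius; a minimal sketch (hypothetical names and radius):

```python
def map_gaze_to_object(gaze, objects, max_dist=60.0):
    """Map a gaze point to the closest object centre within `max_dist`
    pixels; return None when nothing is near enough. `objects` maps
    object names to (x, y) centres."""
    best, best_d = None, max_dist
    for name, (ox, oy) in objects.items():
        d = ((gaze[0] - ox) ** 2 + (gaze[1] - oy) ** 2) ** 0.5
        if d < best_d:
            best, best_d = name, d
    return best
```

More elaborate algorithms accumulate evidence over multiple samples before committing a dwell to an object.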
Article
Analysis of gaze data collected during experiments is an essential part of any study that deals with eye tracking. In turn, analysis of gaze data is not a trivial procedure, on account of the nature of eye movement, peculiarities of human vision, and the imperfection of the measurement tools. In this chapter, we first draw out the most common issue...
Article
Full-text available
The quality and availability of eye tracking equipment has been increasing while costs have been decreasing. These trends increase the possibility of using eye trackers for entertainment purposes. Games that can be controlled solely through movement of the eyes would be accessible to persons with decreased limb mobility or control. On the other han...
Conference Paper
Full-text available
Ninja cursors aim to speed up target selection on large or multiple monitors. Several cursors are displayed on the screen with one of them selected as the active cursor. Eye tracking is used to choose the active cursor. An experiment with 13 participants showed that multiple cursors speed up the selection over long distances, but not over short dis...
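The cursor-activation rule described here, using eye tracking to pick the active cursor, amounts to choosing the cursor nearest the gaze point (an illustrative sketch, not the paper's implementation):

```python
def active_cursor(gaze, cursors):
    """Return the index of the cursor closest to the gaze point;
    that cursor becomes the active one while the others stay idle."""
    return min(range(len(cursors)),
               key=lambda i: (cursors[i][0] - gaze[0]) ** 2
                           + (cursors[i][1] - gaze[1]) ** 2)
```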
Conference Paper
Full-text available
Previous research shows that text entry by gaze using dwell time is slow, about 5-10 words per minute (wpm). These results are based on experiments with novices using a constant dwell time, typically between 450 and 1000 ms. We conducted a longitudinal study to find out how fast novices learn to type by gaze using an adjustable dwell time. Our resu...
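In the study the dwell time was adjusted by the participants themselves; purely as an illustration, one hypothetical rule for adapting it automatically (not the study's procedure) could shorten the dwell after confirmed selections and lengthen it after cancellations:

```python
def adjust_dwell(dwell_ms, cancelled, step=40, lo=200, hi=1000):
    """Hypothetical adaptive rule: shorten the dwell after a confirmed
    selection, lengthen it after a cancelled one, clamped to [lo, hi]."""
    dwell_ms = dwell_ms + step if cancelled else dwell_ms - step
    return max(lo, min(hi, dwell_ms))
```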
Article
Full-text available
In eye typing, a full on-screen keyboard often takes a lot of space because the inaccuracy in eye tracking requires big keys. We propose "scrollable keyboards" where one or more rows are hidden to save space. Results from an experiment with 8 expert participants show that the typing speed reduced by 51.4% for a 1-row keyboard and 25.3% for a 2-row...
Conference Paper
Full-text available
We tracked the eye movements of 18 students as they translated three short texts with different complexity levels under three different time constraints. Participants with touch typing skills were found to attend more to on-screen text than participants without touch typing skills. Time pressure was found to mainly affect fixations on the source te...
Conference Paper
Full-text available
The increase in usage of eye-tracking technology to study text translation processes has revealed the need for effective tools for visualization of the data collected. We propose a new method for gaze and keystroke data visualization. The visualization utilizes the preprocessed gaze data, where detected fixations are linked to the corresponding wor...
Article
Full-text available
It has been suggested that visualization tools are necessary for facilitating the understanding of large volumes of data because the visual cortex dominates perception. Moreover, the key aspects of the perception process occur rapidly without conscious thought [1]. The huge amount of data collected during the experiments on gaze behavi...
Article
Full-text available
IComponent is a software tool we have developed to facilitate basic research on eye movements as well as applications of eye gaze for computer input. Despite the variety of eye trackers and gaze-data analysis tools available today, there is still a gap between what researchers need and what products are available on the market to suit their needs....
Article
Full-text available
Usability testing is widely used today to determine, among other things, the quality of web site designs. To help the researchers, a number of techniques have been suggested for visualizing the eye tracker's data. Using one of the most popular techniques, gaze fixations are plotted in 2-D against the stimulus image in the background. However, there...
Conference Paper
Full-text available
Eye trackers have been used as pointing devices for a number of years. Due to inherent limitations in the accuracy of eye gaze, however, interaction is limited to objects spanning at least one degree of visual angle. Consequently, targets in gaze-based interfaces have sizes and layouts quite distant from "natural settings". To accommodate accuracy...
Conference Paper
Full-text available
With recent advances in eye tracking technology, eye gaze gradually gains acceptance as a pointing modality. Its relatively low accuracy, however, determines the need to use enlarged controls in eye-based interfaces rendering their design rather peculiar. Another factor impairing pointing performance is deficient robustness of an eye tracker's cali...