About
73 Publications · 50,133 Reads
2,869 Citations since 2017
Introduction
Päivi Majaranta currently works at the Faculty of Information Technology and Communication Sciences (ITC), Tampere University, where she teaches courses on usability and human-centered design methods. Päivi does research in human-computer interaction, with particular expertise in eye tracking and gaze-based HCI. Her research interests also include user experience, accessibility, multimodal interaction, and animal-computer interaction.
Additional affiliations
January 2011 - present
Publications (73)
Movement sensor data from seven static and dynamic dog behaviours (sitting, standing, lying down, trotting, walking, playing, and (treat) searching, i.e. sniffing) were collected from 45 medium- to large-sized dogs with six-degree-of-freedom movement sensors attached to the collar and the harness. With 17 dogs the collection procedure was repeated. Th...
Leaving a dog home alone is part of everyday life for most dog owners. Previous research shows that dog–owner relationship has multifarious effects on dog behavior. However, little is known about the interplay between dog–owner relationship, physical activity of the dog, and affective experiences at the time of the owner leaving home and reunion wh...
In this paper, we present a review of how the various aspects of any study using an eye tracker (such as the instrument, methodology, environment, participant, etc.) affect the quality of the recorded eye-tracking data and the obtained eye-movement and gaze measures. We take this review to represent the empirical foundation for reporting guidelines...
Dog owners’ understanding of the daily behaviour of their dogs may be enhanced by movement measurements that can detect repeatable dog behaviour, such as levels of daily activity and rest as well as their changes. The aim of this study was to evaluate the performance of supervised machine learning methods utilising accelerometer and gyroscope data...
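The kind of supervised pipeline the abstract above describes can be sketched as follows. This is an illustrative sketch only, not the study's actual method: the window length, the mean/std features, and the random-forest classifier are all assumptions, and the data here is synthetic.

```python
# Illustrative sketch: classifying behaviour classes from windowed
# 6-DoF movement-sensor data with a supervised model. Sensor layout,
# window size, and features are assumptions, not the study's pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def extract_features(window):
    """Per-axis mean and standard deviation of one sensor window."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

# Synthetic stand-in for accelerometer + gyroscope recordings:
# 200 windows x 100 samples x 6 axes, with two fake behaviour classes.
windows = rng.normal(size=(200, 100, 6))
labels = rng.integers(0, 2, size=200)
windows[labels == 1] += 1.5          # give class 1 a separable offset

X = np.array([extract_features(w) for w in windows])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))
```

In a real setting the windows would come from time-aligned collar/harness recordings and the labels from annotated video, with evaluation done per-dog rather than per-window to avoid leakage.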
Gaze provides an attractive input channel for human-computer interaction because of its capability to convey the focus of interest. Gaze input allows people with severe disabilities to communicate with eyes alone. The advances in eye tracking technology and its reduced cost make it an increasingly interesting option to be added to the conventional...
In gesture-based user interfaces, the effort needed for learning the gestures is a persistent problem that hinders their adoption in products. However, people's natural gaze paths form shapes during viewing. For example, reading creates a recognizable pattern. These gaze patterns can be utilized in human-technology interaction. We experimented with...
Human augmentation is a field of research that aims to enhance human abilities through medicine or technology. This has historically been achieved by consuming chemical substances that improve a selected ability or by installing implants which require medical operations. Both of these methods of augmentation can be invasive. Augmented abilities hav...
Dog owners are highly motivated in understanding behavior and physiology of their pets and monitoring their wellbeing. Monitoring with commercially available activity trackers reveals levels of daily activity and rest, but recognizing the behavior of the dog would provide additional information, especially when the dog is not under supervision. In...
Today, the markets are populated with dozens of devices for dog activity tracking and monitoring. Our aim was to get detailed insights on how dog owners use these devices in their everyday life. Two studies, an interview study with Finnish dog owners (N=7) and an international online questionnaire (N=35), were done to capture the motivations for usi...
Exploratory analysis of gaze data requires methods that make it possible to process large amounts of data while minimizing human labor. The conventional approach in exploring gaze data is to construct heatmap visualizations. While simple and intuitive, conventional heatmaps do not clearly indicate differences between groups of viewers or give estim...
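The conventional heatmap approach mentioned above can be sketched in a few lines: histogram the gaze points over the screen and smooth with a Gaussian kernel. The screen size, bin size, and kernel width here are assumptions for illustration, and the gaze data is synthetic.

```python
# Minimal sketch of a conventional gaze heatmap: 2D-histogram the gaze
# samples and blur with a Gaussian. Screen dimensions, binning, and
# sigma are illustrative assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(1)
width, height = 640, 480

# Synthetic gaze samples clustered around two areas of interest.
gaze = np.vstack([
    rng.normal([160, 120], 20, size=(500, 2)),
    rng.normal([480, 360], 20, size=(500, 2)),
])

hist, _, _ = np.histogram2d(
    gaze[:, 0], gaze[:, 1],
    bins=[width // 8, height // 8],
    range=[[0, width], [0, height]])

heatmap = gaussian_filter(hist, sigma=2)
print(heatmap.shape)  # → (80, 60)
```

As the abstract notes, such a map is intuitive but says nothing about between-group differences or statistical reliability; that is the gap the reviewed methods aim to fill.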
The functionality of three dry electrocardiogram electrode constructions was evaluated by measuring canine heart rate during four different behaviors: Standing, sitting, lying and walking. The testing was repeated (n = 9) in each of the 36 scenarios with three dogs. Two of the electrodes were constructed with spring-loaded test pins while the third...
This workshop focuses on the use and influence of technology on human-animal bonding, and how to facilitate them with technology. We explore the elements and characteristics of human-animal bonding, and how technology is connected to emotions and bonding between the human and the animal. We are particularly interested in animal's experiences, emoti...
Vibrotactile feedback is widely used in mobile devices because it provides a discreet and private feedback channel. Gaze based interaction, on the other hand, is useful in various applications due to its unique capability to convey the focus of interest. Gaze input is naturally available as people typically look at things they operate, but feedback...
Gaze gestures are a promising input technology for wearable devices especially in the smart glasses form factor because gaze gesturing is unobtrusive and leaves the hands free for other tasks. We were interested in how gaze gestures can be enhanced with vibrotactile feedback. We studied the effects of haptic feedback on the head and haptic promptin...
Previous work on eye tracking and eye-based human-computer interfaces mainly concentrated on making use of the eyes in traditional desktop settings. With the recent growth of interest in wearable computers, such as smartwatches, smart eyewear and low-cost mobile eye trackers, eye-based interaction techniques for mobile computing are becoming incre...
Larger tablet computers are not always easy to use in handheld configurations. Gaze input and especially gaze gestures provide an alternative input technology in such situations. We investigated the task performance and user experience in gaze gesture use when haptic feedback was provided either to fingers touching the tablet or behind the ears thr...
Modern interaction techniques like non-intrusive gestures provide means for interacting with distant displays and smart objects without touching them. We were interested in the effects of feedback modality (auditory, haptic or visual) and its combined effect with input modality on user performance and experience in such interactions. Therefore, we...
In a study with 12 participants we compared two smooth pursuit based widgets and one dwell time based widget in adjusting a continuous value. The circular smooth pursuit widget was found to be about equally efficient as the dwell based widget in our color matching task. The scroll bar shaped smooth pursuit widget exhibited lower performance and low...
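Smooth-pursuit widgets like those above are commonly driven by correlating the gaze trace with a moving on-screen target and accepting the target whose trajectory the gaze follows best. The sketch below shows that idea with a Pearson correlation; it is a hedged illustration, not necessarily the detection scheme used in the study.

```python
# Hedged sketch of smooth-pursuit detection: correlate gaze with a
# moving target's trajectory and accept when correlation is high.
# The threshold and signals are illustrative assumptions.
import numpy as np

def pursuit_score(gaze_x, target_x):
    """Pearson correlation between gaze and target horizontal positions."""
    return float(np.corrcoef(gaze_x, target_x)[0, 1])

t = np.linspace(0, 2 * np.pi, 100)
target = np.sin(t)                   # target moving sinusoidally
following = target + np.random.default_rng(2).normal(0, 0.05, t.size)
fixating = np.random.default_rng(3).normal(0, 0.05, t.size)

print(pursuit_score(following, target) > 0.9,
      pursuit_score(fixating, target) > 0.9)  # → True False
```

A continuous-value widget, such as the circular one described above, would map the target's phase at selection time to the adjusted value.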
Proper feedback is essential in gaze based interfaces, where the same modality is used for both perception and control. We measured how vibrotactile feedback, a form of haptic feedback, compares with the commonly used visual and auditory feedback in eye typing. Haptic feedback was found to produce results that are close to those of auditory feedbac...
NordiCHI'14 conference attendees got hands-on experience with a number of great new interactive systems. Among the accepted poster, video, and demo submissions, we selected the following four prototypes to illustrate the high-quality design research displayed during the conference, which was held in Helsinki, Finland, October 26--30, 2014. Mikael W...
Wearable computing devices are gradually becoming common, and head-mounted devices such as Google Glass are already available. These devices present new interaction challenges as the devices are usually small in size, and also the usage environment sets limitations on the available interaction modalities. One potential interaction method could be t...
Eye tracking enables automatic scrolling based on natural viewing behavior. We were interested in the effects of haptic feedback on gaze behavior and user experience. We conducted an experiment where haptic feedback was used to forewarn the reader that their gaze had entered an active scrolling area. Results show no statistical differences between...
Haptic feedback can improve the usability of gaze gestures in mobile devices. However, the benefit is highly sensitive to the exact timing of the feedback. In practical systems the processing and transmission of signals takes some time, and the feedback may be delayed. We conducted an experiment to determine limits on the feedback delays. The resu...
Anticipating the emergence of gaze tracking capable mobile devices, we are investigating the use of gaze as an input modality in handheld mobile devices. We conducted a study of combining gaze gestures with vibrotactile feedback. Gaze gestures were used as an input method in a mobile device and vibrotactile feedback as a new alternative way to give...
Eye tracking has a long history in medical and psychological research as a tool for recording and studying human visual behavior. Real-time gaze-based text entry can also be a powerful means of communication and control for people with physical disabilities. Following recent technological advances and the advent of affordable eye trackers, there is...
Eye tracking input often relies on visual and auditory feedback. Haptic feedback offers a previously unused alternative to these established methods. We describe a study to determine the natural time limits for haptic feedback to gazing events. The target is to determine how much time we can use to evaluate the user gazed object and decide if we ar...
Compared to the mouse, eye pointing is inaccurate. As a consequence, small objects are difficult to point by gaze alone. We suggest using a combination of eye pointing and subtle head movements to achieve accurate hands-free pointing in a conventional desktop computing environment. For tracking the head movements, we exploited information of the ey...
In this position paper we argue of the need for a more complete definition of gaze interaction events. The need is especially clear in non-desktop use scenario. We ask on one hand what the needed gaze events are and on the other hand what can eye tracking technology possibly deliver in dynamic everyday settings. Current understanding on gaze events...
In this chapter, the authors explore possibilities to use novel face and gaze tracking technology in educational applications, especially in interactive teaching agents for second language learning. They focus on non-verbal feedback that provides information about how well the speaker has understood the presented information, and how well the inter...
We propose a combination of gaze pointing and head gestures for enhanced hands-free interaction. Instead of the traditional dwell-time selection method, we experimented with five simple head gestures: nodding, turning left/right, and tilting left/right. The gestures were detected from the eye-tracking data by a range-based algorithm, which was foun...
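A range-based gesture detector of the kind mentioned above can be sketched as follows: flag a nod when a vertical position signal leaves its resting range by more than a threshold and then returns. The signal source, threshold, and units are hypothetical illustrations, not the authors' algorithm.

```python
# Hypothetical range-based "nod" detector: the signal must dip below
# rest - threshold and then come back near rest. Threshold and units
# are illustrative assumptions.
def detect_nod(signal, rest=0.0, threshold=0.5):
    """Return True if signal dips below rest - threshold and recovers."""
    dipped = False
    for value in signal:
        if value < rest - threshold:
            dipped = True
        elif dipped and abs(value - rest) < threshold / 2:
            return True          # signal returned to the resting range
    return False

nod = [0.0, -0.1, -0.8, -0.9, -0.3, 0.05, 0.0]   # downward dip and back
still = [0.0, 0.1, -0.1, 0.05, 0.0]              # stays in resting range
print(detect_nod(nod), detect_nod(still))  # → True False
```

Turn and tilt gestures would use the same range test on horizontal position and roll-angle signals, respectively.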
Early work on applied eye tracking investigated gaze as an input modality to interact with a desktop computer and discussed some of the human factors and technical aspects involved in performing common computer tasks with the eyes such as pointing and menu selection. Since then, eye tracking technology has considerably matured. Research on eye-base...
There are several ways to write by gaze. In a typical setup, gaze direction is used to point and dwell-select letters on an on-screen keyboard. Alternatively, if the person cannot fixate, the eyes can be used as switches using blinks or rough gestures to select items. This chapter introduces different ways to enter text by gaze and reviews related...
Severely disabled people will often spend a significant part of their waking day using gaze control. Technology has a positive impact on many areas of their life. What simple features do people who have severe and complex disabilities need to use gaze control technology? In this chapter, we consider features that are enhancing the effective use of...
Technical aids can contribute towards improved health and satisfaction in life by giving the user increased possibilities for participation in a number of areas of daily life. Assessing people with disabilities that affect their motor, communication and cognitive skills can be a complex matter. The result of an assessment might be the user’s only w...
Recent developments in mobile eye tracking equipment and automated eye movement analysis point the way toward unobtrusive eye-based human-computer interfaces that are pervasively usable in everyday life. We call this new paradigm pervasive eye tracking - continuous eye monitoring and analysis 24/7. PETMEI 2011 provides a forum for researchers from h...
Digital systems, such as phones, computers and PDAs, place continuous demands on our cognitive and perceptual systems. They offer information and interaction opportunities well above our processing abilities, and often interrupt our activity. Appropriate allocation of attention is one of the key factors determining the success of creative activitie...
Gaze interaction, as understood in this book, provides a means to exploit information from eye gaze behaviour during human-technology interaction. Gaze can either be used as an explicit control method that enables the user to point at and select items, or information from the user's natural gaze behaviour can be exploited subtly in the background a...
Gaze-controlled computers had already been utilized successfully for well over two decades before the COGAIN project started. However, those actually benefitting from the technology were relatively few compared to the numbers who needed it. During the five-year course of the project, however, systems, software and strategies were developed that...
Most of the studies presented in this book use camera-based gaze trackers, used to monitor where the user is looking on screen or in the 3D environment. While a gaze tracker may resemble a blackbox device that miraculously determines the point of gaze, it actually consists of several hardware and software components with specific purposes. This cha...
Recent advances in eye tracking technology will allow for a proliferation of new applications. Improvements in interactive methods using eye movement and gaze control could result in faster and more efficient human computer interfaces, benefitting users with and without disabilities. Gaze Interaction and Applications of Eye Tracking: Advances in As...
Environmental control is the control, operation, and monitoring of an environment via intermediary technology such as a computer. Typically this means control of a domestic home. Within the scope of COGAIN, this environmental control concerns the control of the personal environment of a person (with or without a disability). This defines environment...
This paper presents a principled analysis of various combinations of image features to determine their suitability for remote eye tracking. It begins by reviewing the basic theory underlying the connection between eye image and gaze direction. Then a ...
Previous research shows that text entry by gaze using dwell time is slow, about 5-10 words per minute (wpm). These results are based on experiments with novices using a constant dwell time, typically between 450 and 1000 ms. We conducted a longitudinal study to find out how fast novices learn to type by gaze using an adjustable dwell time. Our resu...
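The dwell-time selection mechanism underlying the study above can be sketched as a simple loop: a key is selected once gaze has rested on it for the full dwell time, and the threshold is an adjustable parameter, as in the adjustable-dwell design described. The sample format and key names are illustrative assumptions.

```python
# Illustrative dwell-time selection loop. Assumptions: timestamps are
# in seconds, each sample is (timestamp, key-under-gaze), and the dwell
# threshold is user-adjustable as in the study described above.
def dwell_select(samples, dwell_time=0.5):
    """samples: list of (timestamp, key) gaze samples. Returns selections."""
    selections = []
    current_key, enter_time = None, None
    for t, key in samples:
        if key != current_key:
            current_key, enter_time = key, t   # gaze entered a new key
        elif t - enter_time >= dwell_time:
            selections.append(key)             # dwell completed: select
            current_key, enter_time = None, None  # require re-entry
    return selections

samples = [(0.0, 'H'), (0.3, 'H'), (0.6, 'H'),   # dwell completes on 'H'
           (0.7, 'I'), (0.8, 'I'),
           (1.0, 'I'), (1.3, 'I')]               # then completes on 'I'
print(dwell_select(samples, dwell_time=0.5))  # → ['H', 'I']
```

Lowering `dwell_time` speeds up entry but increases accidental selections, which is exactly the trade-off an adjustable dwell lets practiced users manage themselves.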
In eye typing, a full on-screen keyboard often takes a lot of space because the inaccuracy in eye tracking requires big keys. We propose "scrollable keyboards" where one or more rows are hidden to save space. Results from an experiment with 8 expert participants show that the typing speed was reduced by 51.4% for a 1-row keyboard and 25.3% for a 2-row...
Text entry by eye gaze is used by people with severe motor disabilities. An eye tracking device follows the user's eye movements, and a computer program analyzes the gaze behavior. To type by gaze, the user typically points at the characters on an on-screen keyboard by looking at them and selects them by means of dwell time, a prolonged gaze that s...
People use their eyes mainly for observation, but people also use gaze to enhance communication; for example, staring at somebody soon causes a reaction: "What? Do you want something?" Similarly, an intense look at a water jug may be enough to motivate someone at a dinner party to pour more water for you. Thus the direction of a person's gaze not onl...
For a long time, eye tracking has been thought of as a promising method for usability testing. During the last couple of years, eye tracking has finally started to live up to these expectations, at least in terms of its use in usability laboratories. We know that the user's gaze path can reveal usability issues that would otherwise go unnoticed, bu...
Dasher is one of the best known inventions in the area of text entry in recent years. It can be used with many input devices, but studies on user performance with it are still scarce. We ran a longitudinal study where 12 participants transcribed Finnish text with Dasher in ten 15-minute sessions using a Tobii 1750 eye tracker as a pointing device....
This chapter reveals that for understanding the prospects and problems of text entry by gaze, it is instrumental to know how eye-tracking devices work and to understand their limitations. Text entry by gaze is intended for users with disabilities. There are also other gaze controlled applications intended for the same user group. In one sense, text...
Eye typing provides a means of communication that is especially useful for people with disabilities. However, most related research addresses technical issues in eye typing systems, and largely ignores design issues. This paper reports experiments studying the impact of auditory and visual feedback on user perfor...
We used eye-tracking to study 28 users when they evaluated result lists produced by web search engines. Based on their different evaluation styles, the users were divided into economic and exhaustive evaluators. Economic evaluators made their decision about the next action (e.g., query re-formulation, following a link) faster and based on less info...
Existing static visualization techniques for eye-tracking data do not make it possible to easily compare temporal information, that is, gaze paths. We review existing techniques and then propose a new technique that treats time as the prime attribute to be visualized. We successfully used the new technique for analysing the visual scanning of web s...
Interactive applications that make use of eye tracking have traditionally been based on command-and-control. Applications that make more subtle use of eye gaze have recently become increasingly popular in the domain of attentive interfaces that adapt their behaviour based on the visual attention of the user. We provide a review of the main systems...
The tutorial provides examples, experiences and design guidelines for using eye-gaze in human-computer interaction. The goal of the tutorial is to give insight into exploiting the information about gaze direction in human-computer interaction. The participants will learn the basics of eye-tracking, but the focus of the tutorial is on the interactio...
This report was compiled in the Usability Research Methods (Käytettävyystutkimuksen menetelmät) seminar held at the Department of Computer Sciences, University of Tampere, in spring 2004.
The report is available in the TamPub collection:
http://urn.fi/URN:ISBN:978-951-44-9724-7
Eye typing provides means of communication especially for people with severe disabilities. Recent research indicates that the type of feedback impacts typing speed, error rate, and the user's need to switch her gaze between the on-screen keyboard and the typed text field. The current study focuses on the issues of feedback when a short dwell time (...
Proactive computing reinforces the need for non-command interfaces that fulfill the user's intentions without explicit commands. Eye gaze is a natural modality to be used in this connection, because it is also used proactively in everyday life. Previously eye-tracking has been used mostly for research purposes or in applications targeted for users...
We describe a study on how auditory and visual feedback affects eye typing. Results show that the feedback method influences both text entry speed and error rate. In addition, a proper feedback mode facilitates eye typing by reducing the user's need to switch her gaze between the on-screen keyboard and the typed text field.
Eye typing provides a means of communication for severely handicapped people, even those who are only capable of moving their eyes. This paper considers the features, functionality and methods used in the eye typing systems developed in the last twenty years. Primarily concerned with text production, the paper also addresses other communication relat...
Eye-aware applications have existed for a long time, but mostly for very special and restricted target populations. We have designed and are currently implementing an eye-aware application, called iDict, which is a general-purpose translation aid aimed at mass markets. iDict monitors the user's gaze path while s/he is reading text written in a foreign lan...
We have constructed an audio-visual text-to-speech synthesizer for Finnish by combining a facial model with an acoustic speech synthesizer. The quality of the visual speech synthesizer has been evaluated twice. In addition, we have started to refine the facial model taking a more physiologically and anatomically based approach. The synthesizer will...