
Controlling Image-Stylization Techniques using Eye Tracking (Presentation)

Authors: Maximilian Söchting, Matthias Trapp

Abstract

Presentation of research paper "Controlling Image-Stylization Techniques using Eye Tracking".
Controlling Image-Stylization Techniques
using Eye Tracking
MAXIMILIAN SÖCHTING, MATTHIAS TRAPP
HASSO PLATTNER INSTITUTE, FACULTY OF DIGITAL ENGINEERING, UNIVERSITY OF POTSDAM, GERMANY
HUCAPP 2020 - 4TH INTERNATIONAL CONFERENCE ON
HUMAN COMPUTER INTERACTION THEORY AND APPLICATION
27.02.2020 CONTROLLING IMAGE-STYLIZATION TECHNIQUES USING EYE TRACKING HUCAPP 2020 VALLETTA, MALTA
Context
We live in the “Digital Age”
Smartphones with high-quality photo sensors are widespread
High-speed mobile data network access is affordable for the majority of users
Mobile apps such as Instagram or Snapchat have emerged
Very popular (1 billion and 300 million monthly active users, respectively, as of 2019)
Images and short videos as primary communication medium
Observations
1. “Attention Economy”: platforms fight for the attention of their users
2. Periodic content ideal for high and regular user engagement
3. Focus shifts from the quality of the content to the frequency of creation and sharing
4. Devaluation of the image medium and communicated content?
Approach
1. Support of dynamic, diverting, and temporary artworks
2. Creation using highly customizable real-time image abstraction techniques
3. Creation process controlled by the user using a novel interface: eye tracking
The Minimal Setup
Interactions
Different interaction techniques that explore the system limits and goals
Shifting between different effects or presets, revealing artwork from black, etc.
Game-like environment
Constraints: e.g. not blinking
Goal/reward: abstracted, personal artwork
Reversed power dynamic in contrast to traditional user-assisting interfaces
Interactions
Revelation-Mode:
Fade from black to an abstract artwork wherever the user gazes.
Catch-Me-Mode:
Blend between different parameters / presets / LoD / effects wherever the user gazes.
Submerging-Mode:
Fixating a point reveals different parameters / presets / LoD / effects in a confined area.
Do-not-Blink-Mode:
Every blink changes the parameters / presets / LoD / effects of the displayed image.
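The four modes above can be summarized as a small gaze-event dispatcher. This is a minimal illustrative sketch, not the authors' implementation: the class and function names (`InteractionController`, `gaze_weight`), the radial-falloff mask, and the random preset swap on blink are all assumptions for illustration.

```python
import math
import random
from enum import Enum, auto

class Mode(Enum):
    REVELATION = auto()    # fade from black around the gaze point
    CATCH_ME = auto()      # blend presets/effects around the moving gaze point
    SUBMERGING = auto()    # fixation reveals a different preset in a confined area
    DO_NOT_BLINK = auto()  # every blink swaps the active preset

def gaze_weight(px, py, gx, gy, radius=120.0):
    """Radial falloff around the gaze point: 1.0 at the gaze, 0.0 outside."""
    d = math.hypot(px - gx, py - gy)
    return max(0.0, 1.0 - d / radius)

class InteractionController:
    def __init__(self, presets, mode=Mode.REVELATION):
        self.presets = presets  # list of effect parameter sets (presets)
        self.mode = mode
        self.active = 0         # index of the currently shown preset

    def on_blink(self):
        # Do-not-Blink: a blink switches to a random preset.
        if self.mode is Mode.DO_NOT_BLINK:
            self.active = random.randrange(len(self.presets))

    def mask_value(self, px, py, gaze):
        # Per-pixel blend factor between the base image and the stylized image.
        if self.mode in (Mode.REVELATION, Mode.CATCH_ME, Mode.SUBMERGING):
            return gaze_weight(px, py, *gaze)
        return 1.0  # Do-not-Blink shows the full frame; blinks change the preset
```

In a real system the per-pixel mask would be evaluated on the GPU and the blink event would come from the eye tracker's API; the sketch only shows the control flow shared by the four modes.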
Related Work
Eye tracking as a direct input device (vs. analytical use)
Case study: air traffic control [Alonso et al. 2013]
Interactive art installation using eye tracking:
game_of_life [Satomi et al. 2007] allows players to move in a virtual environment using their eye gaze
“Molecular Informatics” [Mikami 1996] creates 3D molecule structures in VR based on eye movements
Related Work
Image abstraction techniques with multiple Levels-of-Control [Semmo et al. 2016]
Local (mask-based) and global parameter adjustments
Effect presets (sets of parameter values)
Pipeline presets (sets of effects with sets of parameter values)
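The three levels of control can be sketched as a simple data model. This is a hedged illustration, assuming hypothetical effect names (`oil_paint`, `watercolor`, `edge_smooth`) and parameter names; the actual framework of Semmo et al. defines effects in its own scheme.

```python
def apply_effect(image, name, params):
    """Placeholder for a single parameterized image filter stage."""
    return f"{name}({image}, {sorted(params.items())})"

# Effect preset: a named set of parameter values for one effect.
oil_paint_soft = {"effect": "oil_paint", "params": {"brush_size": 4, "wetness": 0.7}}

# Pipeline preset: an ordered list of effects, each with its own parameter set.
watercolor_pipeline = [
    {"effect": "edge_smooth", "params": {"sigma": 2.0}},
    {"effect": "watercolor",  "params": {"granulation": 0.5, "wetness": 0.9}},
]

def run_pipeline(image, pipeline, overrides=None):
    """Apply each effect in order; 'overrides' models a global parameter
    adjustment layered on top of the preset values."""
    for stage in pipeline:
        params = {**stage["params"], **(overrides or {}).get(stage["effect"], {})}
        image = apply_effect(image, stage["effect"], params)
    return image
```

Local (mask-based) adjustments would add a third layer, resolving parameters per pixel region rather than per image; the sketch stops at the global level for brevity.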
Overview of Software Architecture
Implementation of Interaction Techniques
Results (1)
Described concept implemented in the context of existing frameworks
Painting controlled through eye tracker interaction in real-time
High processing load during blink events, otherwise interactive
Access to all previously developed effects through standardized effect format
Results (2)
Highly conscious eye movements are required in Revelation-Mode (reveal from black)
Since peripheral vision controls the gaze, it goes against human intuition
Analogy: relaxation exercises and even therapy of mental health conditions [Vaughan et al. 1994]
Potential: naturally guide the gaze of users through local changes in the image
Future Work
Additional sensors – extension to “interactive art framework”
Presence sensors, ambient light sensors, buttons
Further development of game-like features for a “full experience”
Integration of a “reward” component such as a printed photo in case of success
Medical application through guided gazes that assist in relaxation or therapy
Naturally induce certain gaze movements through local changes in the image
Maximilian Söchting
maximilian.soechting@student.hpi.de
Matthias Trapp
matthias.trapp@hpi.de
Literature
Alonso, R., Causse, M., Vachon, F., Parise, R., Dehais, F., Terrier, P.: Evaluation of head-free eye tracking as an input device for air traffic control. Ergonomics 56(2), 246–255 (2013).
Mikami, S.: Molecular Informatics. Morphogenic substance via eye tracking. Database of Virtual Art (1996). http://www.virtualart.at/database/general/work/molecular-informatics-morphogenic-substance-via-eye-tracking.html
Satomi, M., Sommerer, C.: "game_of_life": interactive art installation using eye-tracking interface. (2007), pp. 246–247. doi:10.1145/1255047.1255107.
Semmo, A., Dürschmid, T., Trapp, M., Klingbeil, M., Döllner, J., Pasewaldt, S.: Interactive image filtering with multiple levels-of-control on mobile devices. Proceedings SIGGRAPH Asia Mobile Graphics and Interactive Applications (MGIA), ACM (2016), pp. 2:1–2:8. doi:10.1145/2999508.2999521.
Soler-Adillon, J.: The intangible material of interactive art: agency, behavior and emergence. Artnodes 16, 43–52 (2015).
Vaughan, K., Armstrong, M. S., Gold, R., O'Connor, N., Jenneke, W., Tarrier, N.: A trial of eye movement desensitization compared to image habituation training and applied muscle relaxation in post-traumatic stress disorder. Journal of Behavior Therapy and Experimental Psychiatry 25(4), 283–291 (1994). doi:10.1016/0005-7916(94)90036-1. URL: http://www.sciencedirect.com/science/article/pii/0005791694900361