Kris Luyten’s research while affiliated with Flanders Make and other places


Publications (304)


Fig. 2. Example of a Chernoff Bot, featuring callouts describing various design aspects. Each variation referenced in Table 2 has been thoughtfully translated into distinct features of the robot, illustrating how different data attributes are represented visually.
Overview of the different variations on the training data, model parameters, and training process and their respective representation in the robot's face.
AI-Spectra: A Visual Dashboard for Model Multiplicity to Enhance Informed and Transparent Decision-Making
  • Preprint
  • File available

November 2024 · 3 Reads

Gilles Eerlings · Sebe Vanbrabant · Jori Liesenborgs · [...] · Kris Luyten

We present an approach, AI-Spectra, to leverage model multiplicity for interactive systems. Model multiplicity means using slightly different AI models that yield equally valid outcomes or predictions for the same task, thus relying on many simultaneous "expert advisors" that can have different opinions. Multiple AI models that generate potentially divergent results for the same task are challenging for users to deal with. Model multiplicity helps users understand that AI models are not always correct and may differ, but it can also cause information overload when users are confronted with multiple results instead of one. AI-Spectra leverages model multiplicity through a visual dashboard designed to convey which AI models generate which results, while minimizing the cognitive effort needed to detect consensus among models and to see which types of models might have different opinions. We use a custom adaptation of Chernoff faces for AI-Spectra: Chernoff Bots. This visualization technique lets users quickly interpret complex, multivariate model configurations and compare predictions across multiple models. Our design builds on established Human-AI Interaction guidelines and well-known practices in information visualization. We validated our approach through a series of experiments training a wide variety of models on the MNIST dataset to perform digit recognition. Our work contributes to the growing discourse on making AI systems more transparent, trustworthy, and effective through the strategic use of multiple models.
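
To make the idea of model multiplicity concrete, the following minimal sketch (an illustration under assumed settings, not the AI-Spectra implementation) trains a handful of slightly varied classifiers and measures how often they agree on each test sample; scikit-learn's load_digits is used as a lightweight stand-in for MNIST, and the hyperparameter variations are purely illustrative.

    # Minimal sketch of model multiplicity, not the AI-Spectra implementation:
    # a few classifiers with slightly different hyperparameters act as
    # simultaneous "expert advisors", and per-sample consensus is measured.
    # scikit-learn's load_digits serves as a lightweight stand-in for MNIST.
    import numpy as np
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Illustrative variations on model parameters; each yields a comparably valid model.
    variations = [
        {"hidden_layer_sizes": (32,), "alpha": 1e-4, "max_iter": 300},
        {"hidden_layer_sizes": (64,), "alpha": 1e-3, "max_iter": 300},
        {"hidden_layer_sizes": (32, 16), "alpha": 1e-4, "max_iter": 300},
    ]
    models = [MLPClassifier(random_state=i, **params).fit(X_train, y_train)
              for i, params in enumerate(variations)]

    # One row of predictions per model, one column per test sample.
    predictions = np.array([m.predict(X_test) for m in models])

    # Consensus per sample: share of models agreeing with the majority vote.
    majority_share = np.array([
        np.bincount(predictions[:, j], minlength=10).max() / len(models)
        for j in range(predictions.shape[1])
    ])
    print("mean agreement across models:", majority_share.mean())
    print("samples without full consensus:", int((majority_share < 1.0).sum()))

In a dashboard like AI-Spectra, such model variations and their per-sample predictions are what the Chernoff Bots encode visually, so consensus and dissent can be spotted at a glance.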

ViRgilites: Multilevel Feedforward for Multimodal Interaction in VR

June 2024 · 15 Reads · 2 Citations

Proceedings of the ACM on Human-Computer Interaction

Navigating the interaction landscape of Virtual Reality (VR) and Augmented Reality (AR) presents significant complexities due to the plethora of available input hardware and interaction modalities, compounded by spatially diverse visual interfaces. Such complexities elevate the likelihood of user errors, necessitating frequent backtracking. To address this, we introduce ViRgilites, a virtual guidance framework that delivers multi-level feedforward information covering the available interaction techniques as well as future possibilities for interacting with virtual objects, anticipating the effects of interactions and how they fit with the user's overall goal. ViRgilites is engineered to facilitate task execution, empowering users to make informed decisions about action methodologies and alternative courses of action. This paper presents the architecture and functionality of ViRgilites and demonstrates its efficacy through a formative user study.
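
As a rough illustration of what multi-level feedforward might look like in code (a hypothetical sketch; the level names and data structures are assumptions, not the ViRgilites framework or API), the snippet below composes cues for a single virtual object, from coarse (which technique can be used) to fine (what the action will actually do).

    # Hypothetical sketch of multi-level feedforward cues for a virtual object;
    # the level names and structure are assumptions for illustration, not the
    # ViRgilites architecture or API.
    from __future__ import annotations
    from dataclasses import dataclass
    from enum import Enum, auto

    class Level(Enum):
        TECHNIQUE = auto()  # which input technique or modality can be used
        ACTION = auto()     # which actions the target object affords
        EFFECT = auto()     # what performing the action will do

    @dataclass
    class FeedforwardCue:
        level: Level
        target: str
        message: str

    def cues_for(target: str, modalities: list[str],
                 affordances: dict[str, str]) -> list[FeedforwardCue]:
        """Compose cues from coarse (usable technique) to fine (anticipated effect)."""
        cues = [FeedforwardCue(Level.TECHNIQUE, target, f"usable with {m}") for m in modalities]
        for action, effect in affordances.items():
            cues.append(FeedforwardCue(Level.ACTION, target, f"supports '{action}'"))
            cues.append(FeedforwardCue(Level.EFFECT, target, f"'{action}' will {effect}"))
        return cues

    # Example: a virtual valve reachable by hand grab or gaze-and-pinch.
    for cue in cues_for("valve", ["hand grab", "gaze + pinch"], {"rotate": "open the pipe"}):
        print(cue.level.name, "-", cue.message)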



Citations (55)


... This could be done by incorporating proactive strategies, such as feedback loops for user confirmation of critical information and preemptive detection of unfulfilled preconditions for a specific task. Feedforward techniques that allow users to anticipate the consequences of their actions have been shown to significantly decrease errors by enhancing user understanding of the system's behavior and potential outcomes [9,23,57]. Error handling mechanisms need to address both AI and user mistakes, including AI acknowledgment of its errors, user-initiated problem-solving, AI interruption for immediate user error correction, and related impact on learning and flow. Emotional regulation features can enhance the learning experience by reducing user frustration during corrections. ...

Reference:

An Interaction Design Toolkit for Physical Task Guidance with Artificial Intelligence and Mixed Reality
ViRgilites: Multilevel Feedforward for Multimodal Interaction in VR
  • Citing Article
  • June 2024

Proceedings of the ACM on Human-Computer Interaction

... In automotive design, the physical attributes of a car door significantly influence user experience [4], [5]. Similar research in haptic feedback devices and consumer products has shown that the tactile qualities of interfaces, such as knobs and buttons, directly affect user satisfaction and perceived quality [6]. However, the relationship between the physical attributes of a car door and their effects on affection remains elusive, requiring a more in-depth analysis [7]. ...

Substitute Buttons: Exploring Tactile Perception of Physical Buttons for Use as Haptic Proxies

Multimodal Technologies and Interaction

... Include physical measurements. Capturing data from physical objects has also been reported as challenging [43,53,65]. Participants reported difficulties in measuring curved and organic shapes using programming-based CAD. ...

Measurement Patterns: User-Oriented Strategies for Dealing with Measurements and Dimensions in Making Processes
  • Citing Conference Paper
  • April 2023

... A reference framework adding intelligibility to the behavior of a robotic system for improvement of predictability, trust, safety, usability, and acceptance of autonomous robotic systems is proposed by van Deurzen et al. [22]. It comprises an interactive, online, and visual dashboard to help identify where and when adding intelligibility to the interface design is required so that developers and designers can customise the interactions to improve the experience for people working with the robot. ...

Choreobot: A Reference Framework and Online Visual Dashboard for Supporting the Design of Intelligible Robotic Systems
  • Citing Article
  • June 2022

Proceedings of the ACM on Human-Computer Interaction

... Over the years, intelligibility has been included in various context-aware research [51,86] and is now becoming more important in IoT research as well [17,18,42]. During our user study, many participants selected the "turn on" action instead of the "toggle" action for the first application. ...

FortClash: Predicting and Mediating Unintended Behavior in Home Automation
  • Citing Article
  • June 2022

Proceedings of the ACM on Human-Computer Interaction

... In VRRobot [12], a robotic arm provides on-demand haptic sensations, while a motion platform allows the user to walk around in VR while staying stationary to remain within reach of the robotic arm. HapticPanel [36] uses a 2D motion platform instead of a robotic arm to provide a low-cost DIY approach. ...

HapticPanel: An Open System to Render Haptic Interfaces in Virtual Reality for Manufacturing Industry
  • Citing Conference Paper
  • December 2021

... In our own work, we used commercial fitness trackers such as fitness bands and smartwatches because our focus was on data representation and not on the development of new technologies. However, we acknowledge that many types of wearable displays have been proposed [11] and discuss some challenges related to these in our research agenda. ...

An Interactive Design Space for Wearable Displays
  • Citing Conference Paper
  • September 2021

... This type of impairment affects the interaction between user and interface. However, synchronizing interfaces with user needs is a beneficial approach in these situations (Biswas, Langdon, Umadikar, Kittusami, & Prashant, 2014; Heller, Vanacken, Geurts, & Luyten, 2020). Gaze interaction can be utilized to minimize users' close interaction with machines and equipment in high-risk environments or cases of any motor disability (Karlsson, Allsop, Dee-Price, & Wallen, 2018). ...

Impact of Situational Impairment on Interaction with Wearable Displays
  • Citing Conference Paper
  • October 2020

... With this background, researchers have proposed navigation methods that indicate the direction to be travelled by modulating music, so as not to disturb music listening while navigating [2], [3], [4], [5]. These studies have shown that localization using music modulation can provide the same level of navigation capability as voice guidance. ...

Attracktion: Field Evaluation of Multi-Track Audio as Unobtrusive Cues for Pedestrian Navigation
  • Citing Conference Paper
  • October 2020

... These behaviors are critical because they shape patient-provider interactions and significantly impact treatment outcomes. Scholars argue that healthcare strategies must adapt to individual emotional states and information preferences to improve these interactions and ensure more effective treatment [2,39]. The issue is of relevance to the HCI and CSCW community since digital platforms are increasingly leveraged by cancer survivors and caregivers to gain health information [4,8,20]. ...

Enhancing Patient Motivation through Intelligibility in Cardiac Tele-rehabilitation
  • Citing Article
  • March 2019

Interacting with Computers