Michael Nebeling’s research while affiliated with University of Michigan and other places


Publications (83)


SonoHaptics: An Audio-Haptic Cursor for Gaze-Based Object Selection in XR
  • Conference Paper

October 2024 · 5 Reads · 1 Citation · Michael Nebeling · [...]
Figure 2: The "bouba/kiki" effect: which one is 'bouba', and which is 'kiki'? Cross-modal correspondences enable us to perceive features, such as shape, across multiple sensory modalities, e.g., visually or aurally.
Figure 3: Perception study setup: participants were shown a cube that varied in color lightness and size. They used the right thumbstick of a Quest Pro controller to manipulate the pitch of an audio signal (left-right direction) and the amplitude of a vibration signal (up-down), and pressed the left controller's trigger button to confirm the best-matching pitch and amplitude. In-ear stereo earphones provided audio feedback. Four linear resonant actuators positioned at the cardinal directions on a wristband provided haptic feedback (wristband illustrated to maintain anonymity).
Figure 6: Pearson's correlation coefficients r for lightness/size-to-pitch/amplitude mappings. One-to-one mappings are those in which participants could change only one of pitch or amplitude at a time, while only one of lightness or size changed. Compound mappings are those in which participants could change both pitch and amplitude at once, while both the lightness and size of the cube changed simultaneously.
Figure 7: The evaluation study compared SonoHaptics against four baseline feedback techniques: No Feedback, Static, Text-to-speech, and Visual feedback.
Figure 9: Average selection time per feedback technique. The effect of technique was statistically significant.
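The correlation analysis reported in Figure 6 can be illustrated with a toy computation of Pearson's r. The data values below are invented for illustration only, not taken from the SonoHaptics study:

```python
# Toy illustration of a lightness-to-pitch correlation analysis
# (values are hypothetical, not from the SonoHaptics study).
import math

def pearson_r(xs, ys):
    """Pearson's correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-trial data: cube lightness (0-1) vs. pitch chosen by a
# participant (Hz). A strong positive r would indicate a consistent
# lightness-to-pitch mapping.
lightness = [0.1, 0.3, 0.5, 0.7, 0.9]
pitch_hz = [220, 310, 450, 600, 780]
print(round(pearson_r(lightness, pitch_hz), 3))
```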


SonoHaptics: An Audio-Haptic Cursor for Gaze-Based Object Selection in XR
  • Preprint
  • File available

September 2024 · 45 Reads

We introduce SonoHaptics, an audio-haptic cursor for gaze-based 3D object selection. SonoHaptics addresses challenges around providing accurate visual feedback during gaze-based selection in Extended Reality (XR), e.g., lack of world-locked displays in no- or limited-display smart glasses and visual inconsistencies. To enable users to distinguish objects without visual feedback, SonoHaptics employs the concept of cross-modal correspondence in human perception to map visual features of objects (color, size, position, material) to audio-haptic properties (pitch, amplitude, direction, timbre). We contribute data-driven models for determining cross-modal mappings of visual features to audio and haptic features, and a computational approach to automatically generate audio-haptic feedback for objects in the user's environment. SonoHaptics provides global feedback that is unique to each object in the scene, and local feedback to amplify differences between nearby objects. Our comparative evaluation shows that SonoHaptics enables accurate object identification and selection in a cluttered scene without visual feedback.
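The abstract's core idea, mapping visual features to audio-haptic properties, can be sketched as a simple lookup. The feature ranges, mapping directions, and names below are illustrative assumptions for the sketch, not the paper's data-driven models:

```python
# Illustrative cross-modal mapping from visual object features to
# audio-haptic feedback parameters. Ranges and directions are assumptions,
# not the fitted models from the SonoHaptics paper.
from dataclasses import dataclass

def lerp(lo, hi, t):
    """Linearly interpolate between lo and hi for t in [0, 1]."""
    return lo + (hi - lo) * t

@dataclass
class AudioHapticCue:
    pitch_hz: float   # mapped from color lightness
    amplitude: float  # mapped from object size
    timbre: str       # mapped from surface material

def cue_for_object(lightness: float, size: float, material: str) -> AudioHapticCue:
    # Assumed directions: lighter colors -> higher pitch,
    # larger objects -> stronger vibration.
    pitch = lerp(220.0, 880.0, lightness)
    amp = lerp(0.2, 1.0, size)
    timbre = {"metal": "bright", "wood": "warm"}.get(material, "neutral")
    return AudioHapticCue(pitch, amp, timbre)

print(cue_for_object(lightness=1.0, size=0.5, material="metal"))
```

Global feedback in this spirit would assign each scene object its own cue; local feedback would then exaggerate the parameter differences between nearby objects.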



What Makes XR Dark? Examining Emerging Dark Patterns in Augmented and Virtual Reality through Expert Co-Design

April 2024 · 84 Reads · 13 Citations

ACM Transactions on Computer-Human Interaction

Dark Patterns are deceptive designs that influence a user's interactions with an interface to benefit someone other than the user. Prior work has identified dark patterns in WIMP interfaces and ubicomp environments, but how dark patterns can manifest in Augmented and Virtual Reality (collectively XR) requires more attention. We conducted ten co-design workshops with 20 experts in XR and deceptive design. Our participants co-designed 42 scenarios containing dark patterns, based on application archetypes presented in recent HCI/XR literature. In the co-designed scenarios, we identified ten novel dark patterns in addition to 39 existing ones, as well as ten examples in which specific characteristics associated with XR potentially amplified the effect dark patterns could have on users. Based on our findings and prior work, we present a classification of XR-specific properties that facilitate dark patterns: perception, spatiality, physical/virtual barriers, and XR device sensing. We also present the experts' assessments of the likelihood and severity of the co-designed scenarios and highlight key aspects they considered for this evaluation, for example, technological feasibility, ease of upscaling and distributing malicious implementations, and the application's context of use. Finally, we discuss means to mitigate XR dark patterns and to support regulatory bodies in reducing potential harms.



XRSpotlight: Example-based Programming of XR Interactions using a Rule-based Approach

June 2023 · 84 Reads · 3 Citations

Proceedings of the ACM on Human-Computer Interaction

Research on enabling novice AR/VR developers has emphasized the need to lower the technical barriers to entry. This is often achieved by providing new authoring tools that provide simpler means to implement XR interactions through abstraction. However, novices are then bound by the ceiling of each tool and may not form the correct mental model of how interactions are implemented. We present XRSpotlight, a system that supports novices by curating a list of the XR interactions defined in a Unity scene and presenting them as rules in natural language. Our approach is based on a model abstraction that unifies existing XR toolkit implementations. Using our model, XRSpotlight can find incomplete specifications of interactions, suggest similar interactions, and copy-paste interactions from examples using different toolkits. We assess the validity of our model with professional VR developers and demonstrate that XRSpotlight helps novices understand how XR interactions are implemented in examples and apply this knowledge in their projects.
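The rule abstraction the abstract describes, XR interactions surfaced as natural-language rules that can also be flagged as incomplete, might be imagined along these lines. The event-condition-action fields, class names, and phrasing are hypothetical, not XRSpotlight's actual model:

```python
# Hypothetical sketch of an event-action rule abstraction for XR
# interactions, rendered as natural language. This is an illustration of
# the idea, not XRSpotlight's actual data model.
from dataclasses import dataclass

@dataclass
class InteractionRule:
    target: str     # scene object the rule is attached to
    event: str      # triggering input event
    action: str     # resulting behavior
    complete: bool  # whether a handler is actually wired up in the scene

    def to_sentence(self) -> str:
        status = "" if self.complete else " (incomplete: no handler assigned)"
        return f"When the user {self.event} the {self.target}, {self.action}.{status}"

rules = [
    InteractionRule("door", "points at and clicks", "it swings open", True),
    InteractionRule("lamp", "grabs", "nothing happens yet", False),
]
for rule in rules:
    print(rule.to_sentence())
```

Presenting interactions this way lets a novice scan a scene's behavior without reading toolkit-specific code, and an incomplete rule becomes a visible gap rather than a silent bug.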



XSpace: An Augmented Reality Toolkit for Enabling Spatially-Aware Distributed Collaboration

November 2022 · 31 Reads · 18 Citations

Proceedings of the ACM on Human-Computer Interaction

Augmented Reality (AR) has the potential to leverage environmental information to better facilitate distributed collaboration; however, such applications are difficult to develop. We present XSpace, a toolkit for creating spatially-aware AR applications for distributed collaboration. Based on a review of existing applications and developer tools, we designed XSpace to support three methods for creating shared virtual spaces, each emphasizing a different aspect: shared objects, user perspectives, and environmental meshes. XSpace implements these methods in a developer toolkit and also provides a set of complementary visual authoring tools that allow developers to preview a variety of configurations for a shared virtual space. We present five example applications to illustrate that XSpace can support the development of a rich set of collaborative AR experiences that are difficult to produce with current solutions. Through XSpace, we discuss implications for future application design, including user space customization and privacy and safety concerns when sharing users' environments.
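The three shared-space methods named in the abstract could be configured roughly as follows. The class names, enum values, and fields are illustrative assumptions, not XSpace's actual API:

```python
# Sketch of selecting one of the three shared-space alignment methods the
# XSpace abstract describes. Names and fields are assumptions for this
# illustration, not XSpace's real toolkit API.
from dataclasses import dataclass, field
from enum import Enum

class AlignmentMethod(Enum):
    SHARED_OBJECTS = "align rooms around common anchor objects"
    USER_PERSPECTIVES = "align rooms by the collaborators' viewpoints"
    ENVIRONMENT_MESHES = "align rooms by merging scanned room meshes"

@dataclass
class SharedSpaceConfig:
    method: AlignmentMethod
    anchors: list = field(default_factory=list)  # e.g. ["table", "chair"]

    def describe(self) -> str:
        extra = f" using anchors {self.anchors}" if self.anchors else ""
        return f"{self.method.value}{extra}"

cfg = SharedSpaceConfig(AlignmentMethod.SHARED_OBJECTS, anchors=["table"])
print(cfg.describe())
```

A previewing authoring tool in this spirit would render the same scene under each configuration so a developer can compare the resulting shared spaces before committing to one.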



1st Workshop on Prototyping Cross-Reality Systems

October 2022 · 120 Reads · 1 Citation

Cross-Reality (CR) systems offer different levels of virtuality to their users, enabling them either to transition along the reality-virtuality continuum or to collaborate with each other across different manifestations. Many Augmented Reality (AR) and Virtual Reality (VR) systems are inherently cross-reality, since the amount of augmentation of the physical world (AR) or the influence of the physical environment (VR) varies over time. However, traditional prototyping approaches often focus on one specific manifestation, making them less suitable for prototyping cross-reality systems. In this workshop, we aim to discuss current challenges, solutions, and opportunities that arise from prototyping CR systems and their interactions. We offer attendees a balanced mix of presentations and interactive sessions, including (provocative) research positions and video demonstrations of existing CR prototyping tools. Ultimately, the workshop aims to start a discussion within the ISMAR community about the current challenges and novel concepts around prototyping CR systems.


Citations (70)


... Studies have demonstrated that eye-tracking can be utilized to estimate factors such as confidence [5], personality [3], attention [41], and cognitive load [46]. Additionally, eye-tracking is employed in diverse interaction tasks like gaze-based typing [10,26], menu navigation [23], and object selection [13]. The choice of eye-tracking device varies depending on the data type and task. ...

Reference:

SensPS: Sensing Personal Space Comfortable Distance between Human-Human Using Multimodal Sensors
SonoHaptics: An Audio-Haptic Cursor for Gaze-Based Object Selection in XR
  • Citing Conference Paper
  • October 2024

... Finally, the accessibility of data stories specifically concerns immersive systems, for which there are still no established assistive technologies available (e.g., screen readers). A recent trend encourages accessibility and inclusiveness in mixed reality, as shown by workshops at ACM CHI and IEEE ISMAR (e.g., [53]), although so far none consider data storytelling specifically. We see an opportunity related to the accessibility of immersive data stories in their potential for multi-modal input and multi-sensory output (DS: Interaction (modality)). ...

Designing Inclusive Future Augmented Realities
  • Citing Conference Paper
  • May 2024

... Researchers have developed systematic knowledge of dark pattern instances in a range of domains, including e-commerce [32], games [43], and social media [35]. Additionally, dark patterns have been studied in many specific contexts, including extended reality (XR; [30]), consent banners [40], and mobile apps [15]. These domains and contexts are further elaborated in systematic reviews of the dark patterns literature by Gray et al. [17] and Chang et al. [12]. ...

What Makes XR Dark? Examining Emerging Dark Patterns in Augmented and Virtual Reality through Expert Co-Design
  • Citing Article
  • April 2024

ACM Transactions on Computer-Human Interaction

... Future authoring tools may embrace a paradigm shift to enhance consideration for viewer autonomy. Some systems [47,55] have demonstrated such possibilities. For example, REFRAME [55] allows creators to anticipate and address potential threats in the early design stage from a user's perspective by personifying various threats as characters in storyboards. ...

Reframe: An Augmented Reality Storyboarding Tool for Character-Driven Analysis of Security & Privacy Concerns
  • Citing Conference Paper
  • October 2023

... Additionally, Rajaram et al. [37] highlight the growing use of AR in collaborative settings and the associated risks of data breaches, unauthorized access, and privacy violations. They conducted a user study to explore concerns and preferences about security and privacy in shared AR environments, using the findings to propose techniques such as granular access controls, encryption, and feedback mechanisms to address vulnerabilities. ...

Eliciting Security & Privacy-Informed Sharing Techniques for Multi-User Augmented Reality
  • Citing Conference Paper
  • April 2023

... The color cue, influenced by luminance contrast, enhances depth perception when combined with other cues, where proximity-luminance covariance applies to color as a pictorial depth cue [31]. Research shows that color augmentation improves depth perception in VR, especially against darker backgrounds [2,50]. Partial occlusion combined with the color red produces stronger depth perception and faster judgments, making variations in the red spectrum effective depth cues [31]. ...

Color-to-Depth Mappings as Depth Cues in Virtual Reality
  • Citing Conference Paper
  • October 2022

... This way, the environments themselves do not need to be aligned, but rather a common anchor is established within each collaborator's environment. Herskovitz et al. provide a toolkit capable of displaying collaborators through a portal, world-in-miniature display, or by anchoring them to a common element of both rooms, such as a chair or table [9]. The idea of anchoring has also been explored in other works [6,7,10]. ...

XSpace: An Augmented Reality Toolkit for Enabling Spatially-Aware Distributed Collaboration
  • Citing Article
  • November 2022

Proceedings of the ACM on Human-Computer Interaction

... For instance, dynamic elements such as lighting conditions, moving objects (e.g., vehicles, temporary structures), and pedestrian activity are typically absent from pre-captured 3D models. This absence can lead to inconsistencies between virtual elements and the real environment, ultimately degrading the user experience [54]. Consequently, developers frequently resort to repeated on-site visits, which can be costly, time-consuming, and logistically challenging. ...

XR tools and where they are taking us: characterizing the evolving research on augmented, virtual, and mixed reality prototyping and development tools
  • Citing Article
  • September 2022

XRDS Crossroads The ACM Magazine for Students

... Although many prototyping methods have been applied to support rapid AR prototyping [6,25], the support for creating interactive behavior in AR is limited in the context of goal-driven prototyping of AR use cases from end users' perspective [26]. More specifically, the techniques utilized by prior research are commonly found to be application-oriented and focus on the interactions that the target users are supposed to perform using the application. ...

Rapid prototyping for XR: SIGGRAPH 2022 course
  • Citing Conference Paper
  • August 2022

... 2016] emphasizes the importance of leveraging AR to make complex concepts more accessible and engaging, particularly in topics like human health and biology. In [Cárdenas Gasca et al., 2022], the design of AR exhibitions for sensitive narratives is explored, emphasizing the importance of story-telling-like structures and user interaction in creating immersive experiences. Their work highlights the potential of AR to convey complex historical and cultural narratives in a way that is both engaging and educational, making it a valuable tool for museum curators and designers. ...

AR Exhibitions for Sensitive Narratives: Designing an Immersive Exhibition for the Museum of Memory in Colombia
  • Citing Conference Paper
  • June 2022