Hyung-Il Kim, Ph.D.
Samsung Advanced Institute of Technology
About
16 Publications
3,316 Reads
279 Citations
Introduction
Education
March 2014 - February 2016
February 2009 - February 2014
Publications (16)
In this paper, we present a prototype system for sharing a user's hand force in mixed reality (MR) remote collaboration on physical tasks, where hand force is estimated using a wearable surface electromyography (sEMG) sensor. In a remote collaboration between a worker and an expert, hand activity plays a crucial role. However, the force exerted by th...
Despite the importance of avatar representation for user experience in Mixed Reality (MR) remote collaboration involving various device environments and large amounts of task-related information, studies on how controlling avatars' visual parameters can benefit users in such situations have been scarce. Thus, we conducted a user study comparing...
An omni-directional (360°) camera captures the entire viewing sphere surrounding its optical center. Such cameras are growing in use to create highly immersive content and viewing experiences. When such a camera is held by a user, the view includes the user's hand grip, finger, body pose, face, and the surrounding environment, providing a complete...
This study investigates the effects of a virtual hand representation on the user experience, including social presence, during hand-based 3D remote collaboration. Although the remote hand appearance is a critical part of hand-based telepresence, it has rarely been studied in comparison to studies on the self-embodiment of virtual hands in a 3D envir...
This paper reports that the time-domain accuracy of bare-hand interactions in HMD-based Augmented Reality can be improved by using finger contact: touching a finger with another or tapping one's own hand. The activation of input can be precisely defined by the moment of finger contact, allowing the user to perform the input precisely at the desired...
One of the main challenges in creating narrative-driven Augmented Reality (AR) content for Head Mounted Displays (HMDs) is to make it equally accessible and enjoyable in different types of indoor environments. However, little has been studied regarding whether such content can indeed provide similar, if not the same, levels of experience acro...
To better explore the incorporation of pointing and gesturing into ubiquitous computing, we introduce WRIST, an interaction and sensing technique that leverages the dexterity of human wrist motion. WRIST employs a sensor fusion approach which combines inertial measurement unit (IMU) data from a smartwatch and a smart ring. The relative orientation...
This paper investigates the effect of avatar appearance on Social Presence and users’ perception in an Augmented Reality (AR) telepresence system. Despite the development of various commercial 3D telepresence systems, there has been little evaluation and discussion of the appearance of collaborators’ avatars. We conducted two user studies c...
In this demonstration, we will show a prototype system with a sensor fusion approach to robustly track 6 degrees of freedom of hand movement and support intuitive hand gesture interaction and 3D object manipulation for Mixed Reality head-mounted displays. Robust tracking of the hand and fingers with an egocentric camera remains a challenging problem, especia...
This paper presents an augmented reality (AR) authoring system that enables an ordinary user to easily build an AR environment by manipulating and placing 3D virtual objects. The system tracks users’ hand motions via an RGB-D camera built into an optical see-through (OST) head-mounted display (HMD), and interactive features applied to virtual o...
In this paper, we present a unified framework for remote collaboration using interactive augmented reality (AR) authoring and hand tracking methods. The proposed framework enables a local user to organize AR digital content to create a shared working environment and collaborate with multiple users at a distance. To develop the framework, we combine...
We introduce a smartwatch-assisted sensor fusion approach to robustly track 6-DOF hand movement in a head-mounted display (HMD) based augmented reality (AR) environment, which can be used for robust 3D object manipulation. Our method uses a wrist-worn smartwatch with an HMD-mounted depth sensor to robustly track the 3D position and orientation of the user's han...
In this paper, we present a new kind of wearable augmented reality (AR) 3D sculpting system called AiRSculpt in which users could directly translate their fluid finger movements in air into expressive sculptural forms and use hand gestures to navigate the interface. In AiRSculpt, as opposed to VR-based systems, users could quickly create and manipu...