Seokhee Jeon’s research while affiliated with Kyung Hee University and other places


Publications (96)


Fig. 2: Overview of the study. Experiments with car doors provide force and position tracking data as well as user ratings of the door-opening experience. These data are used to train a CNN-LSTM model that predicts the perceived ratings from door-opening force profiles.
Fig. 4: Averaged adjective ratings for the ten adjective pairs from Experiment 3. The error bars show the standard deviation for each bar.
Fig. 5: Angle-normalized force profiles of the six cars used in this study (Top). The position tracking of the door opening is provided for K5, Sorento, and Santafe for reference (Bottom).
Fig. 8: Analysis of the predicted ratings based on the standard deviation in user ratings, broken down by car in the dataset. The red line indicates a perfect prediction of the user rating by the algorithm. The red and green bands represent half and one full standard deviation of the user ratings, respectively.
Fig. 9: Analysis of the predicted ratings based on the standard deviation in user ratings, broken down by adjective pair. The red line indicates a perfect prediction of the user rating by the algorithm. The red and green bands represent half and one full standard deviation of the user ratings, respectively.
Quantifying Haptic Affection of Car Door through Data-Driven Analysis of Force Profile
  • Preprint
  • File available

November 2024 · 8 Reads · [...] · Seokhee Jeon

Haptic affection plays a crucial role in user experience, particularly in the automotive industry, where the tactile quality of components can influence customer satisfaction. This study aims to accurately predict the affective properties of a car door solely from the force or torque profile recorded while the door is opened. To this end, a deep learning model is designed to capture the underlying relationships between force profiles and user-defined adjective ratings, providing insights into the door-opening experience. The dataset employed in this research includes force profiles and user adjective ratings collected from six distinct car models, reflecting a diverse set of door-opening characteristics and tactile feedback. The model's performance is assessed using Leave-One-Out Cross-Validation, a method that measures its generalization capability on unseen data. The results demonstrate that the proposed model achieves a high level of prediction accuracy, indicating its potential in various applications related to haptic affection and design optimization in the automotive industry.
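The listing does not include the model or training code. Purely as an illustration of the kind of CNN-LSTM regressor and Leave-One-Out Cross-Validation loop the abstract describes, a minimal PyTorch-style sketch could look as follows; the `ForceCNNLSTM` name, all layer sizes, and the assumption that `profiles` and `ratings` are NumPy arrays are placeholders, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): a CNN-LSTM that maps a 1-D force
# profile of a door opening to adjective ratings, evaluated with
# leave-one-out cross-validation. Shapes and hyperparameters are assumed.
import numpy as np
import torch
import torch.nn as nn

class ForceCNNLSTM(nn.Module):
    def __init__(self, n_adjectives=10):
        super().__init__()
        # 1-D convolutions extract local features from the force profile.
        self.cnn = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
        )
        # The LSTM models the temporal evolution of those features.
        self.lstm = nn.LSTM(input_size=32, hidden_size=64, batch_first=True)
        self.head = nn.Linear(64, n_adjectives)

    def forward(self, x):              # x: (batch, 1, time)
        f = self.cnn(x)                # (batch, 32, time')
        f = f.transpose(1, 2)          # (batch, time', 32)
        _, (h, _) = self.lstm(f)       # h: (1, batch, 64)
        return self.head(h[-1])        # (batch, n_adjectives)

def loocv(profiles, ratings, epochs=50):
    """Leave-one-out CV: train on all samples but one, test on the held-out one."""
    errors = []
    for i in range(len(profiles)):
        train_idx = [j for j in range(len(profiles)) if j != i]
        model = ForceCNNLSTM(ratings.shape[1])
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        x_tr = torch.tensor(profiles[train_idx], dtype=torch.float32).unsqueeze(1)
        y_tr = torch.tensor(ratings[train_idx], dtype=torch.float32)
        for _ in range(epochs):
            opt.zero_grad()
            loss = nn.functional.mse_loss(model(x_tr), y_tr)
            loss.backward()
            opt.step()
        x_te = torch.tensor(profiles[i:i + 1], dtype=torch.float32).unsqueeze(1)
        with torch.no_grad():
            pred = model(x_te).numpy()[0]
        errors.append(np.abs(pred - ratings[i]).mean())
    return float(np.mean(errors))      # mean absolute error over held-out samples
```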


Fig. 1 Proposed fingertip actuator. (a) Silicone layer with dual air chambers
Fig. 2 Illustration of the overall system
Silicone-made Tactile Actuator Integrated with Hot Thermo-fiber Finger Sleeve

November 2024 · 8 Reads

Multi-mode haptic feedback is essential to achieve high realism and immersion in virtual environments. This paper proposes a novel silicone fingertip actuator integrated with a hot thermal fabric finger sleeve to render pressure, vibration, and hot thermal feedback simultaneously. The actuator is pneumatically driven to render a realistic and effective tactile experience in accordance with the hot thermal sensation. The silicone actuator has two air chambers controlled by pneumatic valves connected to compressed air tanks. Simultaneously, a PWM signal from a microcontroller regulates the temperature of the thermal fabric sleeve, enhancing overall system functionality. The lower chamber of the silicone actuator is responsible for pressure feedback, whereas the upper chamber is devoted to vibrotactile feedback. Conductive yarn or thread is used to distribute the thermal actuation points across the fabric's surface. To demonstrate the actuator's capability, a VR environment consisting of a bowl of liquid and a stove with fire was designed. Depending on the interaction, the scenario can simulate the tactile perception of pressure, vibration, and temperature simultaneously or consecutively.
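As a rough, hedged illustration of how a controller might map the VR contact states in this scenario to the two chambers and the thermal sleeve, the Python sketch below computes per-frame commands. The event names, duty-cycle values, and the `HapticCommand` structure are invented for the example; they are not the paper's firmware.

```python
# Illustrative sketch only (assumed names and values, not the authors' firmware):
# deciding per-frame commands for the dual-chamber actuator and the thermal sleeve.
from dataclasses import dataclass

@dataclass
class HapticCommand:
    lower_valve_open: bool   # lower chamber -> static pressure feedback
    upper_valve_open: bool   # upper chamber -> pulsed for vibrotactile feedback
    heater_pwm_duty: float   # 0.0-1.0 duty cycle for the thermo-fiber sleeve

def command_for_frame(touching_liquid: bool, near_fire: bool,
                      t: float, vib_hz: float = 50.0) -> HapticCommand:
    """Map simple VR contact states to actuator commands for time t (seconds)."""
    # Pressure: keep the lower chamber inflated while the finger is immersed.
    pressure = touching_liquid
    # Vibration: toggle the upper-chamber valve at vib_hz to approximate a
    # square-wave vibrotactile cue while in contact with the liquid surface.
    vibration = touching_liquid and (int(t * vib_hz * 2) % 2 == 0)
    # Heat: higher PWM duty near the virtual fire, mild warmth otherwise.
    duty = 0.8 if near_fire else (0.2 if touching_liquid else 0.0)
    return HapticCommand(pressure, vibration, duty)

# Example: sample commands over one second of simulated interaction.
if __name__ == "__main__":
    for k in range(5):
        print(command_for_frame(touching_liquid=True, near_fire=(k > 2), t=k * 0.2))
```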


Pneumatically Controlled Tactile Actuating Modules for Enhanced VR Safety Training

November 2024 · 1 Read

Our system introduces a modularized pneumatic actuating unit capable of delivering vibration, pressure, and impact feedback. Designed for adaptability, these modular tactile actuating units can be rapidly customized and reconfigured to suit a wide range of virtual reality (VR) scenarios, with a particular emphasis on safety training applications. This flexibility is demonstrated through scenarios such as using construction tools in a virtual environment and simulating safety protocols against falling objects. Innovative mounting solutions securely attach the actuators to various body sites, ensuring both comfort and stability during use. Our approach enables seamless integration into diverse VR safety training programs, enhancing the realism and effectiveness of simulations with precise and reliable haptic feedback.


Fig.1 The proposed haptic device delivers torque feedback along the yaw (horizontal) and pitch (vertical) axes
Wearable Haptic Device to Render 360-degree Torque Feedback on the Wrist

Haptic feedback increases the realism of virtual environments. This paper proposes a wearable haptic device that renders torque feedback to the user's wrist from any angle. The device comprises a control part and a handle part. The control part consists of three DC gear motors and a microcontroller, while the handle part securely holds the Oculus Quest 2 right controller. The control part manages string tension to deliver the sensation of torque feedback during interactions with virtual tools or objects. The three points of the handle part are connected to the three motors of the control part via strings, which pull the handle part to render precise 360-degree (yaw and pitch) torque feedback to the user's wrist. Finally, to show the effectiveness of the proposed device, two VR demos were implemented: a shooting game and a shielding experience.
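The abstract does not describe how a commanded torque is distributed over the three strings. As a purely illustrative sketch under an assumed geometry, one could solve for non-negative string tensions whose combined moment about the wrist matches a desired yaw/pitch torque; the attachment and exit-point coordinates below are placeholders, not the device's actual dimensions.

```python
# Minimal geometric sketch (assumed geometry, not the authors' controller):
# solving for non-negative string tensions on three attachment points so that
# the net torque about the wrist matches a desired yaw/pitch torque.
import numpy as np
from scipy.optimize import nnls

# Assumed positions (metres) in the wrist frame: where strings attach to the
# handle, and where they exit toward the motors on the control part.
attach = np.array([[0.04, 0.0, 0.02], [-0.02, 0.035, 0.02], [-0.02, -0.035, 0.02]])
exit_pts = np.array([[0.10, 0.0, -0.05], [-0.05, 0.09, -0.05], [-0.05, -0.09, -0.05]])

def string_tensions(desired_torque_xy):
    """desired_torque_xy: (tau_pitch, tau_yaw) in N*m about the x and y axes."""
    # Unit direction each string pulls along, from handle toward its motor.
    dirs = exit_pts - attach
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    # Torque produced per unit tension: r x d for each string.
    torque_per_N = np.cross(attach, dirs)          # shape (3, 3)
    # Only the pitch (x) and yaw (y) components are commanded here.
    A = torque_per_N[:, :2].T                      # (2, 3): torque rows, string cols
    b = np.asarray(desired_torque_xy, dtype=float)
    tensions, residual = nnls(A, b)                # tensions >= 0 by construction
    return tensions, residual

if __name__ == "__main__":
    t, r = string_tensions((0.05, -0.03))          # small pitch/yaw torque demand
    print("tensions [N]:", np.round(t, 3), "residual:", round(r, 4))
```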


Fig. 2. Hardware setup for recording and estimation of the system properties of the touch device. The target location is set by the acceleration sensor position.
Fig. 3. Left: Front of the touch device with the acceleration sensor located on the bottom left-hand side of the screen. Right: Backside of the touch device with the actuator positioned on the upper right-hand side.
Location-Based Output Adaptation for Enhanced Actuator Performance using Frequency Sweep Analysis

November 2024 · 2 Reads

This paper presents a methodology for enhancing actuator performance in older devices or retrofitting devices with haptic feedback actuators. The approach is versatile, accommodating various actuator and mounting positions. Through a frequency sweep analysis, the system's characteristics are captured, enabling the creation of location-specific transfer functions to accurately transform input signals into command signals for a precise output at the target location. This method offers fast and simple collection of the system properties and generation of location-specific signals.
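The paper's exact identification procedure is not reproduced in this listing; the sketch below only illustrates the general idea of estimating a location-specific frequency response from a sweep excitation and pre-compensating commands by its inverse magnitude, using synthetic stand-in data. The sampling rate, the toy resonant filter, and all signal names are assumptions.

```python
# Rough sketch under assumptions (synthetic data, not the paper's measurements):
# estimate a location-specific frequency response from a sweep excitation, then
# pre-compensate a command signal by the inverse magnitude response.
import numpy as np
from scipy import signal

fs = 4000                                          # assumed sampling rate [Hz]
t = np.arange(0, 2.0, 1 / fs)
sweep = signal.chirp(t, f0=20, f1=500, t1=t[-1])   # actuator excitation

# Stand-in for the acceleration measured at the target location: a toy
# resonant system; in practice this would come from the mounted sensor.
b, a = signal.butter(2, [80 / (fs / 2), 220 / (fs / 2)], btype="band")
measured = signal.lfilter(b, a, sweep)

# Empirical transfer function estimate H(f) = Pxy(f) / Pxx(f).
f, Pxx = signal.welch(sweep, fs=fs, nperseg=1024)
_, Pxy = signal.csd(sweep, measured, fs=fs, nperseg=1024)
H = Pxy / Pxx

# Inverse-magnitude equalization with a floor to avoid exploding the command
# where the actuator barely responds at this location.
eps = 1e-3 * np.max(np.abs(H))
inv_gain = 1.0 / np.maximum(np.abs(H), eps)

def equalize(command):
    """Scale the command spectrum by the location-specific inverse gain."""
    n_fft = 2 * (len(f) - 1)
    spec = np.fft.rfft(command, n=n_fft)
    return np.fft.irfft(spec * inv_gain, n=n_fft)[: len(command)]
```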



Deep encoder–decoder network based data-driven method for impact feedback rendering on head during earthquake

January 2024 · 28 Reads

In safety training simulators, realistic haptic feedback is essential for trainees to build accurate situational awareness through direct experience. In this regard, this paper presents a new system that provides the haptic experience of objects falling on the user's head during an earthquake. Special focus is placed on the accurate reproduction of impact feedback when various objects fall on the head. To this end, we propose a novel data-driven approach. The approach first collects 3-axis acceleration signals during real collisions at several impact velocities. The 3D acceleration data is then abstracted to a 1D acceleration profile using our novel max–min extraction approach. The impact signal for an arbitrary velocity is interpolated using a deep convolutional bidirectional long short-term memory encoder–decoder model. Rendering hardware is also implemented using a high-performance voice-coil vibrotactile actuator. Numerical and subjective evaluations are carried out to assess the performance of the proposed approach.
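The max–min extraction step is only named in the abstract; the sketch below is a speculative reading of how 3-axis acceleration might be collapsed into a 1-D profile via per-window extrema, included solely for illustration. The window length and the use of the magnitude signal are assumptions, not the authors' algorithm.

```python
# Hedged sketch: one plausible way to abstract 3-axis acceleration into a 1-D
# profile using per-window max/min extrema. This is an illustrative guess at
# the idea, not the paper's exact max-min extraction algorithm.
import numpy as np

def max_min_profile(acc_xyz, window=32):
    """acc_xyz: (N, 3) acceleration samples -> 1-D envelope-like profile."""
    mag = np.linalg.norm(acc_xyz, axis=1)            # combine the three axes
    n = (len(mag) // window) * window
    frames = mag[:n].reshape(-1, window)             # non-overlapping windows
    # Peak-to-peak value per window keeps impact transients prominent.
    return frames.max(axis=1) - frames.min(axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    acc = rng.normal(scale=0.05, size=(2048, 3))
    acc[1000:1010] += 5.0                            # a short impact burst
    profile = max_min_profile(acc)
    print(profile.argmax(), profile.max())           # the burst dominates
```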


Pneumatically Controlled Wearable Tactile Actuator for Multi-Modal Haptic Feedback

January 2024 · 46 Reads · 2 Citations · IEEE Access

This paper introduces a wearable pneumatic actuator designed to provide multiple types of tactile feedback using a single end-effector. To this end, the actuator combines a 3D-printed framework consisting of five 0.5-DOF soft silicone air cells with a pneumatic system to deliver a range of tactile sensations through a single end-effector. The actuator is capable of producing diverse haptic feedback, including vibration, pressure, impact, and lateral force, controlled by an array of solenoid valves. The design's focus on multimodality in a compact and lightweight form factor makes it highly suitable for wearable applications. It can produce a maximum static force of 8.3 N, vibrations with an acceleration of up to 3.15 g, and lateral forces of up to 3.3 N. The efficacy of the actuator is demonstrated through two distinct user studies: one focusing on perception, where users differentiated between lateral cues and vibration frequencies, and another within a first-person shooter gaming scenario, revealing enhanced user engagement and experience. The actuator's adaptability to different body sites and rich multimodal haptic feedback enable applications in virtual reality, gaming, training simulations, and more.


Figure 2: Overall framework. Texture dataset: a dataset containing real-world textures was prepared. Data preparation: establishing haptic attribute space and physical signal space using the dataset. Network training: a CNN-LSTM based model learning the relationship between the two spaces. System evaluation: an evaluation conducted to see the predictability of haptic attributes for a new/unseen texture.
Figure 3: The real-world texture dataset used in this study.
Figure 5: The mean adjective ratings given by 12 human participants. Each end of an adjective pair corresponds to one extreme on the y-axis, e.g., in the Rough-Smooth pair, '0' represents an extremely rough surface whereas 100 represents an extremely smooth surface.
Figure 8: Comparison of the proposed model with other approaches.
Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE) of 5 adjective pairs for the proposed CNN-LSTM model
Predicting Perceptual Haptic Attributes of Textured Surface from Tactile Data Based on Deep CNN-LSTM Network

October 2023 · 42 Reads · 2 Citations

This paper introduces a framework to predict multi-dimensional haptic attribute values that humans use to recognize the material by using the physical tactile signals (acceleration) generated when a textured surface is stroked. To this end, two spaces are established: a haptic attribute space and a physical signal space. A five-dimensional haptic attribute space is established through human adjective rating experiments with the 25 real texture samples. The physical space is constructed using tool-based interaction data from the same 25 samples. A mapping is modeled between the aforementioned spaces using a newly designed CNN-LSTM deep learning network. Finally, a prediction algorithm is implemented that takes acceleration data and returns coordinates in the haptic attribute space. A quantitative evaluation was conducted to inspect the reliability of the algorithm on unseen textures, showing that the model outperformed other similar models.


Establishing haptic texture attribute space and predicting haptic attributes from image features using 1D-CNN

July 2023 · 115 Reads · 8 Citations

The current study strives to provide a haptic attribute space in which texture surfaces are located according to their haptic attributes. The main aim of the haptic attribute space is to provide a standardized model for representing and identifying haptic textures, analogous to the RGB model for colors. To this end, a four-dimensional haptic attribute space is established by conducting a psychophysical experiment in which human participants rate 100 real-life texture surfaces according to their haptic attributes. The four dimensions of the haptic attribute space are rough-smooth, flat-bumpy, sticky-slippery, and hard-soft. The generalization and scalability of the haptic attribute space are achieved by training a 1D-CNN model to predict the attributes of haptic textures. The 1D-CNN is trained using the attribute data from the psychophysical experiment and image features extracted from images of the real textures. The prediction power granted by the 1D-CNN makes the haptic attribute space scalable to new textures. The prediction accuracy of the proposed 1D-CNN model is compared against other machine learning and deep learning algorithms. The results show that the proposed method outperforms the other models on the MAE and RMSE metrics.
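No implementation details are given in this listing; as an illustrative guess at the kind of 1D-CNN regressor the abstract describes, a compact PyTorch sketch is given below. The feature length, layer sizes, and the `AttributeCNN` name are assumptions, not the published model.

```python
# Illustrative sketch only (assumed feature length and layer sizes): a 1D-CNN
# that regresses the four haptic attribute ratings from a 1-D image-feature
# vector, in the spirit of the model described above.
import torch
import torch.nn as nn

class AttributeCNN(nn.Module):
    def __init__(self, feature_len=256, n_attributes=4):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.fc = nn.Linear(32, n_attributes)

    def forward(self, x):                 # x: (batch, feature_len)
        z = self.conv(x.unsqueeze(1))     # (batch, 32, 1)
        return self.fc(z.squeeze(-1))     # (batch, 4): rough-smooth, flat-bumpy,
                                          # sticky-slippery, hard-soft

if __name__ == "__main__":
    model = AttributeCNN()
    fake_features = torch.randn(8, 256)   # stand-in image features
    print(model(fake_features).shape)     # torch.Size([8, 4])
```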


Citations (69)


... The concept of modularity in haptic interfaces entails the use of customizable haptic modules designed to deliver specific types of tactile feedback tailored for diverse applications. Recent studies have developed modular actuating solutions capable of generating multimodal tactile feedback using either single or multiple actuators for various uses [3], [4], [5]. Although the modular nature of these solutions broadens their applicability, a frequent issue arises with their ability to provide a variety of feedback within a single module [6], often leading to bulkier systems [7]. ...

Reference:

Pneumatically Controlled Tactile Actuating Modules for Enhanced VR Safety Training
Pneumatically Controlled Wearable Tactile Actuator for Multi-Modal Haptic Feedback

IEEE Access

... Recently, deep learning approaches have been employed for processing haptic data in various tasks such as surface texture classification [50], synthesis of high-frequency vibration signals [51], haptic attributes estimation using tactile information [52], and perceptual similarity learning based on haptic data [53]. Given this, the main aim of this study is to develop a hybrid CNN-LSTM model to predict perceptual attributes of car doors based on the dynamics offered by the door hinge which we recorded as force signals (see Section V). ...

Predicting Perceptual Haptic Attributes of Textured Surface from Tactile Data Based on Deep CNN-LSTM Network

... Advanced algorithms can identify patterns and relationships that may be difficult for humans to discern [18], [19], [20]. Machine learning models, particularly deep learning architectures, have shown promise in modeling complex, nonlinear relationships between input data and user perceptions [21], [22], [23]. However, there has been limited research on leveraging these techniques to predict human perception of car door attributes based on vehicle data. ...

Establishing haptic texture attribute space and predicting haptic attributes from image features using 1D-CNN

... The benefit of using this algorithm is that it can produce not only perceptually but also physically accurate acceleration signals for any arbitrary interaction. For instance, the Goodness-of-Fit Criterion (GFC) value for the estimated power spectrum of acceleration is greater than 0.9 for most of the textures as claimed by the authors [1,2], which is considered a very accurate match of the measured and synthesized acceleration signal [3]. Moreover, this algorithm is accompanied by a pre-made haptic texture library consisting of 100 real-world texture surfaces including textures used in this study. ...

Model-Mediated Teleoperation for Remote Haptic Texture Sharing: Initial Study of Online Texture Modeling and Rendering

... In this state, musicians cease to be self-conscious and enter into a total connection with their instrument, allowing them to reach their full performance potential (Chirico et al. 2015). However, performing music can become a source of fear, tension and discomfort if the performer suffers from performance anxiety (Lee et al. 2023). ...

VR unseen gaze: inducing feeling of being stared at in virtual reality

... Pneumatic actuators have demonstrated their versatility as soft wearable haptic devices with many applications. [15][16][17][18][19][20][21][22][23] The most common type of application is simple indentation on the skin. [16][17][18][19][20][21][22] Vibrotactile feedback using pneumatic vibrators has also been studied. ...

Multi-Mode Soft Haptic Thimble for Haptic Augmented Reality Based Application of Texture Overlaying
  • Citing Article
  • July 2022

Displays

... They demonstrated that by combining their segmentation framework and a user interface, which guides the data collection process to create a human-in-the-loop system, the approximation quality of the model increases. In [91], the authors trained a neural network with contact acceleration data collected through a manual scanning stylus. They used attention-aware 1D CNNs and encoder-decoder networks with Bi-LSTM to capture spatial and temporal dynamics. ...

Deep multi-model fusion network based real object tactile understanding from haptic data

Applied Intelligence

... Still, within the framework of mapping medical data, GORRELL et al. [26] introduced the new Bio-YODIE system, which consists of two main components: a pipeline that annotates documents containing a UMLS Concept Unique Identifier (CUI) along with other pertinent UMLS data, and a component in charge of preparing the resources that process the UMLS and the other necessary information resources at runtime. In the same context, we note the contribution of ABBAS et al. [27], who proposed an algorithm that implements the UMLS Terminology Services (UTS) and customized it to extract concepts for all the expressions and terms used in recitals and to determine their semantic and entity types in order to categorize the concepts precisely. This led us to the conclusion that multiple information extraction methodologies, in combination with UMLS, have been used to annotate and extract clinically significant information from medical data sources in several domains of medicine, such as psychology [28] and cancer [29]. ...

Explicit and Implicit Section Identification from Clinical Discharge Summaries

... Pneumatic actuators have demonstrated their versatility as soft wearable haptic devices with many applications. [15][16][17][18][19][20][21][22][23] The most common type of application is simple indentation on the skin. [16][17][18][19][20][21][22] Vibrotactile feedback using pneumatic vibrators has also been studied. ...

Soft Pneumatic Fingertip Actuator Incorporating a Dual Air Chamber to Generate Multi-Mode Simultaneous Tactile Feedback

Applied Sciences

... Due to the recent breakthroughs on computer graphics research, the transition from traditional 2D visual content to adaptive 3D mixed reality worlds is straightforward and showing promising results 1 . In haptics, most of the existing works for vibrotactile feedback rendering or sound rendering use a tool-based texture interaction approach, where the user moves the tool over the virtual texture, and vibrotactile feedback is rendered via an actuator [2][3][4] , or sound is rendered through headphones 5 . Independently, both vibrotactile feedback rendering and sound rendering demonstrated sufficient reconstruction accuracy when applying the data-driven paradigm. ...

Data-Driven Haptic Texture Modeling and Rendering Based on Deep Spatio-Temporal Networks

IEEE Transactions on Haptics