Figure (uploaded by Sooyeon Lee): Participant demographics.
Source publication
Locating and grasping objects is a critical task in people’s daily lives. For people with visual impairments, this task can be a daily struggle. The support of augmented reality frameworks in smartphones can overcome the limitations of current object detection applications designed for people with visual impairments. We present AIGuide, a self-cont...
Contexts in source publication
Context 1
... ages range from 22 years to 45 years old. The participant table (Table 1) lists demographic and personal details for each participant. All of our participants have a significant level of visual impairment (see Table 1) and are iPhone users who use VoiceOver. ...
Context 2
... participant table (Table 1) lists demographic and personal details for each participant. All of our participants have a significant level of visual impairment (see Table 1) and are iPhone users who use VoiceOver. All reported that they have normal hearing except for one participant, who uses a hearing aid. ...
Similar publications
Human-computer interaction design is, as the name suggests, the study of how humans and machines (or systems) communicate and operate together so that machines can be better adapted to humans. We must first have a complete understanding of the human body. Only by fully understanding the body's range of motion and capabilities can you be confi...
This work aims to present a proposal for a digital interface that allows the proficiency in European Portuguese (EP) of children of Portuguese descent in the diaspora to be assessed, and that provides informative material on the early acquisition of heritage languages. In the development of this work, the obstacles to...
Citations
... Many of the analysed guidance-based assistive systems use haptics in parallel to audio rendering. Most of these solutions guide the user to change direction [40,59,61,72], or announce an event such as reaching the destination [63] or identifying/losing the desired object in/from the camera's field of view [13,76]. Earlier systems mainly utilised vibrations to inform about obstacle direction, as seen in [40]. ...
... vibration patterns are used in tandem with audio: they signal to stop, to move forward, and to warn of an obstacle. [76] (2022, smartphone): it provides guidance on how to reach certain objects that are in the camera's field of view (e.g., "item is 2 feet away, 30 degrees left"). It also provides instructions if the object gets out of the camera's view. ...
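To make the quoted cue format concrete, here is a minimal sketch, assuming an ARKit-based pipeline, of how a message such as "item is 2 feet away, 30 degrees left" could be derived from the camera pose and a detected object's world position. The function guidanceCue and the feet/degrees formatting are illustrative assumptions, not code from AIGuide or the cited review.

```swift
import ARKit
import Foundation
import simd

// Hypothetical helper: given the ARKit camera and a detected object's world
// position, compute straight-line distance and a horizontal left/right bearing
// and format them as a spoken guidance cue.
func guidanceCue(camera: ARCamera, objectPosition: simd_float3) -> String {
    let t = camera.transform
    let cameraPosition = simd_float3(t.columns.3.x, t.columns.3.y, t.columns.3.z)

    // Vector from the camera to the object; its length is the distance in metres.
    let toObject = objectPosition - cameraPosition
    let feet = simd_length(toObject) * 3.281

    // In ARKit the camera looks along its local -Z axis, so the world-space
    // forward direction is the negated third column of the camera transform.
    let forward = -simd_float3(t.columns.2.x, t.columns.2.y, t.columns.2.z)

    // Project both vectors onto the horizontal plane and measure the angle
    // between them; the sign of the cross product's Y component gives left/right.
    let flatForward = simd_normalize(simd_float3(forward.x, 0, forward.z))
    let flatToObject = simd_normalize(simd_float3(toObject.x, 0, toObject.z))
    let cosAngle = min(max(simd_dot(flatForward, flatToObject), -1), 1)
    let degrees = Int((acos(cosAngle) * 180 / Float.pi).rounded())
    let side = simd_cross(flatForward, flatToObject).y > 0 ? "left" : "right"

    return "item is \(Int(feet.rounded())) feet away, \(degrees) degrees \(side)"
}
```

In a real application the resulting string would be spoken through VoiceOver or a speech synthesizer, with updates rate-limited as the user moves.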
Purpose:
Visually impaired people (VIP) find it challenging to understand and gain awareness of their surroundings, and most activities require them to rely on the auditory or tactile senses. As such, assistive systems capable of helping visually impaired people understand, navigate, and form a mental representation of their environment are being extensively studied and developed. The aim of this paper is to provide insight into the characteristics, as well as the advantages and drawbacks, of different types of sonification strategies in assistive systems, in order to assess their suitability for particular use cases.
Materials and methods:
To this end, we reviewed a sizeable number of assistive solutions for VIP which provide a form of auditory feedback to the user, encountered in different scientific databases (Scopus, IEEE Xplore, ACM and Google Scholar) through direct searches and cross-referencing.
Results:
We classified these solutions based on the aural information they provide to the VIP: alerts, guidance, and information about their environment, be it spatial or semantic. Our intention is not to provide an exhaustive review, but to select representative implementations from recent literature that highlight the particularities of each sonification approach.
Conclusions:
Thus, anyone intent on developing an assistive solution will be able to choose the desired sonification class while being aware of its advantages and disadvantages, and at the same time have a fairly wide selection of articles representative of that class.
... Similar to work on VR solutions, AR systems have also explored approaches to support improved environmental awareness by translating visual information into text and speech (Granquist et al., 2021). Furthermore, research has presented novel interaction methods for users who are blind or experience low vision for collision avoidance, presenting magnification and visual enhancements to the virtual environment (Coughlan & Miele, 2017), and voice-controlled interfaces (Lee et al., 2022). ...
Augmented and virtual reality experiences present significant barriers for disabled people, making it challenging to fully engage with immersive platforms. Whilst researchers have started to explore potential solutions addressing these accessibility issues, we currently lack a comprehensive understanding of research areas requiring further investigation to support the development of inclusive AR/VR systems. To address current gaps in knowledge, we led a series of multidisciplinary sandpits with relevant stakeholders (i.e., academic researchers, industry specialists, people with lived experience of disability, assistive technologists, and representatives from disability organisations, charities, and special needs educational institutions) to collaboratively explore research challenges, opportunities, and solutions. Based on insights shared by participants, we present a research agenda identifying key areas where further work is required in relation to specific forms of disability (i.e., across the spectrum of physical, visual, cognitive, and hearing impairments), including wider considerations associated with the development of more accessible immersive platforms.
... For users with visual impairments, research has illustrated how immersive technologies can be used as visual aids to support environmental awareness and promote sensory substitution [46]. Furthermore, work has focused on novel interaction methods for expanding users' spatial awareness [47][48][49] and for developing novel user interaction techniques which combine object localisation and spatial audio [50,51] and echolocation [52]. Haptic interactions have also been used with immersive technology for users with visual impairments in novel interfaces to support sensory substitution [53][54][55]. ...
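The pairing of object localisation with spatial audio mentioned in the citation context above is commonly realised by attaching a virtual sound source to the object's estimated 3D position. The sketch below is a minimal illustration using AVAudioEngine's environment node rather than code from the cited works; the SpatialBeacon class, the earcon file, and the assumption that object coordinates are already expressed in the listener's frame (in metres) are all hypothetical.

```swift
import AVFoundation

// Illustrative sketch: a repeating earcon whose apparent direction tracks the
// located object's 3D position (listener-relative coordinates, in metres).
final class SpatialBeacon {
    private let engine = AVAudioEngine()
    private let environment = AVAudioEnvironmentNode()
    private let player = AVAudioPlayerNode()

    init(earconURL: URL) throws {
        let file = try AVAudioFile(forReading: earconURL)   // short mono sound

        engine.attach(environment)
        engine.attach(player)
        // Route the earcon through the environment node so it can be spatialised,
        // then into the main mixer for output.
        engine.connect(player, to: environment, format: file.processingFormat)
        engine.connect(environment, to: engine.mainMixerNode, format: nil)

        player.renderingAlgorithm = .HRTF   // binaural rendering over headphones
        try engine.start()

        player.scheduleFile(file, at: nil)
        player.play()
    }

    // Call whenever the localisation pipeline updates the object's position.
    func update(objectPosition p: SIMD3<Float>) {
        player.position = AVAudio3DPoint(x: p.x, y: p.y, z: p.z)
    }
}
```

A real system would also configure the audio session, loop the earcon, and transform the localiser's output (for example, an ARKit camera-relative position) into the listener's frame before calling update.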
Augmented and virtual reality (AR/VR) hold significant potential to transform how we communicate, collaborate, and interact with others. However, there has been a lack of work to date investigating accessibility barriers in relation to immersive technologies for people with disabilities. To address current gaps in knowledge, we led two multidisciplinary Sandpits with key stakeholders (including academic researchers, AR/VR industry specialists, people with lived experience of disability, assistive technologists, and representatives from national charities and special needs colleges) to collaboratively explore and identify existing challenges with AR and VR experiences. We present key themes that emerged from Sandpit activities and map out the interaction barriers identified across a spectrum of impairments (including physical, cognitive, visual, and auditory disabilities). We conclude with recommendations for future work addressing the challenges highlighted to support the development of more inclusive AR and VR experiences.
For blind and visually impaired (BVI) amputees, the combined loss of vision and grasping abilities turns the seemingly simple task of reaching and grasping into a significant challenge. This paper introduces a novel multi-sensory prosthesis system designed for BVI amputees to assist in perception, navigation, and grasping tasks. The system integrates voice interaction, environmental perception, grasp guidance, collaborative control, and auditory/tactile feedback. Specifically, it processes user commands, provides environmental data via auditory/tactile channels, and manages collaborative control of grasp gestures and wrist angles for stable object handling. The prototype, viiathand, was experimentally tested with eight able-bodied and four blind subjects performing reach-and-grasp tasks, showing that users could accurately reach (average time: 15.24 s) and securely grasp objects (average time: 17.23 s) in an indoor setting. The system also proved to be user-friendly, requiring minimal training for users to become adept.
Computer vision technologies have transformed product design and development, replacing old manual procedures. This study thoroughly examines computer vision’s role in shaping product design, covering its history, applications, and ramifications. The background emphasizes the constraints of traditional design, underlining the necessity for creative alternatives. Integrating computer vision aligns with Industry 4.0 trends, which call for smart and automated design procedures. The investigation delves into the growth of computer vision and its applications in quality control, design optimization, augmented/virtual reality, user interface design, and cultural/language acquisition. The article also looks into the relationship between computer vision and consumer analysis. The content analysis reduces 285 author keywords to 31 interrelated keywords that form five clusters. The conclusion emphasizes the social effect, promising more accessible, efficient, and innovative design processes. This multidisciplinary study dives into computer vision in product design and development through a thorough analysis of a variety of datasets. Examining three datasets with 285, 1066, and 1190 terms yields important results. The findings highlight the importance of “Product Design” and “Computer Vision”, with changing patterns and concentrations across datasets. Thematic studies reveal recurring focus elements in titles and abstracts, such as “Design” and “Vision”, indicating a technological emphasis, human-centric concerns, and practical consequences. Network investigations reveal complex linkages and clustering within keyword networks, allowing for more in-depth knowledge of specific areas. These findings help to understand the dynamic interaction of computer vision and product design, driving future research and innovation in this rapidly growing sector.