
Johan Kildal
- Researcher at TEKNIKER
About
- Publications: 74
- Reads: 27,824
- Citations: 1,800
Current institution
- TEKNIKER
Additional affiliations
- March 2008 - present
Publications (74)
Safety is the main concern in human-robot collaboration (HRC) in work environments. Standard safety measures based on reducing robot speed affect productivity of collaboration, and do not inform workers adequately about the state of the robot, leading to stressful situations due to uncertainty. To grant the user control over safety, we investigate...
For smooth human–robot cooperation, it is crucial that robots understand social cues from humans and respond accordingly. Contextual information provides the human partner with real-time insights into how the robot interprets social cues and what action decisions it makes as a result. We propose and implement a novel design for a human–robot cooper...
Collaborative robots are increasing their presence in the industrial production context due to their flexibility in the tasks they can perform. However, programming and teleoperating them could be a difficult task from the perspective of an unqualified worker. In cases where the worker has to collaborate remotely with a robotic arm, the probability...
Collaborative robots, designed to work alongside humans in industrial manufacturing, are becoming increasingly prevalent. These robots typically monitor their distance from workers and slow down or stop when safety thresholds are breached. However, this results in reduced task execution performance and safety-related uncertainty for the worker. To...
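As a rough illustration of the speed-and-separation behaviour described above, the sketch below scales a cobot's nominal speed with the measured human-robot distance. The thresholds, the linear scaling law and the function name are assumptions chosen for illustration, not the system studied in this work.

```python
# Illustrative sketch of speed-and-separation monitoring for a cobot.
# All thresholds and the linear scaling law are assumed values.

def speed_scale(distance_m: float,
                stop_dist: float = 0.5,
                full_speed_dist: float = 1.5) -> float:
    """Return a factor in [0, 1] to scale the robot's nominal speed.

    Below `stop_dist` the robot stops; beyond `full_speed_dist` it runs at
    full speed; in between, speed is scaled linearly with distance.
    """
    if distance_m <= stop_dist:
        return 0.0
    if distance_m >= full_speed_dist:
        return 1.0
    return (distance_m - stop_dist) / (full_speed_dist - stop_dist)


# Example: a worker detected 0.9 m away reduces speed to 40% of nominal.
print(speed_scale(0.9))  # -> 0.4
```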
Many industries are transitioning to Industry 4.0 production models by adopting robots in their manufacturing processes. In parallel, Extended Reality (XR) technologies have reached sufficient maturity to enter the industrial applications domain, with early success cases often related to training workers, remote assistance, access to contextual inf...
It is difficult to estimate the boundaries of the hazard zones generated around autonomous machines and robots when navigating a space shared with them. We investigated the use of multimodal (auditory and/or visual) mixed-reality (MR) displays to warn users about invading such hazard zones and to help them return to safety. Two single-modality aud...
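The kind of hazard-zone warning described above can be sketched as a simple proximity policy: detect when the user has crossed the zone boundary and derive an urgency value that could drive auditory volume or visual salience. The circular zone geometry, radius and urgency mapping below are assumptions, not the MR designs evaluated in the study.

```python
# Minimal sketch of a hazard-zone warning policy (assumed geometry and
# warning levels; not the MR system evaluated in the study).
import math

def hazard_warning(user_xy, robot_xy, hazard_radius_m=2.0):
    """Return (inside_zone, urgency), where urgency grows from 0 at the
    zone boundary to 1 at the robot's position."""
    dist = math.dist(user_xy, robot_xy)
    if dist >= hazard_radius_m:
        return False, 0.0
    urgency = 1.0 - dist / hazard_radius_m
    return True, urgency

inside, urgency = hazard_warning((1.2, 0.5), (0.0, 0.0))
if inside:
    # e.g. drive the volume of an auditory cue and the opacity of a visual
    # overlay from `urgency`, and point the user back towards the boundary.
    print(f"warn user, urgency={urgency:.2f}")
```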
Nowadays, modern industry has adopted robots as part of their processes. In many scenarios, such machines collaborate with humans to perform specific tasks in their same environment or simply guide them in a natural, safe and efficient way. Our approach improves a previously conducted work on a multi-modal human-robot interaction system with differ...
Assembly production activities consist of arrays of actions many of which can be executed by workers with cognitive disabilities (WCDs). However, figuring out sequences of assembly from technical documentation is an insurmountable barrier for most WCDs to access assembly jobs. Partnering with a collaborative robot (cobot) can provide the assistance...
In addition to production-oriented robots, service robots with social skills can also perform a role in industrial environments, providing on-demand ancillary services that support production activities. In this paper, a robot that provides way-finding services within an industrial facility (e.g., finding a person or place) and is able to naturally...
In the context of industrial production, a worker who wants to program a robot using the hand-guidance technique needs the robot to be available for programming and not in operation. This means that production with that robot is stopped during that time. A way around this constraint is to perform the same manual guidance steps on a holographic...
The ACE Factories White Paper on 'Human-centred factories from theory to industrial practice. Lessons learned and recommendations'
http://ace-factories.eu/wp-content/uploads/ACE-Factories-White-Paper.pdf
" Navigating a space populated by fenceless industrial robots while carrying out other tasks can be stressful, as the worker is unsure about when she is invading the area of influence of a robot, which is a hazard zone. Such areas are difficult to estimate and standing in one may have consequences for worker safety and for the productivity of the r...
Collaborative robots and personalised human-automation load balancing strategies may be able to empower workers with cognitive disabilities to carry out complex assembly tasks. To design such assembly cells, it is necessary to investigate the human factors associated with this user group in relation to the technology proposed. We present a qualitat...
People who have cognitive disabilities can achieve personal fulfilment and social integration when they access the job market. In the case of working on industrial assembly lines, they can perform at the highest standards when assembly sub-tasks have been adequately adapted. However, the arrival of new production paradigms for the factory of the fu...
Museum guide robots (as examples of social robots for public spaces) have been recurrently chosen for research in robotics and HRI. As a result of this, guide robots have developed many skills for autonomy and social interaction, but very few people have the opportunity to enjoy their services: only the visitors that are collocated with them at the museum. We pro...
Collaborative human-robot multi-component systems intended for industrial processes are designed and implemented to meet requirements of productivity in the process and safety in the workplace. If these systems are to gain broad adoption and acceptance in the workplace, it is also essential that they provide the best possible user experience (UX) f...
In industrial collaborative robotics, operators and robots perform complex tasks working together without physical barriers. Under this premise, the availability of a flexible, robust and fast interaction system between the robot and the workers is a necessity. Human beings use voice and gestures to achieve a natural interaction. Taking into accoun...
Human-robot collaboration is a key factor for the development of factories of the future, a space in which humans and robots can work and carry out tasks together. Safety is one of the most critical aspects in this collaborative human-robot paradigm. This article describes the experiments done and results achieved by the authors in the context of t...
This article presents a semantic approach for multimodal interaction between humans and industrial robots to enhance the dependability and naturalness of the collaboration between them in real industrial settings. The fusion of several interaction mechanisms is particularly relevant in industrial applications in which adverse environmental conditio...
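One common way to realise this kind of multimodal fusion is a late-fusion rule that weights the confidences of the individual recognisers; the sketch below is a minimal illustration under that assumption. The weights, threshold and command labels are made up for the example and do not reproduce the paper's semantic approach.

```python
# Hedged sketch of a simple late-fusion rule for combining speech and
# gesture recognition results (weights and threshold are assumptions).

def fuse(speech: dict, gesture: dict, w_speech=0.6, w_gesture=0.4,
         threshold=0.5):
    """Each input maps command labels to confidences in [0, 1].
    Returns the best-scoring command, or None if nothing is confident enough."""
    commands = set(speech) | set(gesture)
    scored = {c: w_speech * speech.get(c, 0.0) + w_gesture * gesture.get(c, 0.0)
              for c in commands}
    best = max(scored, key=scored.get)
    return best if scored[best] >= threshold else None

# Noisy speech ("stop" vs "top") disambiguated by a clear stop gesture.
print(fuse({"stop": 0.55, "top": 0.45}, {"stop": 0.9}))  # -> "stop"
```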
While good physical health receives more attention, psychological wellbeing is an essential component of a happy existence. An everyday source of psychological wellbeing is the voluntary practice of skilled activities one is good at. Taking musical creation as one such skilled activity, in this work we employ an interaction method to monitor varyin...
The level of engagement of a musician performing on an instrument is related to the degree of satisfaction derived from that activity. With our work, we aim to assist musicians performing live on a new musical instrument, Network of Interactive Sonic Agents (NOISA), by helping them maintain or increase their level of engagement with the activity. T...
One of the reasons of why some musical instruments more successfully continue their evolution and actively take part in the history of music is partially attributed to the existing compositions made specifically for them, pieces that remain and are still played over a long period of time. This is something we know, performing these compositions kee...
This paper describes the development and evaluation of Undulating Covers (UnCovers), mobile interfaces that can change their surface texture to transmit information. The Pin Array UnCover incorporates sinusoidal ridges controlled by servomotors, which can change their amplitude and granularity. The Mylar UnCover is a more organic interface that exp...
This paper presents the creation of two music pieces using motivic through-composition techniques applied to a previously presented NIME: a Network of Intelligent Sound Agents. The compositions take advantage of the characteristics of the system, which is designed to monitor, predict and react to the performer's level of engagement. In the set of v...
In this paper we present the new development of a semi-autonomous response module for the NOISA system. NOISA is an interactive music system that predicts performer's engagement levels, learns from the performer, decides what to do and does it at the right moment. As an improvement for the above, we implemented real-time adaptive features that resp...
Physical representations of data have existed for thousands of years. Yet it is now that advances in digital fabrication, actuated tangible interfaces, and shape-changing displays are spurring an emerging area of research that we call Data Physicalization. It aims to help people explore, understand, and communicate data using computer-supported phy...
Physical representations of data have existed for thousands of years. However, it is only now that advances in digital fabrication, actuated tangible interfaces, and shape-changing displays can support the emerging area of 'Data Physicalization' [6]: the study of computer-supported, physical representations of data and their support for cognition,...
Shape-retaining freely-deformable interfaces can take innumerable distinct shapes, and creating specific target configurations can be a challenge. In this paper, we investigate how audio can guide a user in this process, through the use of either musical or metaphoric sounds. In a formative user study, we found that sound encouraged action possibil...
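A minimal way to picture audio guidance towards a target shape is a tone whose pitch rises as the current deformation approaches the target configuration. The sketch below assumes shapes are vectors of normalised bend-sensor readings; the mapping and ranges are illustrative, not the study's actual sound designs.

```python
# Illustrative sketch of audio guidance towards a target shape: the closer
# the current deformation is to the target, the higher the guiding tone.
import math

def guidance_pitch(current_shape, target_shape, f_near=880.0, f_far=220.0):
    """Shapes are vectors of bend-sensor readings normalised to [0, 1].
    Returns a tone frequency that rises as the user approaches the target."""
    error = math.sqrt(sum((c - t) ** 2 for c, t in zip(current_shape, target_shape)))
    max_error = math.sqrt(len(target_shape))      # worst case: every sensor off by 1
    closeness = 1.0 - min(error / max_error, 1.0)
    return f_far + closeness * (f_near - f_far)

print(round(guidance_pitch([0.2, 0.8, 0.5], [0.3, 0.7, 0.5])))  # close -> high tone
```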
While smartphones are increasing in size and complex features, new form factors for simple communication devices are emerging. In this paper, we present the design process for a wrist worn communication device, which enables the user to send text messages over a paired mobile phone. The process includes concept design, user evaluation, design itera...
Clean and minimalistic industrial designs dominate current multi-device ecosystems. One intended feature of such designs is that a device can be picked up and used in any orientation. However, the persistent presence of a few physical buttons forces most users to look for them by rotating the device before using it. We propose that tangible but vir...
Device deformation allows new types of gestures to be used in interaction. We identify that the gesture/use-case pairings proposed by interaction designers are often driven by factors relating to improved tangibility, spatial directionality and strong metaphorical bonds. With this starting point, we argue that some of the designs may not make use of t...
The four projects shown here are from CHI 2013 Interactivity in Paris, France, selected to reflect the diversity of approaches to embodied interaction. HCI research commonly features contributions more keenly experienced through embodied engagement rather than via a presentation or video. CHI Interactivity provides the venue for such work, allowing...
Augmented reality (AR) systems enable new user experiences while the user is interacting with virtual objects in the physical space. The virtual objects have mostly been presented visually, overlaid on the physical world. In this paper, we present an explorative user study of a prototype system AHNE with the aim to understand the user experience an...
We present a study that investigates the potential of combining, within the same interaction cycle, deformation and touch input in a handheld device. Using a flexible, input-only device connected to an external display, we compared a multitouch input technique and two hybrid deformation-plus-touch input techniques (bending and twisting the device,...
Display technology developments mean the next generation of visual output devices will extend beyond the rigid, flat surfaces with which we are familiar to those that the user or the machine can deform. These will allow users to physically push, pull, bend, fold or flex the display and facilitate a range of self-deformation to better represent on-s...
Deformable User Interfaces (DUIs) often require external confirmation of the status of the interface, which is normally provided visually. We propose that tactile cues can also be employed for this end. In a user study that presents both visual and tactile cues in redundancy, we found that both channels can be combined with no loss in user experien...
MARSUI is a hardware deformable prototype exhibiting plastic (shape-retaining) behavior. It can track the shape that the user creates when deforming it. We envision that a set of predefined shapes could be mapped onto particular applications and functions. In its current implementation, we present three shapes that MARSUI can be deformed into: circ...
An overview of emerging topics, theories, methods, and practices in sonic interactive design, with a focus on the multisensory aspects of sonic experience. Sound is an integral part of every user experience but a neglected medium in design disciplines. Design of an artifact's sonic qualities is often limited to the shaping of functional, representa...
There has been little discussion on how the materials used to create deformable devices, and the subsequent interactions, might influence user performance and preference. In this paper we evaluated how the stiffness and required deformation extent (bending up and down bi-manually) of mobile phone-shaped deformable devices influenced how precisely p...
While gesture taxonomies provide a classification of device-based gestures in terms of communicative intent, little work has addressed the usability differences in manually performing these gestures. In this primarily qualitative study, we investigate how two sets of iconic gestures that vary in familiarity, mimetic and alphabetic, are affected und...
Technological developments in display technologies allow us to explore the design of mobile devices that extend beyond the rigid, flat screen surfaces with which we are familiar. The next generation mobile devices will instead include deformable displays that users can physically push, pull, bend or flex or have those actions performed by the devic...
Deformable User Interfaces (DUIs) are increasingly being proposed for new tangible and organic interaction metaphors and techniques. To design DUIs, it is necessary to understand how deforming different materials manually using different gestures affects performance and user experience. In the study reported in this paper, three DUIs made of deform...
Interactive virtual environments are often focused on visual representation. This study introduces embodied and eyes-free interaction with audio-haptic navigation environment (AHNE) in a 3-dimensional space. AHNE is based on an optical tracking algorithm that makes use of Microsoft-Kinect and virtual objects are presented by dynamic audio-tactile c...
Traditionally only speech communicates emotions via mobile phone. However, in daily communication the sense of touch mediates emotional information during conversation. The present aim was to study if tactile stimulation affects emotional ratings of speech when measured with scales of pleasantness, arousal, approachability, and dominance. In the Ex...
Kooboh is a handheld, tangible user interface (TUI) that can display various mechanical properties to the hand that grasps it. Made of rigid material, its cuboid shape and size never change. However, when manually squeezed or pressed, the user perceives in the hand that the object is compliant and deformable. This is achieved through a non-visual h...
Current mobile navigation systems often require visual attention. This may lead to both inconvenient and unsafe use while walking. In this paper, we are introducing orientation inquiry, a new haptic interaction technique for non-visual pedestrian navigation. In a pilot experiment, the orientation inquiry technique was compared to tactile icons used...
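The orientation inquiry idea can be illustrated with a small check: the device vibrates only when its heading points roughly towards the next waypoint. The flat-plane bearing computation and the 15° tolerance below are assumptions made for the sketch.

```python
# Sketch of an "orientation inquiry"-style check: vibrate only when the
# device points roughly towards the next waypoint (illustrative values).
import math

def bearing_deg(from_xy, to_xy):
    """Bearing from one local x/y position to another, in degrees [0, 360)."""
    dx, dy = to_xy[0] - from_xy[0], to_xy[1] - from_xy[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

def points_at_target(device_heading_deg, user_xy, target_xy, tolerance_deg=15.0):
    diff = abs(device_heading_deg - bearing_deg(user_xy, target_xy)) % 360.0
    diff = min(diff, 360.0 - diff)          # smallest angular difference
    return diff <= tolerance_deg            # True -> trigger a vibration pulse

print(points_at_target(40.0, (0.0, 0.0), (10.0, 10.0)))  # bearing is 45 deg -> True
```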
We present an introduction to the user centered research that we are conducting using functional deformable research prototypes. This work has recently crystallized in the demonstration of the Nokia Kinetic Device (figure 1). In the large design space that opens before us around deformable user interfaces, we have chosen to focus on mobile personal...
This paper investigates how the perceived physicality of the action of applying force with a finger on a rigid surface (such as on a force-sensing touch screen) can be enhanced using real-time synthesized audio feedback. A selection of rich and evocative audio designs was used. Additionally, audio-tactile cross-modal integration was encouraged, by...
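A simple way to realise force-driven audio feedback of this kind is to map the sensed finger force to the loudness and brightness of a synthesized sound. The parameter ranges and the specific mapping below are assumptions, not the audio designs used in the study.

```python
# Sketch of force-driven audio feedback for pressing on a rigid surface:
# finger force is mapped to gain and filter cutoff of a synthesizer voice.
# Parameter ranges are assumed for illustration.

def press_audio_params(force_n: float, max_force_n: float = 8.0):
    """Map finger force to (gain, cutoff_hz) for a simple synthesizer voice."""
    x = min(max(force_n / max_force_n, 0.0), 1.0)   # normalised press depth
    gain = x                                        # louder as force grows
    cutoff_hz = 200.0 + x * 3800.0                  # brighter as force grows
    return gain, cutoff_hz

for f in (1.0, 4.0, 8.0):
    gain, cutoff = press_audio_params(f)
    print(f"force={f:.0f} N -> gain={gain:.2f}, low-pass cutoff={cutoff:.0f} Hz")
```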
Creating realistic virtual friction forces requires using complex hardware setups. In simpler mobile systems, friction is often suggested by mimicking textures with vibration, based on the position on the screen. Even in the simplest implementations, this paper proposes that force sensing should also be used to modulate vibration. In this way, Coul...
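The proposal that force sensing should modulate vibration can be sketched with a Coulomb-like rule: vibration amplitude proportional to an assumed friction coefficient times the normal force, gated by finger motion. All constants below are illustrative.

```python
# Minimal sketch of force-modulated friction rendering: vibration amplitude
# follows a Coulomb-like law (amplitude proportional to mu * normal force),
# gated by finger motion. Constants and clipping range are assumed values.

def friction_vibration(normal_force_n: float,
                       finger_speed_mm_s: float,
                       mu: float = 0.4,
                       gain: float = 0.25) -> float:
    """Return a vibration amplitude in [0, 1] for the current touch sample."""
    if finger_speed_mm_s < 1.0:          # no sliding, no friction cue
        return 0.0
    amplitude = gain * mu * normal_force_n
    return min(max(amplitude, 0.0), 1.0)

# Pressing harder while sliding produces a stronger vibrotactile cue.
print(round(friction_vibration(2.0, 30.0), 2))   # -> 0.2
print(round(friction_vibration(6.0, 30.0), 2))   # -> 0.6
```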
The suitability of current haptic three-dimensional user interface (3D-UI) technologies is low for mobile interaction. 3D-Press is reviewed in this paper: a technique to create the haptic illusion that a rigid surface feels compliant when pressed. The fact that the illusion is intramodal (only haptics is involved in creating it), and that the techn...
This paper reports a new intramodal haptic illusion. This illusion involves a person pressing on a rigid surface and perceiving that the surface is compliant, i.e. perceiving that the contact point displaces into the surface. The design process, method and conditions used to create this illusion are described in detail. A user study is also reporte...
Sonification of data via the mapping of values to frequency of sound is an auditory data analysis technique commonly used to display graph information. The goal for any form of graph is to display numerical information with accuracy and neutrality while exploiting perceptual and cognitive processes. Conveying information in frequency of sound is su...
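The value-to-frequency mapping underlying this kind of sonification can be sketched as follows; the frequency range and the logarithmic (equal-pitch-step) spacing are assumptions chosen for illustration.

```python
# Sketch of the value-to-frequency mapping used in graph sonification.
# The frequency range and logarithmic spacing are illustrative assumptions.

def value_to_frequency(value, v_min, v_max, f_min=120.0, f_max=1200.0):
    """Map a data value to a frequency in Hz, spacing values evenly in pitch
    (i.e. logarithmically in frequency) so equal data steps sound equal."""
    t = (value - v_min) / (v_max - v_min)           # normalise to [0, 1]
    return f_min * (f_max / f_min) ** t

data = [3, 7, 12, 9, 15]
tones = [value_to_frequency(v, min(data), max(data)) for v in data]
print([round(f, 1) for f in tones])   # one tone per data point, played in sequence
```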
This thesis investigates the problem of obtaining overview information from complex tabular numerical data sets non-visually. Blind and visually impaired people need to access and analyse numerical data, both in education and in professional occupations. Obtaining an overview is a necessary first step in data analysis, for which current non-visual...
Exploring any new data set always starts with gathering overview in- formation. When this process is done non-visually, interactive sonification techniques have proved to be effective and efficient ways of getting overview information, particularly for users who are blind or visually impaired. Under certain conditions, however, the process of data...
In non-visual interfaces, using non-speech audio can be a more effective and efficient way of obtaining overview information than using speech. However, users who are blind regularly use speech-based tools to access information in computers, and often prefer this technology over others that pose steeper learning curves. This paper proposes a techni...
Obtaining an overview is an important first step in the analysis of data sets, which cannot be easily done non-visually with current accessibility tools. We present TableVis, a multimodal interface to obtain overview information from numerical data tables non-visually, with the use of an interactive sonification technique controlled from a tangible...
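A TableVis-style overview pass can be pictured as reducing each row of a numeric table to one tone whose pitch encodes a row aggregate, so that a quick sweep conveys the overall pattern. The aggregation (row mean) and the pitch range in the sketch below are assumptions, not the exact design evaluated with the interface.

```python
# Hedged sketch of an overview pass over a numeric table: each row becomes
# one tone whose pitch reflects the row mean (assumed design choices).

def row_overview_pitches(table, f_min=200.0, f_max=1000.0):
    """Return one frequency (Hz) per row, proportional to the row mean."""
    means = [sum(row) / len(row) for row in table]
    lo, hi = min(means), max(means)
    span = (hi - lo) or 1.0                      # avoid division by zero
    return [f_min + (m - lo) / span * (f_max - f_min) for m in means]

table = [[2, 4, 3], [8, 9, 7], [5, 5, 6]]
print([round(f) for f in row_overview_pitches(table)])  # low, high, mid tones
```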
This paper describes the design and preliminary testing of an interface to obtain overview information from complex numerical data tables non-visually, which is something that cannot be done with currently available accessibility tools for the blind and visually impaired users. A sonification technique that hides detail in the data and highlights i...
TableVis was developed to support computer users who are blind or visually impaired in tasks that involve obtaining quick overviews of tabular data sets. Previous work has covered the evaluation of this interface and its associated techniques of interactive data sonification and support exploratory processes. This paper examines the exploratory str...
Tackling the problem of browsing complex tabular numerical data sets non-visually, TableVis is an interface that provides techniques to obtain quick overviews through data sonification, as well as to obtain details on demand in speech. For certain overview tasks and some data configurations, the need for external memory aids was identified. This pa...
This paper proposes that current approaches to haptic graph visualisation (by simply presenting a simple haptic equiva-lent of a visual graph) are inadequate. We present a short background to graph theory perception and identify the prob-lems of current graph haptic visualisation. We then propose ideas influenced by both information visualisation r...
A very common way of presenting numerical information is organising it in 2-dimensional (2D) tables. When a user first approaches such a data table, he/she usually wants to quickly skim through it to get an overall idea about the data contained in it. Then the user will typically proceed to analyse more in detail the most interesting data pieces th...
When first approaching a two-dimensional (2D) data table, a user often wants to get a quick overview of the data before analysing them in more detail. Blind and visually impaired people cannot do this task at all using visual displays. Furthermore, speech synthesisers do not help sufficiently with browsing and pattern identification. The approach t...