ABSTRACT: Immersive environments that approximate natural interaction with physical 3D objects are designed to increase the user's sense of presence and improve performance by allowing users to transfer existing skills and expertise from real to virtual environments. However, limitations of current Virtual Reality technologies, e.g., low-fidelity real-time physics simulations and tracking problems, make it difficult to ascertain the full potential of finger-based 3D manipulation techniques. This paper decomposes 3D object manipulation into its component movements, taking into account both physical constraints and mechanics. We fabricate five physical devices that simulate these movements in a measurable way under experimental conditions. We then implement the devices in an immersive environment and conduct an experiment to evaluate direct finger-based against ray-based object manipulation. The key contribution of this work is the careful design and creation of physical and virtual devices to study physics-based 3D object manipulation in a rigorous manner in both real and virtual setups.
ABSTRACT: Interactive paper technologies offer new opportunities for supporting the highly individual practices of creative artists, such as contemporary music composers, who express and explore their ideas on both paper and the computer. We introduce PaperComposer, a graphical interface builder that allows users to create a personalized interactive paper interface that they can connect to their own computer-based musical data. We also present an API that facilitates the development of interactive paper components for PaperComposer. We describe one public demonstration of a novel musical interface designed for children and our collaborations with composers to create two novel interactive music interfaces that reflected their individual composition styles.
ABSTRACT: Contemporary music composition is a highly creative and disciplined activity that requires free expression of ideas and sophisticated computer programming. This paper presents a technique for structured observation of expert creative behavior, as well as Polyphony, a novel interface for systematically studying all phases of computer-aided composition. Polyphony is a unified user interface that integrates interactive paper and electronic user interfaces for composing music. It supports fluid transitions between informal sketches and formal computer-based representations. We asked 12 composers to use Polyphony to compose an electronic accompaniment to a 20-second instrumental composition by Anton Webern. All successfully created a complete, original composition in an hour and found the task challenging but fun. The resulting dozen comparable snapshots of the composition process reveal how composers both adapt and appropriate tools in their own way.
ABSTRACT: The advent of ultra-high resolution wall-size displays and their use for complex tasks require a more systematic analysis and deeper understanding of their advantages and drawbacks compared with desktop monitors. While previous work has mostly addressed search, visualization and sense-making tasks, we have designed an abstract classification task that involves explicit data manipulation. Based on our observations of real uses of a wall display, this task represents a large category of applications. We report on a controlled experiment that uses this task to compare physical navigation in front of a wall-size display with virtual navigation using pan-and-zoom on the desktop. Our main finding is a robust interaction effect between display type and task difficulty: while the desktop can be faster than the wall for simple tasks, the wall gains a sizable advantage as the task becomes more difficult. A follow-up study shows that other desktop techniques (overview+detail, lens) do not perform better than pan-and-zoom and are therefore slower than the wall for difficult tasks.
ABSTRACT: Effectively planning a large multi-track conference requires an understanding of the preferences and constraints of organizers, authors, and attendees. Traditionally, the onus of scheduling the program falls on a few dedicated organizers. Resolving conflicts becomes difficult due to the size and complexity of the schedule and the lack of insight into community members' needs and desires. Cobi presents an alternative approach to conference scheduling that engages the entire community in the planning process. Cobi comprises (a) community-sourcing applications that collect preferences, constraints, and affinity data from community members, and (b) a visual scheduling interface that combines community-sourced data and constraint-solving to enable organizers to make informed improvements to the schedule. This paper describes Cobi's scheduling tool and reports on a live deployment for planning CHI 2013, where organizers considered input from 645 authors and resolved 168 scheduling conflicts. Results show the value of integrating community input with an intelligent user interface to solve complex planning tasks.
Proceedings of the 26th annual ACM symposium on User interface software and technology; 10/2013
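The conflict-detection step that Cobi's scheduling interface relies on can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not Cobi's actual code: the function name `find_author_conflicts` and the data shapes are invented for illustration, showing only the simplest class of conflict (one author with two papers in concurrent sessions).

```python
# Hypothetical sketch of one conflict class a Cobi-like tool must detect:
# papers sharing an author that are scheduled into the same time slot.
from collections import defaultdict

def find_author_conflicts(schedule, paper_authors):
    """schedule: paper id -> (session, time slot).
    paper_authors: paper id -> set of author names.
    Returns pairs of papers that share an author and a time slot."""
    by_slot = defaultdict(list)
    for paper, (_session, slot) in schedule.items():
        by_slot[slot].append(paper)

    conflicts = []
    for papers in by_slot.values():
        for i, a in enumerate(papers):
            for b in papers[i + 1:]:
                # Set intersection: any author on both papers is a conflict.
                if paper_authors[a] & paper_authors[b]:
                    conflicts.append((a, b))
    return conflicts
```

A scheduler would surface such pairs to organizers, who then move one of the papers to a non-overlapping slot; the real system also weighs community-sourced affinity and preference data, which this sketch omits.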
ABSTRACT: This paper presents Arpège, a progressive multitouch input technique for learning chords, as well as a robust recognizer and guidelines for building large chord vocabularies. Experiment one validated our design guidelines and suggests implications for designing vocabularies, i.e., users prefer relaxed to tense chords, chords with fewer fingers, and chords with fewer tense fingers. Experiment two demonstrated that users can learn and remember a large chord vocabulary with both Arpège and cheat sheets, and that Arpège encourages the creation of effective mnemonics.
Proceedings of the 2013 ACM international conference on Interactive tabletops and surfaces; 10/2013
ABSTRACT: As computing environments that combine multiple displays and input devices become more common, the need for applications that take advantage of these capabilities becomes more pressing. However, few applications are designed to support such multi-surface environments. We investigate how to adapt existing applications without access to their source code. We introduce HydraScope, a framework for transforming existing web applications into meta-applications that execute and synchronize multiple copies of applications in parallel, with a multi-user input layer for interacting with them. We describe the HydraScope architecture, validated with five meta-applications.
Proceedings of the 2nd ACM International Symposium on Pervasive Displays; 06/2013
ABSTRACT: CHI 2013 offers over 500 separate events including paper presentations, panels, courses, case studies and special interest groups. Given the size of the conference, it is no longer practical to host live summaries of these events. Instead, a 30-second Video Preview summary of each event is available. The CHI'13 Interactive Schedule helps attendees navigate this wealth of video content in order to identify events they would like to attend. It consists of a number of large display screens throughout the conference venue which cycle through a video playlist of events. Attendees can interact with these displays using their mobile devices by either constructing custom video playlists or adding on-screen content to their personal schedule.
CHI '13 Extended Abstracts on Human Factors in Computing Systems; 04/2013
ABSTRACT: There are many visions that touch on the future of human-computer interaction, from a trans-human future to a post-technological UI. However, visions related to the progress of technology are not new. Creative and insightful visionaries from Denis Diderot to Vannevar Bush have been postulating visions of possible futures or technologies for centuries. Some idealised views end up discredited with advances in knowledge, while others now appear remarkably prescient. The question is, do visions and the process of creating them have a place in CHI, or are they simply flights of fancy? This SIG meeting provides a forum for visionaries: researchers and practitioners looking to consider the place and importance of visions within CHI. Can visions, the process of visioning and forming new visions help us refine, advance or develop new research or forms of interaction? And if visions are important to us, then are they part of the regular academic process? If so, should CHI provide venues for publishing new visions?
CHI '13 Extended Abstracts on Human Factors in Computing Systems; 04/2013
ABSTRACT: Creating a good schedule for a large conference such as CHI requires taking into account the preferences and constraints of organizers, authors, and attendees. Traditionally, the onus of planning is placed entirely on the organizers and involves only a few individuals. Cobi presents an alternative approach to conference scheduling that engages the entire community to take active roles in the planning process. The Cobi system consists of a collection of crowdsourcing applications that elicit preferences and constraints from the community, and software that enables organizers and other community members to take informed actions toward improving the schedule based on the collected information. We are currently piloting Cobi as part of the CHI 2013 planning process.
CHI '13 Extended Abstracts on Human Factors in Computing Systems; 04/2013
ABSTRACT: We introduce BodyScape, a body-centric design space that allows us to describe, classify and systematically compare multi-surface interaction techniques, both individually and in combination. BodyScape reflects the relationship between users and their environment, specifically how different body parts enhance or restrict movement within particular interaction techniques, and can be used to analyze existing techniques or suggest new ones. We illustrate the use of BodyScape by comparing two free-hand techniques, on-body touch and mid-air pointing, first separately, then combined. We found that touching the torso is faster than touching the lower legs, since reaching the legs affects the user's balance, and that touching targets on the dominant arm is slower than touching targets on the torso because the user must compensate for the applied force.
Proceedings of the 31st international conference on Human factors in computing systems; 04/2013
ABSTRACT: Teaching abstract concepts is notoriously difficult, especially when we lack concrete metaphors that map to those abstractions. Combinatorix offers a novel approach that combines tangible objects with an interactive tabletop to help students explore, solve and understand probability problems. Students rearrange physical tokens to see the effects of various constraints on the problem space; a second screen displays the associated changes in an abstract representation, e.g., a probability tree. Using participatory design, college students in a combinatorics class helped iteratively refine the Combinatorix prototype, which was then tested successfully with five students. Combinatorix serves as an initial proof-of-concept that demonstrates how tangible tabletop interfaces that map tangible objects to abstract concepts can improve problem-solving skills.
Proceedings of the 2012 ACM international conference on Interactive tabletops and surfaces; 11/2012
ABSTRACT: Tonnetze are space-based musical representations that lay out individual pitches in a regular structure. They are primarily used for analysis with visualization tools or on paper and for performance with button-based tablet or tangible interfaces. This paper first investigates how properties of Tonnetze can be applied in the composition process, including the two-dimensional organization of pitches, based on a chord or on a scale. We then describe PaperTonnetz, a tool that lets musicians explore and compose music with Tonnetz representations by making gestures on interactive paper. Unlike screen-based interactive Tonnetz systems that treat the notes as playable buttons, PaperTonnetz allows composers to interact with gestures, creating replayable patterns that represent pitch sequences and/or chords. We describe the results of an initial test of the system in a public setting, and how we revised PaperTonnetz to better support three activities: discovering, improvising and assembling musical sequences in a Tonnetz. We conclude with a discussion of directions for future research with respect to creating novel paper-based interactive music representations to support musical composition.
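The regular pitch layout underlying a Tonnetz can be sketched concretely. In the common neo-Riemannian form, one axis steps by perfect fifths (+7 semitones) and the other by major thirds (+4 semitones), so neighboring cells form consonant intervals. The snippet below is an illustrative sketch of that structure, not PaperTonnetz code; the function names are invented.

```python
# Sketch of a fifths/thirds Tonnetz grid: the pitch class at (x, y)
# is reached from the origin by x perfect fifths and y major thirds.
NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def tonnetz_pitch(x, y, origin=0):
    """Pitch class (0-11) at grid position (x, y); origin 0 = C."""
    return (origin + 7 * x + 4 * y) % 12

def tonnetz_row(y, width=5):
    """Note names along one row of the grid, e.g. a chain of fifths."""
    return [NAMES[tonnetz_pitch(x, y)] for x in range(width)]
```

A gesture traced across such a grid visits a sequence of cells, which is why a path on paper can stand for a replayable pitch sequence or, when cells are grouped, a chord.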
ABSTRACT: We present paper substrates, interactive paper components that support the creation and manipulation of complex musical data. Substrates take different forms, from whole pages to movable strips, and contain or control typed data representations. We conducted participatory design sessions with five professional musicians with extensive experience with music creation tools. All generated innovative uses of paper substrates, manipulating their data, linking multiple representation layers and creating modular, reusable paper elements. The substrates reflect the structure of their computer-based data, but in a much more flexible and adaptable form. We use their prototypes to provide concrete examples of substrates and identify their roles, properties and functions. Finally, we explore their physical and interaction design with an interactive prototype.
ABSTRACT: Despite the demonstrated benefits of bimanual interaction, most tablet interfaces rely on just one hand for interaction, leaving the other free to support the device. In a preliminary study, we identified five holds that permit simultaneous support and interaction, and noted that users frequently change position to combat fatigue. We then designed the BiTouch design space, which introduces a support function in the kinematic chain model for interacting with hand-held tablets, and developed BiPad, a toolkit for creating bimanual tablet interaction with the thumb or the fingers of the supporting hand. We ran a controlled experiment to explore how tablet orientation and hand position affect three novel techniques: bimanual taps, gestures and chords. Bimanual taps outperformed our one-handed control condition in both landscape and portrait orientations; bimanual chords and gestures did so in portrait mode only; and thumbs outperformed fingers, but were more tiring and less stable. Together, BiTouch and BiPad offer new opportunities for designing bimanual interaction on hand-held tablets.
Proceedings of the 30th international conference on Human factors in computing systems; 05/2012
ABSTRACT: Augmented Reality (AR) has proven useful for guiding operational tasks in professional domains by reducing the shift of attention between instructions and physical objects. Modern smartphones make it possible to use such techniques in everyday tasks, but raise new challenges for the usability of AR in this context: small screens, occlusion, and operation "through a lens". We address these problems by adding real-time feedback to the AR overlay. We conducted a controlled experiment comparing AR with and without feedback, and with standard textual and graphical instructions. Results show significant benefits for mobile AR with feedback and reveal some problems with the other techniques.