Conference Paper

La plasticité des IHM en action : un exemple de téléprocédure plastique [UI plasticity in action: an example of a plastic e-government procedure]

DOI: 10.1145/1629826.1629887 Conference: Proceedings of the 21st International Conference of the Association Francophone d'Interaction Homme-Machine, Grenoble, France, October 13-16, 2009
Source: DBLP

ABSTRACT The plasticity property was defined ten years ago in Human-Computer Interaction. It denotes the capacity of user interfaces (UIs) to adapt to the context of use while preserving user-centered properties. This paper presents a first real application: a plastic e-service for declaring incidents in public areas. The development was funded by the French ANR project MyCitizSpace (2007--2010), devoted to the dematerialisation of French government procedures. All the dimensions of plasticity are exemplified.
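The kind of adaptation the abstract describes can be pictured with a minimal sketch: the same abstract form is remolded to the platform at hand while a user-centered property (no field is lost) is preserved. All names below are illustrative assumptions, not the MyCitizSpace implementation.

```python
# Hypothetical sketch of UI plasticity: one abstract incident-declaration
# form, rendered differently per context of use (here, the platform),
# while preserving a user-centered property (every field stays reachable).

ABSTRACT_UI = ["location", "incident_type", "description", "photo"]

def render(platform):
    """Map the abstract form onto a concrete layout for a platform."""
    if platform == "desktop":
        # Large screen: a single page showing all fields at once.
        return [ABSTRACT_UI]
    # Small screen: paginate, one field per page, but drop nothing.
    return [[field] for field in ABSTRACT_UI]

def flatten(pages):
    return [field for page in pages for field in page]

# The preserved property: both renderings expose exactly the same fields.
assert flatten(render("desktop")) == flatten(render("phone")) == ABSTRACT_UI
```

The design point is the separation between the abstract UI and its concrete renderings: adaptation chooses a rendering, and the preserved property is checked against the abstract description rather than any single layout.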

  • ABSTRACT: The migration of a user interface (UI) is the action of transferring a UI from one device to another, for example from a desktop computer to a handheld device. A UI is said to be migratable if it has the ability to migrate. This paper describes how the QTk toolkit has been extended to provide migratable UIs, and the application programming interface (API) offered to developers. Basically, an indirection layer has been introduced between the application and the actual representation of the UI. The migration of a UI is achieved by first creating a clone of the state of the site displaying the UI, then by changing the indirection to point to this clone. The API provides a way to specify at construction time whether (the entirety of) a window can be migrated. A migratable window returns a universal reference that can be given to any site with which a network connection is possible. This reference can be used by a receiver widget to migrate the window there. Interestingly, a migratable window can itself contain a receiver widget configured to display the content of another migratable window: all windows are transparently migrated. Also, a window (stationary or migratable) may contain one or more receiver widgets: it is thus possible to dynamically compose a UI from several different UIs.
    The First Annual International Conference on Mobile and Ubiquitous Systems: Networking and Services (MOBIQUITOUS 2004); 09/2004
  • ABSTRACT: Pervasive devices are becoming popular and smaller. Such mobile systems should be able to adapt to changing requirements and execution environments. This requires the ability to reconfigure deployed code, which is considerably simplified if applications are component-oriented rather than monolithic blocks of code. We therefore propose a middleware approach called WComp, which federates an event-driven, component-oriented approach to compose services for devices. This approach is coupled with adaptation mechanisms dealing with separation of concerns. In these mechanisms, aspects (called Aspects of Assembly) are selected either by the user or by a self-adaptive process and composed by a weaver with logical merging of high-level specifications. The result of the weaver is then projected in terms of pure elementary modifications of component assemblies, with respect to the black-box properties of COTS components. Our approach is validated by analyzing the results of different experiments drawn from sets of randomly generated application configurations and by showing its advantages while evaluating the additional cost on reaction time to context changes.
    01/2007;
  • ABSTRACT: This paper describes an environment able to support migratory multimodal interfaces in multi-device environments. We introduce the software architecture and the device-independent languages used by our tool, which provides services enabling users to move about freely, change device, and continue the current task from the point where they left off on the previous device. Our environment currently supports interaction with applications through graphical and vocal modalities, either separately or together. Such applications are implemented in Web-based languages. We discuss how the features of the device at hand, desktop or mobile, are considered when generating the multimodal user interface.
    Proceedings of the 7th International Conference on Multimodal Interfaces, ICMI 2005, Trento, Italy, October 4-6, 2005; 01/2005
