Koichiro Tanikoshi’s research while affiliated with Hitachi, Ltd. and other places


Publications (11)


Communi-Board-A Notice Board for Seamless Communication with Dispersed Personnel
  • Article

February 1999 · 10 Reads · IEEJ Transactions on Electronics Information and Systems

Koichiro Tanikoshi · Kimiya Yamaashi · Yoshibumi Fukuda · Masayasu Futakawa
Supervisory systems, such as those in power plants, require users to carry out complex communication tasks. During maintenance periods or system malfunctions, skilled staff members have to go into the field and check equipment directly, while operators in the central control room have to support them. However, it is very difficult for operators to contact the appropriate staff member and foster collaboration, because there is no way to know a staff member's current location or to contact him or her directly. In addition, supervisory system users need several different styles of communication depending on the phase of their tasks. For example, an operator may mediate collaboration between staff members through personal intervention, and may also need the help of other operators for a difficult task. Users therefore have to switch communication styles frequently. We propose an extended user interface model to overcome these problems, and a system based on that model. Communi-Board, a notice board system, provides seamless communication with dispersed staff members through two functions: quick contact with dispersed staff members via chasers that indicate their locations, and automatic switching among communication styles covering all situations with a simple action.
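The abstract does not describe how a chaser picks whom to contact or how styles are switched; the following Python sketch is purely illustrative (the member data, the situation-to-style table, and the nearest-reachable rule are assumptions, not the authors' design) and shows one plausible reading of the two functions.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class StaffMember:
    name: str
    location: tuple        # (x, y) position reported by the chaser, in plant coordinates
    reachable: bool        # whether the member can currently be contacted

# Hypothetical mapping from task situation to communication style;
# the paper only states that styles are switched automatically.
STYLE_BY_SITUATION = {
    "routine_check": "notice_board_message",
    "malfunction": "live_audio_video",
    "coordination": "shared_annotation",
}

def nearest_reachable(members, equipment_location):
    """Return the reachable member closest to the equipment needing attention."""
    candidates = [m for m in members if m.reachable]
    if not candidates:
        return None
    return min(candidates,
               key=lambda m: hypot(m.location[0] - equipment_location[0],
                                   m.location[1] - equipment_location[1]))

def open_channel(members, equipment_location, situation):
    """Pick a staff member and a communication style for the current situation."""
    member = nearest_reachable(members, equipment_location)
    style = STYLE_BY_SITUATION.get(situation, "notice_board_message")
    return member, style

if __name__ == "__main__":
    staff = [StaffMember("Sato", (10.0, 4.0), True),
             StaffMember("Mori", (2.0, 1.5), False)]
    print(open_channel(staff, equipment_location=(9.0, 5.0), situation="malfunction"))
```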


Turning Your Video Monitor into a Virtual Window

January 1998 · 12 Reads · 3 Citations
A video conference system that allows a video attendee to look around an entire conference room simply by moving his or her head is described. In order to locate the attendee's head, a differential image is produced by removing a reference view from the current video image. The orientation of a motorized camera is then determined directly by the head position of the video attendee. The increased affordances of this mechanism are discussed.

I. INTRODUCTION

In conventional video-conference settings, video attendees often feel a lack of presence in meetings because they are only provided with the view from a stationary camera. These disengaged visitors feel as if they are watching the meeting through a peep-hole rather than attending as full participants. To address this problem, we have built a camera control system that uses the video image of a person's head to control the pan, tilt, and zoom of a camera in a remote location. With no equipment besides a video camera and monitor at t...
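As a rough sketch of the differencing step summarized above (assumed image sizes, threshold, and angle ranges; not the authors' implementation), one can subtract a stored reference frame from the current frame, take the centroid of the changed pixels as the head position, and map its offset from the image center to pan and tilt commands.

```python
import numpy as np

def head_offset(reference: np.ndarray, frame: np.ndarray, threshold: int = 30):
    """Estimate head position from a differential image.

    reference, frame: greyscale images as uint8 arrays of equal shape.
    Returns (dx, dy) in [-1, 1], the centroid of changed pixels relative
    to the image center, or None if nothing changed.
    """
    diff = np.abs(frame.astype(np.int16) - reference.astype(np.int16))
    changed = np.argwhere(diff > threshold)          # rows of (y, x) coordinates
    if changed.size == 0:
        return None
    cy, cx = changed.mean(axis=0)
    h, w = frame.shape
    return (cx - w / 2) / (w / 2), (cy - h / 2) / (h / 2)

def to_pan_tilt(offset, max_pan_deg=30.0, max_tilt_deg=20.0):
    """Map a normalized head offset to pan/tilt angles for the remote camera."""
    dx, dy = offset
    return dx * max_pan_deg, -dy * max_tilt_deg      # screen y grows downward

if __name__ == "__main__":
    ref = np.zeros((120, 160), dtype=np.uint8)
    cur = ref.copy()
    cur[20:50, 100:130] = 200                        # simulated head region
    off = head_offset(ref, cur)
    print(off, to_pan_tilt(off))
```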


Talking your way around a conference: a speech interface for remote equipment control.

January 1995 · 6 Reads

Shahir Daya · [...]
Videoconferencing enables people to attend and participate in meetings from remote locations. The key problem faced by electronic attendees is the limited sense of engagement offered by the audio-visual channel. The attendee is typically restricted to a single view of the room and has no ability to interact with presentation technology at the conference site.

As a first step to improving the situation we want to assign electronic attendees a view of the room appropriate to their particular "social roles," which may include presenting a topic, listening to a talk, or participating in a discussion. However, attendees may change roles during a meeting, thus requiring a different position and view more suited to the new role. This involves switching video inputs and outputs to new cameras and monitors.

One possible method to enable video attendees to effect these changes independently is to provide them with the same graphical user interface (GUI) that the central site has to control the equipment. Unfortunately, using state-of-the-art systems for such control is often confusing and complex. Furthermore, this solution requires the attendees to have "extra" computer equipment (i.e. equipment not already required for videoconferencing) and learn how to operate the GUI.

Instead, using speech recognition and video overlay technologies, we are able to provide a non-technical interface to equipment in the meeting room. In doing so, we do not require any extra equipment at the attendees' sites. Our approach provides attendees with the means of controlling their own view of the meeting, changing electronic seats, and manipulating equipment remotely, all through simple voice commands.
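The abstract does not list the actual voice vocabulary; the sketch below only illustrates the general idea of dispatching recognized phrases to equipment commands, with a hypothetical command table.

```python
# Minimal sketch of mapping recognized utterances to equipment commands.
# The phrases and command structure below are hypothetical; the paper does
# not publish its grammar.
COMMANDS = {
    "show the presenter": ("camera", {"preset": "presenter"}),
    "show the whiteboard": ("camera", {"preset": "whiteboard"}),
    "move to seat two": ("seat", {"position": 2}),
    "zoom in": ("camera", {"zoom": "+1"}),
}

def dispatch(utterance: str):
    """Look up a recognized phrase and return (device, parameters), or None."""
    return COMMANDS.get(utterance.strip().lower())

if __name__ == "__main__":
    for phrase in ["Show the presenter", "zoom in", "open the pod bay doors"]:
        print(phrase, "->", dispatch(phrase))
```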



Courtyard: integrating shared overview on a large screen and per-user detail on individual screens

April 1994 · 14 Reads · 40 Citations
The operation of complex real-world systems, such as industrial plants, requires that multiple users cooperate in monitoring and controlling large amounts of information to supervise complex processes. The Courtyard system supports such cooperative work by integrating an overview on a shared large screen and detail on individual screens. This integration is realized by two approaches: (1) providing an implicit way of transferring mouse and keyboard control between the shared and individual screens, and (2) supporting association between the overview on the shared screen and per-user detail on individual screens. Courtyard allows a user to move a mouse pointer between the shared and individual screens as though they were contiguous, and to access per-user detailed information on the user's individual screen simply by pointing to an object on the shared screen. Courtyard selects the detailed information according to the tasks assigned to the pointing user under a division of labor. The former approach results in an interface that is as simple, intuitive, and consistent to use as that for a single screen. The latter enables a user to retrieve, easily and quickly, the detailed information needed to perform the assigned tasks without being distracted by information intended for others.
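A minimal sketch of the two behaviors described above, under an assumed screen geometry and a hypothetical division-of-labor table (neither is taken from the paper): routing the pointer as if the shared and individual screens were contiguous, and selecting per-user detail for a pointed object from the user's assigned role.

```python
# Illustrative sketch, not Courtyard's actual code.

SHARED_WIDTH = 1920        # assumed pixel width of the shared large screen

def route_pointer(x, y):
    """Decide which screen owns the pointer, as if the screens were contiguous."""
    if x < SHARED_WIDTH:
        return ("shared", x, y)
    return ("individual", x - SHARED_WIDTH, y)

# Hypothetical division of labor: which detail view each role gets per object.
DETAIL_BY_ROLE = {
    ("boiler", "temperature_operator"): "boiler_temperature_trend",
    ("boiler", "valve_operator"): "boiler_valve_panel",
}

def detail_for(obj_id, user_role):
    """Pick the detailed view for the pointed object according to the user's tasks."""
    return DETAIL_BY_ROLE.get((obj_id, user_role), "generic_overview")

if __name__ == "__main__":
    print(route_pointer(2200, 300))
    print(detail_for("boiler", "valve_operator"))
```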



Hyperplant: Interaction with Plant through Live Video

July 1993 · 7 Reads · IFAC Proceedings Volumes
This paper proposes an Object-Oriented Video technique, which enables operators to interact with objects in a live video just as they interact with graphic objects, and introduces a prototype system, HyperPlant, implemented to demonstrate the effectiveness of this technique. The Object-Oriented Video technique conveys the reality of the plant to operators in a control center and allows them to work on tasks in a real spatial context. This context helps operators intuitively grasp what they are doing and what is going on as a result of their actions.
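One plausible way to realize the click-on-video interaction, assuming each camera view carries a calibration table from screen regions to plant objects (the paper does not disclose its data structures), is simple hit-testing.

```python
# Hypothetical calibration table:
# (x_min, y_min, x_max, y_max) regions in video coordinates -> plant object IDs.
REGIONS = {
    (100, 60, 180, 140): "valve_12",
    (300, 200, 420, 260): "pump_03",
}

def object_at(x, y):
    """Return the plant object under a clicked video pixel, or None."""
    for (x0, y0, x1, y1), obj in REGIONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return obj
    return None

if __name__ == "__main__":
    print(object_at(150, 100))   # -> 'valve_12'
    print(object_at(10, 10))     # -> None
```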


Fisheye videos: distorting multiple videos in space and time domain according to users' interests

January 1993 · 4 Reads · 2 Citations
Many applications, such as teleconference systems and plant control systems, need to display a large number of videos. In such applications, displaying multiple video windows overwhelms limited computing resources (e.g., network capacity, processing power) because of the vast amount of information involved. This paper describes a technique that allows multiple videos to be displayed within those limited resources by distorting them according to users' interests. Users are not interested in all videos simultaneously: they look at only some of them in detail and take in the rest as global context. The technique, which we call Fisheye Videos, displays videos of interest in more detail while degrading the others, making efficient use of limited computing resources. It distorts a video in the space and time domains (e.g., spatial resolution, frame rate) according to users' interests, which are estimated from window conditions such as a window's distance from the focused window and the amount of its area masked by other windows.
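To make the distortion concrete, here is a small sketch with assumed weights and limits (the paper does not give its exact formula): interest is estimated from the distance to the focused window and the occluded fraction, then mapped to a spatial resolution and a frame rate.

```python
from math import hypot

def interest(window_center, focus_center, masked_fraction,
             distance_scale=800.0, w_dist=0.6, w_mask=0.4):
    """Estimate interest in [0, 1] from distance to the focused window and occlusion."""
    d = hypot(window_center[0] - focus_center[0],
              window_center[1] - focus_center[1])
    distance_term = max(0.0, 1.0 - d / distance_scale)
    visibility_term = 1.0 - masked_fraction
    return w_dist * distance_term + w_mask * visibility_term

def degrade(interest_level, full_size=(640, 480), full_fps=30):
    """Map interest to spatial resolution and frame rate (space and time distortion)."""
    scale = max(0.25, interest_level)            # never drop below quarter size
    w, h = full_size
    return (int(w * scale), int(h * scale)), max(1, int(full_fps * interest_level))

if __name__ == "__main__":
    i = interest(window_center=(900, 500), focus_center=(200, 200), masked_fraction=0.3)
    print(round(i, 2), degrade(i))
```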


Object-Oriented Video: Interaction with Real-World Objects Through Live Video.

January 1992 · 99 Reads · 58 Citations
Graphics and live video are widely employed in remotely controlled systems such as industrial plants. Interaction with live video is, however, more limited than interaction with graphics, because users cannot interact with the objects being observed. Object-Oriented Video techniques are described that allow object-oriented interactions, including the use of real-world objects in live video as reference cues, direct manipulation of those objects, and graphic overlays based on them, enabling users to work in the real spatial context conveyed by the video. Users thereby understand intuitively what they are operating and see the results of their operations.
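As an illustration of the graphic-overlay idea (the object regions, statuses, and label placement are assumptions, not the authors' code), overlays can be generated from the known screen regions of the pictured objects so that the graphics stay registered with the real spatial context.

```python
# Illustrative sketch of overlays anchored to objects visible in the video.

# Object regions in video coordinates, e.g. obtained from camera calibration.
OBJECTS = {
    "valve_12": {"region": (100, 60, 180, 140), "status": "OPEN"},
    "pump_03": {"region": (300, 200, 420, 260), "status": "RUNNING"},
}

def overlays():
    """Return (text, anchor_xy) pairs to composite on top of the video frame."""
    items = []
    for name, info in OBJECTS.items():
        x0, y0, x1, _ = info["region"]
        anchor = ((x0 + x1) // 2, max(0, y0 - 12))   # label just above the object
        items.append((f"{name}: {info['status']}", anchor))
    return items

if __name__ == "__main__":
    for text, pos in overlays():
        print(text, "at", pos)
```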


Direct Manipulation Technique for Plant Control Centers: Do it Through Cameras

December 1991 · 1 Read · 1 Citation · IEEJ Transactions on Industry Applications
This paper proposes a new man-machine technique for operators' consoles at plant control centers: direct manipulation of objects in real motion pictures taken by monitor cameras. Operators can manipulate devices placed in a plant, such as buttons and sliders, with pick and drag operations on the real motion pictures. They can also obtain information related to objects that they pick in the real motion pictures. The virtues of the technique are: (1) operators intuitively understand what they are doing and what is going on as a result of their manipulation, because of the reality of the pictures compared with graphic interfaces; (2) a consistent interface to real motion pictures and to graphics is established, so operators can manipulate objects shown either in real motion pictures or in graphics in the same direct-manipulation manner. A prototype man-machine interface using the technique is presented to show how the technique is applied to plant control centers. © 1991, The Institute of Electrical Engineers of Japan. All rights reserved.
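A minimal sketch of the pick-and-drag idea for a slider pictured by a monitor camera, with assumed region geometry and command format (not the paper's actual interface code).

```python
# Hypothetical geometry of a slider as it appears in the camera image.
SLIDER = {"device": "flow_valve_7", "x": 250, "y_top": 80, "y_bottom": 280}

def drag_to_setpoint(drag_end_y):
    """Convert the vertical drag position on the pictured slider into a 0..100 % setpoint."""
    span = SLIDER["y_bottom"] - SLIDER["y_top"]
    y = min(max(drag_end_y, SLIDER["y_top"]), SLIDER["y_bottom"])
    percent = 100.0 * (SLIDER["y_bottom"] - y) / span
    return {"device": SLIDER["device"], "setpoint_percent": round(percent, 1)}

if __name__ == "__main__":
    print(drag_to_setpoint(130))   # drag near the top -> high setpoint
```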


Citations (5)


... Another technique, called Fisheye Videos, was proposed as early as 1993 by Yamaashi and his colleagues for presenting several simultaneous videos [Yamaashi et al. 1993]. It mainly addressed the problem of that era, namely the limited computing and memory resources available for handling numerous videos. ...

Reference:

Comparison and combination of visual and audio renderings to conceive human-computer interfaces: from human factors to distortion-based presentation strategies
Fisheye videos: distorting multiple videos in space and time domain according to users' interests
  • Citing Conference Paper
  • January 1993

... An early attempt to teleoperate physical objects through live video is the work of Tani et al. in 1992 [12]. They used a common monitor-mouse-keyboard interface to manipulate the live video of real-world objects for remotely controlling an electric power plant, including clicking button images for controlling and dragging the 2D or 3D model of a physical object for positioning. ...

Object-Oriented Video: Interaction with Real-World Objects Through Live Video.
  • Citing Conference Paper
  • January 1992

... • Object locking [51] • Explicit notifications [45,51] • Input merging [22,101,109,111] • Territoriality [51,72,87,116,121,130,133] • Complementary input [10] • Extended field of view [58] • Show Through [4] • Specialized views [1] • Shared control [78] • Augmented group navigation [78] • Shared WIM [9] ...

Courtyard: integrating shared overview on a large screen and per-user detail on individual screens
  • Citing Conference Paper
  • April 1994

... The notion of a new space opening up as the congruence of real and virtual in smart environments can be traced back to the computing sciences literature. Ma et al. (2005) note that since the early 1990s when researchers first introduced concepts such as "reactive environment" (Cooperstock et al., 1995), there have been many definitions of smart space. The core feature of all these definitions is for smart space to be a "merger of physical and digital spaces". ...

Evolution of a Reactive Environment.
  • Citing Conference Paper
  • January 1995

... The sound of interior construction, washing machines, dishwashers, fans, airplanes, cars; coffee-shop background music; subway announcements - To filter out surrounding noise and deliver only the user's voice - To be notified of surrounding noises being broadcast - To easily notice and be excused by other participants for the noise 4. Usability of the videoconferencing platform (10) Distracting experiences created by technological, functional, systematic, and visual features of the videoconference tool 4.1 Low quality in audio output (8) Providing low-quality audio output by default 3,9,16,17), inhibited their ability to concentrate on the meeting (P3, 6), and made them miss the meeting's contents (P6, 9). ...

Turning Your Video Monitor into a Virtual Window
  • Citing Article
  • January 1998