Ian E. McDowall’s research while affiliated with MPI Research and other places

Publications (38)


Open virtual reality
  • Conference Paper

March 2013 · 77 Reads · 10 Citations

Mark Bolas · [...]

The ICT Mixed Reality Lab is leveraging an open source philosophy to influence and disrupt industry. Projects spun out of the lab's efforts include the VR2GO smartphone-based viewer, the inVerse tablet-based viewer, the Socket HMD reference design, the Oculus Rift and the Project Holodeck gaming platforms, a repurposed FOV2GO design with Nokia Lumia phones for a 3D user interface course at Columbia University, and the EventLab's Socket-based HMD at the University of Barcelona. A subset of these will be demonstrated. This open approach is providing low-cost yet surprisingly compelling immersive experiences.


Recording and controlling the 4D light field in a microscope using microlens arrays

September 2009 · 393 Reads · 270 Citations

Journal of Microscopy

By inserting a microlens array at the intermediate image plane of an optical microscope, one can record four-dimensional light fields of biological specimens in a single snapshot. Unlike a conventional photograph, light fields permit manipulation of viewpoint and focus after the snapshot has been taken, subject to the resolution of the camera and the diffraction limit of the optical system. By inserting a second microlens array and video projector into the microscope's illumination path, one can control the incident light field falling on the specimen in a similar way. In this paper, we describe a prototype system we have built that implements these ideas, and we demonstrate two applications for it: simulating exotic microscope illumination modalities and correcting for optical aberrations digitally.
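
The post-capture refocusing described above amounts to resampling the recorded 4D light field. As a rough illustration, the sketch below performs shift-and-add synthetic refocusing over a stack of angular sub-images; the array layout, the alpha depth parameter, and the integer-pixel shifts are simplifying assumptions for illustration, not the resampling used in the paper.

```python
import numpy as np

def refocus(light_field, alpha):
    """Synthetic refocusing of a 4D light field by shift-and-add.

    light_field : array of shape (U, V, S, T) -- angular samples (u, v)
                  behind each microlens and spatial samples (s, t).
    alpha       : relative focal depth; 1.0 reproduces the nominal focal
                  plane, other values refocus above or below it.
    """
    U, V, S, T = light_field.shape
    u0, v0 = (U - 1) / 2.0, (V - 1) / 2.0
    out = np.zeros((S, T), dtype=np.float64)
    for u in range(U):
        for v in range(V):
            # Shift each angular view in proportion to its angle,
            # then average all views together.
            du = (u - u0) * (1.0 - 1.0 / alpha)
            dv = (v - v0) * (1.0 - 1.0 / alpha)
            out += np.roll(light_field[u, v],
                           shift=(int(round(du)), int(round(dv))),
                           axis=(0, 1))
    return out / (U * V)
```

Fractional shifts with proper interpolation would be needed for production-quality results; the loop above only conveys the shift-and-average structure of synthetic refocusing.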


HeadSPIN: a one-to-many 3D video teleconferencing system

August 2009 · 35 Reads · 11 Citations

When people communicate in person, numerous cues of attention, eye contact, and gaze direction provide important additional channels of information, making in-person meetings more efficient and effective than telephone conversations and 2D teleconferences. Two-dimensional video teleconferencing precludes the impression of accurate eye contact: when a participant looks into the camera, everyone seeing their video stream sees the participant looking toward them; when the participant looks away from the camera (for example, toward other participants in the meeting), no one sees the participant looking at them. In this work, we develop a one-to-many teleconferencing system which uses 3D acquisition, transmission, and display technologies to achieve accurate reproduction of gaze and eye contact. In this system, the face of a single remote participant is scanned at interactive rates using structured light while the participant watches a large 2D screen showing an angularly correct view of the audience. The scanned participant's geometry is then shown on the 3D display to the audience.
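
For readers unfamiliar with structured-light scanning, the sketch below decodes a stack of Gray-code pattern photographs into per-pixel projector correspondences, which is one standard way such a scanner recovers geometry. The paper's real-time pipeline may use different patterns and timing, so treat the function and its inputs as illustrative.

```python
import numpy as np

def decode_gray_code(captures, threshold=0.5):
    """Decode Gray-code structured-light captures into projector columns.

    captures : array of shape (N, H, W), one normalized image per bit,
               most significant bit first.
    Returns an (H, W) array of projector column indices per camera pixel.
    """
    N, H, W = captures.shape
    bits = (captures > threshold).astype(np.uint32)
    # Gray code to binary: b[0] = g[0], b[i] = b[i-1] XOR g[i].
    binary = np.zeros_like(bits)
    binary[0] = bits[0]
    for i in range(1, N):
        binary[i] = np.bitwise_xor(binary[i - 1], bits[i])
    # Pack the bit planes into an integer index per pixel, MSB first.
    cols = np.zeros((H, W), dtype=np.uint32)
    for i in range(N):
        cols = (cols << 1) | binary[i]
    return cols
```

Given camera-to-projector correspondences like these, depth follows from triangulating each pixel against the calibrated projector, which is the geometry that feeds the 3D display.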


Achieving Eye Contact in a One-to-Many 3D Video Teleconferencing System

July 2009 · 158 Reads · 191 Citations

ACM Transactions on Graphics

We present a set of algorithms and an associated display system capable of producing correctly rendered eye contact between a three-dimensionally transmitted remote participant and a group of observers in a 3D teleconferencing system. The participant's face is scanned in 3D at 30Hz and transmitted in real time to an autostereoscopic horizontal-parallax 3D display, displaying him or her over more than a 180° field of view observable to multiple observers. To render the geometry with correct perspective, we create a fast vertex shader based on a 6D lookup table for projecting 3D scene vertices to a range of subject angles, heights, and distances. We generalize the projection mathematics to arbitrarily shaped display surfaces, which allows us to employ a curved concave display surface to focus the high speed imagery to individual observers. To achieve two-way eye contact, we capture 2D video from a cross-polarized camera reflected to the position of the virtual participant's eyes, and display this 2D video feed on a large screen in front of the real participant, replicating the viewpoint of their virtual self. To achieve correct vertical perspective, we further leverage this image to track the position of each audience member's eyes, allowing the 3D display to render correct vertical perspective for each of the viewers around the device. The result is a one-to-many 3D teleconferencing system able to reproduce the effects of gaze, attention, and eye contact generally missing in traditional teleconferencing systems.
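
The 6D lookup table in the vertex shader is, in essence, a precomputed map from a scene vertex and a viewer configuration to display coordinates. The sketch below shows that idea in CPU-side form: precompute projections over a discretized 6D domain and query by nearest neighbor. The project callback, the choice of axes, and the nearest-neighbor lookup are placeholders for illustration, not the paper's actual projection mathematics or shader.

```python
import itertools
import numpy as np

def build_projection_lut(project, xs, ys, zs, angles, heights, dists):
    """Precompute display coordinates over a discretized 6D domain:
    vertex position (x, y, z) and viewer (angle, height, distance).
    `project` stands in for the display's projection function."""
    lut = np.zeros((len(xs), len(ys), len(zs),
                    len(angles), len(heights), len(dists), 2))
    for i, j, k, a, h, d in itertools.product(
            range(len(xs)), range(len(ys)), range(len(zs)),
            range(len(angles)), range(len(heights)), range(len(dists))):
        lut[i, j, k, a, h, d] = project(xs[i], ys[j], zs[k],
                                        angles[a], heights[h], dists[d])
    return lut

def lookup_nearest(lut, axes, query):
    """Nearest-neighbor query; a real shader would interpolate instead."""
    idx = tuple(int(np.abs(ax - q).argmin()) for ax, q in zip(axes, query))
    return lut[idx]
```

In the actual system such a table would live in GPU memory and be sampled per vertex; the sketch only conveys the precompute-then-look-up structure that makes the projection fast enough for thousands of frames per second.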


Imaging and Display Applications using Fast Light

February 2009 · 27 Reads

Proceedings of SPIE - The International Society for Optical Engineering

The unique qualities of the TI DLP devices have enabled a number of interesting applications. The DLP is essentially a fast binary light modulator, and, using the power of modern graphics processors, these devices can be driven with images computed on the fly at rates of several thousand frames per second. A number of these applications have been developed by the University of Southern California, where fast light is exploited to create a light field display. In another application, fast light is coupled with a synchronized high-speed camera to extract the 3D shape of an object in real time.
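
A common way to drive a binary modulator such as a DMD with grayscale content is bit-plane decomposition: split each frame into binary planes and show each plane for a time proportional to its bit weight. The sketch below shows that decomposition; the bit depth and timing weights are illustrative, and the applications above use their own, more specialized frame encodings.

```python
import numpy as np

def to_binary_frames(gray_image, n_bits=8):
    """Decompose an n-bit grayscale frame into binary bit planes.

    A binary modulator can display each plane for a time proportional
    to its bit weight (1, 2, 4, ... time units), so the time-averaged
    output reproduces the grayscale values.
    """
    img = gray_image.astype(np.uint16)
    planes = [((img >> b) & 1).astype(np.uint8) for b in range(n_bits)]
    weights = [2 ** b for b in range(n_bits)]  # relative display times
    return planes, weights

# Sanity check: sum(plane * weight) over all planes recovers the image.
```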




An Interactive 360° Light Field Display

August 2007 · 175 Reads · 56 Citations

While a great deal of computer-generated imagery is modeled and rendered in 3D, the vast majority of this 3D imagery is shown on 2D displays. Various forms of 3D displays have been contemplated and constructed for at least one hundred years [Lippmann 1908], but only recent evolutions in digital capture, computation, and display have made functional and practical 3D displays possible.


Rendering for an interactive 360° light field display

July 2007 · 230 Reads · 411 Citations

ACM Transactions on Graphics

We describe a set of rendering techniques for an autostereoscopic light field display able to present interactive 3D graphics to multiple simultaneous viewers 360 degrees around the display. The display consists of a high-speed video projector, a spinning mirror covered by a holographic diffuser, and FPGA circuitry to decode specially rendered DVI video signals. The display uses a standard programmable graphics card to render over 5,000 images per second of interactive 3D graphics, projecting 360-degree views with 1.25 degree separation up to 20 updates per second. We describe the system's projection geometry and its calibration process, and we present a multiple-center-of-projection rendering technique for creating perspective-correct images from arbitrary viewpoints around the display. Our projection technique allows correct vertical perspective and parallax to be rendered for any height and distance when these parameters are known, and we demonstrate this effect with interactive raster graphics using a tracking system to measure the viewer's height and distance. We further apply our projection technique to the display of photographed light fields with accurate horizontal and vertical parallax. We conclude with a discussion of the display's visual accommodation performance and discuss techniques for displaying color imagery.
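
The quoted numbers fit together as simple arithmetic: 360° at 1.25° separation is 288 distinct views per revolution, and refreshing all of them up to 20 times per second requires on the order of 5,760 rendered images per second, consistent with the "over 5,000 images per second" figure. The sketch below does that bookkeeping and maps a frame index to a nominal mirror angle; the angle mapping is an assumption for illustration, not the display's actual synchronization scheme.

```python
def view_schedule(separation_deg=1.25, updates_per_sec=20):
    """Frame budget for a spinning-mirror light field display.

    360 / 1.25 = 288 distinct views per revolution; refreshing them all
    20 times a second needs 288 * 20 = 5760 rendered images per second.
    """
    views_per_rev = int(round(360.0 / separation_deg))
    frames_per_sec = views_per_rev * updates_per_sec
    return views_per_rev, frames_per_sec

def frame_to_angle(frame_index, separation_deg=1.25):
    """Map a projector frame index to the mirror angle (degrees) at which
    it should be shown -- an illustrative one-frame-per-slot mapping that
    ignores the real display's synchronization details."""
    return (frame_index * separation_deg) % 360.0
```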


Stereoscopic Displays and Virtual Reality Systems XIV

January 2007 · 235 Reads · 5 Citations

Proceedings of SPIE - The International Society for Optical Engineering

The papers included in this volume were part of the technical conference cited on the cover and title page. Papers were selected and subject to review by the editors and conference program committee. Some conference presentations may not be available for publication. The papers published in these proceedings reflect the work and thoughts of the authors and are published herein as submitted. The publishers are not responsible for the validity of the information or for any outcomes resulting from reliance thereon.


Citations (32)


... [102] demonstrated a similar system over a wide area network but achieved only limited resolution and frame rate with the technology of the day. University of Southern California used a technically demanding set-up with 3D face scanning and an autostereoscopic 3D display to generate multiple 'face tracked' viewpoints [157]. This had the disadvantage of displaying a disembodied head. ...

Reference:

Telethrone: a situated display using retro-reflection based multi-view toward remote collaboration in small dynamic groups
HeadSPIN: a one-to-many 3D video teleconferencing system
  • Citing Article
  • January 2009

... University of Southern California used a technically demanding real-time set-up with 3D face scanning and an autostereoscopic 3D display to generate multiple 'face tracked' viewpoints [308]. This had the disadvantage of displaying a disembodied head. ...

HeadSPIN: a one-to-many 3D video teleconferencing system
  • Citing Conference Paper
  • August 2009

... Firstly, 50 applications from the Oculus Store (Meta, 2023c) were chosen due to the historical significance of the Oculus tethered PC VR HMD devices, such as the Oculus Rift, prior to the shift from Oculus/Meta to standalone HMDs in 2019. Sideloading is possible with the majority of consumer-level HMDs, with recent Google Ventures investment in the space (Siegler, 2022) showcasing wider industry interest in the practice, therefore warranting the inclusion of 50 applications from the largest independent sideloading and early access VR application platform SideQuest (2023). Finally, although representing only a small portion of the VR application market, and therefore with only 30 applications included, the largest VR application subscription service Viveport Infinity from the HTC Corporation (2023) was included to ensure complete coverage of all the ways PC VR and standalone VR applications can be accessed. ...

Open virtual reality
  • Citing Conference Paper
  • March 2013

... On the other hand, latencies and tracking errors tend to provoke odd, uneasy sensations through conflicting visual stimulation and postural feedback, and adaptation can lead to reciprocal aftereffects. To release the user from the encumbrance of wearing an HMD, a device named the Binocular Omni-Orientation Monitor (BOOM) has been designed [5]. Here the observer regards the stereo images as if looking through a pair of binoculars. ...

Proliferation of counterbalanced, CRT-based stereoscopic displays for virtual environment viewing and control
  • Citing Article
  • April 1994

Proceedings of SPIE - The International Society for Optical Engineering

... Greuel et al., therefore, undertook the first research with IVR. This article explores using music to control object behaviors and create visually active immersive virtual environments [24]. The article by Bell and Smith [25] presented a behavior mapping strategy to measure the impact of spatial changes in special care units, using an Alzheimer unit as an example, to improve quality of care, and Shepley and Wilson [26] used behavior mapping to gather information about the design of a new HIV/AIDS skilled nursing facility. ...

Sculpting 3D worlds with music: advanced texturing techniques
  • Citing Article
  • April 1996

Proceedings of SPIE - The International Society for Optical Engineering

... Pictured in Figure 3 is a soldier reviewing the inside of a new concept design in a CAVE while wearing a special glove that provides haptic (touch) control of virtual objects. For additional CAVE information see [18]. [19]) is pictured in Figure 4 (being used by a commercial aircraft review team). It is a rigid, flat, vertical, high-resolution 3D video and audio environment. ...

Virtual model displays
  • Citing Article
  • May 1997

Proceedings of SPIE - The International Society for Optical Engineering

... Virtual reality systems have been implemented in the past, with what is widely considered the first virtual reality head-mounted display (HMD) being created in 1968 [1]. Since then, various virtual reality systems have been developed and demonstrated for a wide variety of applications, including medical imaging and minimally invasive surgery [2]. Consumer-oriented VR devices have been available since the 1990s with little commercial success. ...

Computer animation for minimally invasive surgery: computer system requirements and preferred implementations
  • Citing Article
  • April 1994

Proceedings of SPIE - The International Society for Optical Engineering

... Finally, this paper describes rapid prototyping of devices, with foundations in the design of 3D devices [1]. It also encompasses the reasoning behind the use of different materials for devices, which has similarities to the design of some haptic devices [7]. ...

Interaction Devices for Hands-On Desktop Design
  • Citing Article
  • Full-text available
  • May 2003

Proceedings of SPIE - The International Society for Optical Engineering