
Autostereoscopic and haptic visualization for space exploration and mission design

Bibliographic Details
Main Authors: Basdogan, C., Lum, M., Salcedo, J., Chow, E.
Format: Conference Proceeding
Language: English
Online Access: Request full text
Description
Summary: We have developed a multi-modal virtual environment set-up by fusing visual and haptic images through the use of a new autostereoscopic display and a force-feedback haptic device. Most of the earlier visualization systems that integrate stereo displays and haptic devices have utilized polarized or shutter glasses for stereo vision (see, for example, Veldkamp et al., 1998, Chen et al., 1998, and Brederson et al., 2000). In this paper, we discuss the development stages and components of our set-up, which allows a user to touch, feel, and manipulate virtual objects through a haptic device while seeing them in stereo without using any special eyewear. We also discuss the transformations involved in mapping the absolute coordinates of virtual objects into visual and haptic workspaces and the synchronization of cursor movements in these workspaces. Future applications of this work will include a) multi-modal visualization of planetary data and b) planning of space mission operations in virtual environments.
DOI: 10.1109/HAPTIC.2002.998968
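
The abstract mentions transformations that map the absolute (world) coordinates of virtual objects into separate visual and haptic workspaces while keeping cursor movements synchronized. The paper's actual transformations are not given in this record; the following is only a minimal illustrative sketch of the general idea, using hypothetical workspace bounds and simple scale-and-offset affine maps.

```python
# Illustrative sketch only (not the paper's implementation): map a point in a
# shared "world" frame into a visual workspace (display units) and a haptic
# workspace (device units) with per-axis affine transforms, so one cursor
# position stays consistent across both. All bounds below are hypothetical.
import numpy as np

def make_mapping(world_min, world_max, ws_min, ws_max):
    """Return an affine map world -> workspace as (scale, offset) vectors."""
    world_min = np.asarray(world_min, float)
    world_max = np.asarray(world_max, float)
    ws_min = np.asarray(ws_min, float)
    ws_max = np.asarray(ws_max, float)
    scale = (ws_max - ws_min) / (world_max - world_min)
    offset = ws_min - scale * world_min
    return scale, offset

def to_workspace(point, scale, offset):
    """Apply the affine map to a world-frame point."""
    return scale * np.asarray(point, float) + offset

# Hypothetical world volume containing the virtual objects.
world_min, world_max = [-1, -1, -1], [1, 1, 1]

# Hypothetical visual workspace (pixels + depth) and haptic workspace (mm).
vis = make_mapping(world_min, world_max, [0, 0, 0], [1024, 768, 500])
hap = make_mapping(world_min, world_max, [-80, -60, -40], [80, 60, 40])

# One world-frame cursor position rendered consistently in both workspaces.
cursor_world = [0.5, 0.0, -0.25]
cursor_visual = to_workspace(cursor_world, *vis)  # -> [768., 384., 187.5]
cursor_haptic = to_workspace(cursor_world, *hap)  # -> [40., 0., -10.]
```

Because both mappings start from the same world-frame point, updating `cursor_world` from the haptic device and re-projecting through each map keeps the graphical and haptic cursors synchronized, which is the kind of coupling the abstract describes.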