"Three-dimensional sound cues and speech-synthesis technologies were used to enhance the operator's overall situational awareness of the virtual data environment. "
VIEWlab HMD electronics system


Demonstration of the VIEW system


VIEW <1985>

The interactive Virtual Interface Environment Workstation (VIEW) was developed as a new kind of media-based display and control environment that is closely matched to human sensory and cognitive capabilities.

I proceeded with designs to grow the prototype HMD (VIVED) into an interactive, multisensory "system" (VIEW) that could be used as a "generic" user interface for a range of space station applications. This included designing the specifications for a dataglove input device based on Tom Zimmerman's sensor invention and negotiating the contract with VPL (their first) to build it for us. I ordered a head-tracker from Polhemus like the one we had used at MIT and Atari, started working with Beth Wenzel to add 3D sound and speech I/O to the system, and began looking for other hackers who could help put this all together, including Mark Bolas, Scott Foster, Steve Bryson, and Warren Robinett.

Advanced data display and manipulation concepts for information management were also developed with the VIEW system technology. These efforts included using the system to create a display environment in which data manipulation and system monitoring tasks were organized in virtual display space around the operator. Through speech and gesture interaction with the virtual display, the operator could rapidly call up or delete information windows and reposition them in 3-space. The system could also display reconfigurable, virtual control panels that responded to glove-like tactile input devices worn by the operator.