Alternate View Column AV-42
Keywords: virtual reality, computing, telepresence, simulation, human interface
Published in the November-1990 issue of Analog Science Fiction & Fact Magazine.
This column was written and submitted 4/10/90 and is copyrighted ©1990, John G. Cramer. All rights reserved.
No part may be reproduced in any form without the explicit permission of the author.
Last Saturday I made my first journey into virtual reality. I walked with giant strides around a city called Seattle. I leaped the Columbia Center, the tallest building in the city, with a single bound. I dove beneath the surface of Puget Sound and watched a pod of whales heading north toward Canada. I hovered above the Space Needle, then dropped inside to enjoy its panoramic view and to examine its structural details. I raced a Washington State ferry across the Sound from Seattle to Bremerton. I caught up with it just as it approached the Bremerton dock and was able to watch the docking operation from a perch on the roof of the bridge. Then I flew back across the Sound and peered down into the Kingdome, looking in vain for a winning team. Finally, I pointed my magic glove across the Sound to the Olympic Peninsula, flew there, and examined a group of towering snow-capped peaks from all angles. I did all of this without leaving a small building on the campus of the University of Washington.
No, I wasn't in some testing laboratory for mind-altering drugs. I was in the new Human Interface Technology Laboratory (HITL) headed by Prof. Thomas A. Furness, III, and Dr. Robert Jacobson. I was experiencing one of their newly created "virtual environments" called Virtual Seattle. I was wearing "Eyephones®" and a Data Glove and was interacting directly with several hundred thousand dollars worth of computer hardware ... But let me start at the beginning.
My Alternate View column in last September's Analog ["Telepresence: Reach Out and Grab Someone"] was about telepresence, the emerging technology that will soon allow us to electronically connect our senses, particularly the senses of sight and touch, to some distant location and to participate in an activity that may be happening on the other side of the planet, on the bottom of the ocean, or even in low-earth orbit. Many wonderful possibilities will emerge as this technology develops and becomes economically accessible to all of us. My AV column described some of them. Since that column was written, however, telepresence has become even more important. A NASA committee recently added up the number of EVA hours that will be required to maintain the space station and found roughly ten times too many hours to be feasible. Telepresence-controlled EVA may be NASA's only hope for making the present space station design viable.
The technological twin of telepresence, virtual reality, offers an equally wonderful spectrum of possibilities. If telepresence can put us into direct contact with a larger segment of the real world, virtual reality can put us into similar contact with worlds that never were, that might have been, or that can be created to order. The hardware interface to the user's senses is the same used in telepresence. But the sensory stimuli come not from the real world but from a "virtual" world that exists only in the logic circuits, programming, and data storage of a computer. If telepresence will allow us to more fully explore the real world, virtual reality could allow us to explore Barsoom, Oz, Wonderland, and Middle Earth.
Instead of waxing ecstatic about the possibilities of virtual reality, however, let me describe in some detail the equipment that I used, its operation, and the results. Let's start with the Data Glove, which I've already mentioned in the telepresence article. The Data Glove gives the user a way of interacting with the virtual world he sees. A computer-drawn representation of a disembodied hand is usually a part of the user's view of the virtual world. When he moves his Gloved hand or fingers, he can see the virtual hand and fingers move in the same way. He can sometimes grasp virtual objects and move them from one place to another. By pointing with his hand in a certain direction he can make his virtual persona "fly" in that direction.
The Data Glove that I used is a commercial product. It was made of thin black nylon. "Tendons" of optical fibers were attached along the top of each of my fingers. The fibers made a loop near each finger tip and doubled back to the control box from which they came, joining a bundle of such fibers that ran from the glove to the gray rack-mounted control chassis. Light from a light-emitting diode in the control unit was carried on a round trip up the fiber, around the loop, and back to a phototransistor in the same control unit. Each fiber had been deliberately surface-damaged at flex-points passing over the finger joints. Bending a finger caused a bend in the fiber, which then leaked light from the region where the surface had been abraded. This reduced the light transmission of the fiber, telling the control unit that a finger was bent rather than straight.
The Data Glove's control unit is connected to a fast computer. A computer model of the human hand is used to interpret the fiber-optic signals in terms of finger movements. The system could tell whether a given finger of my hand was extended or closed or whether my thumb was extended or bent toward my palm.
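For readers who program, the bend-sensing idea can be sketched in a few lines. This is a hypothetical illustration, not VPL's actual algorithm: it assumes a simple linear relation between a fiber's light transmission and its joint's bend angle, with per-fiber calibration values measured when the finger is straight and fully curled. The function name and the linear model are my own invention.

```python
# Hypothetical sketch of the Data Glove's bend sensing: each optical fiber's
# light transmission drops as the finger joint it crosses bends. Given a
# per-fiber calibration (transmission when straight vs. fully bent), a joint
# angle can be estimated by linear interpolation. The linear model is an
# illustrative assumption, not the real device's calibration.

def joint_angle(transmission, t_straight, t_bent, max_angle=90.0):
    """Estimate a joint's bend angle (degrees) from fiber light transmission."""
    # Normalize: 1.0 = straight (full transmission), 0.0 = fully bent.
    frac = (transmission - t_bent) / (t_straight - t_bent)
    frac = min(max(frac, 0.0), 1.0)  # clamp against sensor noise
    return (1.0 - frac) * max_angle

# Example: a fiber that reads 0.9 when the finger is straight and 0.3 when
# it is fully curled, currently reading 0.6 -- the joint is half bent.
angle = joint_angle(0.6, t_straight=0.9, t_bent=0.3)  # 45 degrees
```

In practice a real glove would apply a measured, nonlinear calibration curve per fiber, but the principle is the same: light loss at the abraded flex-point encodes the bend.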
The virtual reality simulation also used the location of my Data Gloved hand, as measured with a pair of plastic-encased triple-axis coils. These receiver coils, attached to the back of the Glove, are oriented in three perpendicular spatial directions. A nearby three-axis transmitter coil is pulsed, one axis at a time, and the response of each receiver coil to each transmitter coil provides nine independent signal-strength measurements. This information is used to deduce the receiver-to-transmitter distance and direction and the spatial orientation of the receiver coils. The virtual reality simulation computers were provided with rapidly updated information on the location and orientation of my hand and my finger positions.
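The nine-measurement scheme can be illustrated with a deliberately simplified toy model. Assume, purely for illustration, that the 3x3 coupling matrix between transmitter and receiver axes factors as a range-dependent scale times the receiver's rotation matrix; the real magnetic-dipole field equations are more involved, but the recovery of range from overall signal strength (which falls off with distance) and of orientation from the matrix shape follows the same logic. All names and constants here are my assumptions.

```python
# Toy illustration of the nine-measurement idea: pulse each transmitter axis
# in turn and record the response on each of the three receiver axes, giving
# a 3x3 coupling matrix. In this simplified model the coupling is
# C = (K / r**3) * R, where r is range and R the receiver's rotation matrix --
# a stand-in for the true magnetic-dipole field equations.
import numpy as np

K = 1.0  # assumed field-strength calibration constant

def recover_pose(coupling):
    """Recover range and orientation from the 3x3 coupling matrix."""
    # The Frobenius norm of C is (K / r**3) * sqrt(3), since a rotation
    # matrix has Frobenius norm sqrt(3).
    scale = np.linalg.norm(coupling) / np.sqrt(3.0)  # equals K / r**3
    r = (K / scale) ** (1.0 / 3.0)
    rotation = coupling / scale                      # undo the range scaling
    return r, rotation

# Simulate a receiver 2 units away, rotated 90 degrees about the z axis.
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
C = (K / 2.0**3) * R_true
r, R = recover_pose(C)  # r -> 2.0, R -> R_true
```

A real tracker must also solve for the direction from transmitter to receiver and cope with field distortion from nearby metal, which is why the actual computation uses all nine measurements rather than just the matrix norm.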
A similar set of coils was used to monitor the position and orientation of my head. This head-position measurement was a part of the Eyephones®, a binocular-vision video unit that I wore. It presented each of my eyes with a separate back-lighted color LCD display similar to those in miniature Watchman® personal color television receivers. The pixel density of these units was not high and the refresh rate of the pictures was slow enough to be a bit jerky, but the effect they created was sufficient for the illusion that I was immersed in a complex three-dimensional environment of large and colorful solid objects.
The pictures in the Eyephones were generated by two large Silicon Graphics® display processors, very fast graphics-oriented computers running at top speed to keep the two Eyephone displays current. As I turned my head, or crouched down, or stood on tiptoes, the head sensor informed the computer of this and the views presented to my eyes changed accordingly. The changing displays as my head moved, together with the stylized yellow representation of the Data Glove hanging in space at the position of my real hand with no visible means of support, somehow convinced my brain's vision centers that this was the Real Thing. The "feel" of reality was uncanny. It must be experienced to be appreciated.
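The essential trick behind the paired displays can be sketched briefly. From a single tracked head pose, the two eye positions are offset in opposite directions along the head's local left-right axis, and the scene is rendered once for each. This is a minimal sketch under my own assumptions; the function names and the 6.5 cm eye separation are illustrative, not the HITL system's actual values.

```python
# Minimal sketch of head-tracked stereo: from one tracked head pose, derive
# two eye positions separated by the interocular distance along the head's
# local x axis, then render the scene once per eye. The 6.5 cm separation
# is an assumed typical value.
import numpy as np

IOD = 0.065  # assumed interocular distance in meters

def eye_positions(head_pos, head_rotation):
    """Return (left_eye, right_eye) world positions from the head pose."""
    right_axis = head_rotation @ np.array([1.0, 0.0, 0.0])  # head's local x
    left = head_pos - 0.5 * IOD * right_axis
    right = head_pos + 0.5 * IOD * right_axis
    return left, right

# Head at the origin, looking straight ahead (identity rotation):
left, right = eye_positions(np.zeros(3), np.eye(3))
# left is approximately [-0.0325, 0, 0]; right is approximately [0.0325, 0, 0]
```

The small difference between the two rendered views is what gives the brain its depth cue, and updating both views from the head tracker at every frame is what sustains the illusion of standing inside the scene.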
I had been participating in a special demonstration at the HITL laboratories arranged partly with computer equipment borrowed from VPL, Inc., after a week-long conference in Seattle on computer visualization. The staff and graduate students of the HITL had prepared several virtual environments in the months before the demonstration. One, as I mentioned, was a virtual representation of Seattle. It was not a detailed model. Downtown Seattle was represented by a rough representation of 2nd Avenue, with large red and orange parallelepipeds representing the office buildings on each side. This schematic landscape was populated with a few prominent Seattle landmarks, notably the Space Needle, the Kingdome, and the Washington State Ferry Terminal, complete with Sound-crossing Ferry. Puget Sound was represented by a flat blue sheet, and when I ventured below its surface by crouching down, I could see the two spindle-shaped "whales" that moved from south to north. The lower parts of the "buildings" and the "mountains" could also be seen below the plane of the Sound.
A second virtual environment created by the HITL was an underwater world. Originally it had been intended to connect to the Virtual Seattle environment as the region below Puget Sound, but it had not yet been attached. It contained rocks, a shipwreck, a school of fish, an octopus, and a treasure chest complete with a bag of gold. I used the Data Glove to reach into the chest, grab the bag of gold, and lift it out. Another environment featured "statues" of giant figures in colorful folk-costumes.
After ten minutes of very active exploration of virtual worlds, I had to return to the real one and remove the Eyephones and Data Glove so the next "virtual explorer" could have her turn. It was rather disappointing to return to the real world, even if it does have superior image resolution. The colors are less bright. The world seems too cluttered. I could no longer point in the direction I wished to move and fly there. I could only walk or drive, and at such a low altitude that the view was terrible. I was no longer able to walk through walls to see what was on the other side. I had to find the doorways. I was reduced to a state of limited control that did not feel completely comfortable or secure. I've heard that the VPL engineers find virtual reality entertaining and even habit-forming. Now I understand why. We may be in for a generation of long-haired emaciated "virties" and "phone-heads" who have abandoned the real world for a virtual one.
Of what use is virtual reality? My brief experience reminds me of my first introduction to a microcomputer, back in the 1970s. Clearly a technological revolution was brewing, but I wasn't enough of a visionary to foresee the full spectrum of applications implicit in the little box. I saw the possibilities for computer games and word processing, but I completely missed spreadsheets and desktop publishing and AutoCAD® and MIDI interfaces for computer-generated music.
Virtual reality may bring a similar revolution. There will surely be a sizable demand for virtual-reality-based computer games. But I think that will not be a big part of the ultimate market (although you may some day have to pry the Eyephones and Data Suit away from your kids when you want to use the family VR set). I think virtual reality, when it becomes cheap enough for every home, is going to change the way we learn, the way we organize information, the way we work, and even the way we think.
It's clear that the potential for education is enormous. Imagine the Professor of Medicine conducting his students on a tour of a 100:1 scaleup of the human circulatory or nervous system. Imagine an architect showing his clients around a virtual model of the building he has just designed, allowing them to experience the layout and decor of the rooms, to try out various furniture arrangements, and to examine the view from the windows. Imagine being able to walk around on the Martian landscape, exploring a VR environment based on new data from NASA's latest radar-mapping Mars probe.
Or consider the effect on children. The right-brain/left-brain psychobabble that we hear these days in part concerns the relative use of the vision-processing centers of the brain and the speech-processing centers. With a keyboard-and-screen word processor like the Mac SE I'm using to write this, the speech centers are dominant. But with activities organized around virtual reality, the vision and kinetic centers of the brain will play the dominant role. I suspect that children who grow up learning from and interacting with the VR set instead of the TV set will have different skills and better integrated personalities. They should be active explorers, not passive observers.
There's a story that Queen Victoria once paid a visit to the laboratory of Michael Faraday, a pioneer of physics who made many of the key discoveries of 19th century electricity and magnetism. Faraday was the finest experimental physicist of his day. He proudly conducted his Queen around the laboratory, demonstrating many amazing things for her with his coils and Leyden jars and Wimshurst machines and disk dynamos.
"But what," asked the Queen finally, "is the use of all this, Mr. Faraday?"
Faraday, clearly disappointed, studied the Queen. "Madam," he asked quietly, "of what use is a baby?"
John G. Cramer's 2016 nonfiction book (Amazon gives it 5 stars) describing his transactional interpretation of quantum mechanics, The Quantum Handshake - Entanglement, Nonlocality, and Transactions, (Springer, January-2016) is available online as a hardcover or eBook at: http://www.springer.com/gp/book/9783319246406 or https://www.amazon.com/dp/3319246402.
SF Novels by John Cramer: Printed editions of John's hard SF novels Twistor and Einstein's Bridge are available from Amazon at https://www.amazon.com/Twistor-John-Cramer/dp/048680450X and https://www.amazon.com/EINSTEINS-BRIDGE-H-John-Cramer/dp/0380975106. His new novel, Fermi's Question may be coming soon.
Alternate View Columns Online: Electronic reprints of 212 or more "The Alternate View" columns by John G. Cramer published in Analog between 1984 and the present are currently available online at: http://www.npl.washington.edu/av .
References:

Mind Children, Hans Moravec, Harvard University Press, Boston (1989).

Data Glove and Eyephones:
"Interfaces for Advanced Computing," James D. Foley, Scientific American 257, #4, pp. 126-135, (Oct. 1987).