Real-time scenegraph creation and manipulation in an immersive environment using an iPhone
Virtual reality (VR) display systems have undergone significant research and development since their introduction. Early systems used a head-mounted display to let users view a virtual environment. With the development of the CAVE Automatic Virtual Environment (CAVE™), which used multiple projectors and display surfaces, users gained a three-dimensional (3D) sense of depth and immersion in the synthetic environment without bulky headwear.
One of the key challenges in creating VR environments is creating and manipulating the 3D models that make up immersive scenes. Traditionally, these models and scenes have been created on a desktop computer with a two-dimensional (2D) display. Although such systems have seen widespread adoption throughout academia and industry, they have significant drawbacks: when creating 3D models, understanding model size and the spatial relationships between models is critical, and both are difficult to perceive on a 2D display.
Another important challenge is controlling applications running in an immersive environment. Devices such as gamepads and wands are small and lightweight, making them easy to carry inside an immersive environment, but they require users to remember which behavior is tied to each physical button. Other devices, such as Tablet PCs, overcome this limitation by offering a rich user interface, at the expense of being larger and usually requiring two hands to operate. Early handheld devices, such as PDAs, were investigated for use in immersive environments and provided a graphical interface in a small device, but were limited by low-resolution screens and weak hardware.
This thesis presents a two-part solution to these issues: a VR application, known as iSceneBuilder, and a controlling iPhone application. Built using VR Juggler and OpenSceneGraph, iSceneBuilder allows users to create and manipulate a scenegraph, a common data structure for managing a 3D scene. A custom animation engine smoothly animates changes to the scene, helping users understand how changes are being applied. iSceneBuilder was designed for concurrency, allowing it to run effectively on a large computer cluster and take advantage of multiple processing cores.
The iPhone application, which communicates with iSceneBuilder over a TCP/IP socket, gives users a means of controlling the immersive environment. Built using Cocoa Touch, the application offers a rich user interface on a small handheld device that, thanks to the iPhone's capacitive touch screen, requires no additional hardware to operate. The application allows users to browse the remote filesystem to load models into the immersive application. It also displays the scenegraph, allowing users to select a node to manipulate; available manipulations include translation, rotation, scaling, and changing a node's transparency. Additionally, users can navigate inside the immersive environment using the iPhone's built-in accelerometer.
Several uses for this system were demonstrated by creating two new scenes of varying complexity. Both scenes were constructed inside an immersive environment, allowing users to immediately perceive the size of models and their spatial relationships to other models. The first use case involved loading several models, then moving and rotating them into their final locations; the completed scene was saved as a single file that can be used in other applications. The second involved creating several smaller scenes, then combining them into a larger scene, taking advantage of iSceneBuilder's ability to manipulate components inside a larger scenegraph. Finally, this system shows promise for future development into an application that can support engineering design work.