Interactive Immersive VR-Environment


When visualizing complex product structures, buildings, or engineering designs, Virtual Reality (VR) is still the best way to present the concepts to a broader audience for discussion during the design phase.

Products such as the VR-Wall excel at presenting 3D models to groups of people, at the cost of missing first-person immersion in the scene. Observers perceive depth in the image, yet still recognize it as framed (similar to looking out of a window). Most tracking solutions can detect the movements of a single viewer and adjust the viewport accordingly for further immersion, but the 180-degree plane of the wall still restricts the viewing angles.

Head-mounted displays (HMDs) are best suited for fully immersing users in the scene. Due to recent advances in display and sensor technologies, HMDs are now produced for a broader audience and at a lower price than previous solutions. High-resolution accelerometers, magnetometers, and optical tracking components make it possible to calculate the head position in the virtual environment and thus offer a natural way of interacting with the model data. Additional sensors for hand and body detection give users the ability to manipulate scene settings and pick specific parts of a model.


Phase One

The first phase of this research project is aimed at exploring the combination of state-of-the-art technologies and software to build a highly immersive VR-Environment with integrated hand and finger detection for natural interactions:

Oculus Rift

(Figure: Oculus Rift DK2 assembly with tracking camera. Source: Oculus)

The Oculus Rift is a cost-efficient HMD, originally aimed at game developers and gamers. In its current iteration, Development Kit 2 (DK2), it features a 75 Hz, 1920×1080 pixel, low-persistence AMOLED display. Instead of using two mounted monitors, optical lenses split the image of this single larger screen between the eyes to create depth perception with a 100° field of view.

By combining the display with inertial sensors and an additional IR camera, it is possible to track the user's head position and rotation. This data can then be used to calculate the camera position inside a virtual environment.
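The underlying computation can be sketched in a few lines: the tracked orientation (a quaternion from the fused inertial sensors) and the position from the IR camera are combined into a virtual camera pose. This is a minimal illustration, not the Oculus SDK's actual API; the `neck_offset` value is a hypothetical eye offset.

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    # Rodrigues-style expansion: v' = v + 2*u x (w*v + u x v)
    return v + 2.0 * np.cross(u, w * v + np.cross(u, v))

def camera_pose(head_pos, head_quat,
                neck_offset=np.array([0.0, 0.075, 0.08])):
    """Combine tracked head position and orientation into a camera pose.

    head_pos:    position from the IR camera (metres, tracker frame)
    head_quat:   orientation from the fused inertial sensors (w, x, y, z)
    neck_offset: hypothetical offset of the eyes from the tracked point
    """
    eye = head_pos + quat_rotate(head_quat, neck_offset)
    return eye, head_quat
```

Rotating the eye offset by the head orientation (rather than adding it directly) is what makes the virtual camera pivot naturally when the user tilts their head.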

Leap Motion

(Figure: Leap Motion mounted on an Oculus Rift. Source: LeapMotion)


The Leap Motion is a hand-tracking device that can be mounted on the front of the Oculus Rift. It uses two infrared cameras in a stereo configuration, combined with infrared LEDs, to calculate the position and rotation of each hand and finger.

It also features passthrough of the camera images, enabling Augmented Virtuality (AV) features (e.g., overlaying the Leap's camera images with hand models or virtual objects). Using the collision detection algorithms of traditional 3D engines, it is possible to interact with entities in the simulated 3D space or to select entries in a floating menu.
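The menu-selection case reduces to a simple proximity test between a tracked fingertip and the floating buttons. The sketch below is a hedged stand-in for an engine's collision query; the touch `radius` and the function name are illustrative assumptions, not part of any SDK.

```python
import numpy as np

def pick_menu_entry(fingertip, entries, radius=0.02):
    """Return the index of the menu entry the fingertip touches, or None.

    fingertip: 3D fingertip position from the hand tracker (metres)
    entries:   list of 3D centre positions of floating menu buttons
    radius:    hypothetical touch radius of each button (metres)
    """
    for i, centre in enumerate(entries):
        # Sphere test: fingertip inside the button's touch radius?
        if np.linalg.norm(fingertip - centre) <= radius:
            return i
    return None
```

In a real engine this query would be handled by the physics system's trigger volumes, but the logic is the same: map a tracked point into scene space and test it against interactive geometry.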

(Figure: Leap Motion hand interaction with particles)

Sensor Fusion

Using multiple low-cost sensors increases tracking diversity. While the Leap Motion is the most flexible hand-tracking solution available, it offers little beyond additional lower-arm tracking derived from inverse kinematics. Augmenting the Leap's data with the rotational joint information supplied by Microsoft's Kinect 2 system extends this to full-body tracking.
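A very simple form of such fusion is a confidence-weighted blend of the same joint as reported by both trackers. This is a deliberately simplified sketch, not the project's actual fusion method; it assumes both positions have already been transformed into a common coordinate frame.

```python
import numpy as np

def fuse_joint(leap_pos, leap_conf, kinect_pos, kinect_conf):
    """Confidence-weighted blend of one joint position from two trackers.

    leap_pos / kinect_pos:   3D joint positions in a shared frame
    leap_conf / kinect_conf: per-sensor confidence weights (>= 0)
    Returns None when neither sensor sees the joint.
    """
    total = leap_conf + kinect_conf
    if total == 0.0:
        return None
    # Weighted average: the more confident sensor dominates the estimate.
    return (leap_conf * np.asarray(leap_pos, dtype=float)
            + kinect_conf * np.asarray(kinect_pos, dtype=float)) / total
```

The benefit of the combination is complementary coverage: the Leap delivers precise fingers at short range, while the Kinect covers the rest of the body, so each joint's weight can reflect which sensor is actually reliable for it.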


Instead of using only positional joint data, which ties limb positions to a similarly proportioned skeleton in virtual reality, rotational data enables animating skeletons with a different topology for virtual avatars, if needed.
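Why rotations retarget across skeletons can be seen in a minimal forward-kinematics sketch: the same chain of per-joint rotations produces consistent poses for bones of arbitrary length. This is an illustrative toy (planar rotations only), not the project's animation pipeline.

```python
import numpy as np

def rot_z(angle):
    """Rotation matrix about the z-axis (radians)."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def forward_kinematics(joint_rotations, bone_lengths):
    """Chain per-joint rotations along bones of arbitrary length.

    Because only rotations are transferred, the same joint_rotations
    animate skeletons with different proportions: each avatar supplies
    its own bone_lengths.
    """
    pos = np.zeros(3)
    rot = np.eye(3)
    positions = [pos.copy()]
    for R, length in zip(joint_rotations, bone_lengths):
        rot = rot @ R                                # accumulate rotation
        pos = pos + rot @ np.array([length, 0.0, 0.0])  # advance along bone
        positions.append(pos.copy())
    return positions
```

Feeding the identical rotation list with doubled bone lengths yields the same pose at twice the scale, which is exactly the property that lets a tracked human drive an avatar with different proportions.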

Phase Two

Phase two of the project evaluates the use of game engines for virtual reality applications in construction and engineering. As current project data should be reused, Building Information Modeling (BIM) is a key factor in creating realistic scenarios of buildings and construction sites. While proprietary solutions exist for the conversion, we favor open-source projects.

Additionally, georeferencing solutions and further interaction elements are studied.

Unreal Engine 4

The prototype importer for Unreal Engine 4 uses the open-source BIMServer as a central hub for project organization and polls geometry data when needed. The import uses a binary geometry format for faster transmission of geometry data and is available in the Unreal Engine editor.
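To illustrate why a binary geometry format transmits faster than textual IFC geometry, the sketch below parses a hypothetical binary message with Python's `struct` module. The layout (vertex count, index count, float32 triples, uint32 indices, little-endian) is an assumption for illustration only; the actual BIMServer protocol differs.

```python
import struct

def parse_binary_geometry(blob):
    """Parse a hypothetical binary geometry message.

    Assumed layout (little-endian):
      uint32 vertex count, uint32 index count,
      then float32 (x, y, z) triples, then uint32 triangle indices.
    """
    n_verts, n_idx = struct.unpack_from("<II", blob, 0)
    offset = 8
    verts = struct.unpack_from("<%df" % (3 * n_verts), blob, offset)
    offset += 12 * n_verts  # 3 floats * 4 bytes per vertex
    indices = struct.unpack_from("<%dI" % n_idx, blob, offset)
    return verts, indices
```

Fixed-width binary fields can be copied straight into engine vertex buffers, whereas textual geometry must be tokenized and converted number by number.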

(Figure 6)


A native wrapper for Unity was developed in C#, using the IFC Engine DLL to parse IFC files exported from common planning and geometry software. This allows importing IFC data while the current scenario is running, enabling a more dynamic approach to level building.

(Figure: IFC import in Unity)


A material database of associative elements inside the scene allows templating materials onto specific IFC element types, such as assigning a glass texture to IfcWindow elements.
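Conceptually, such templating is a lookup from IFC entity type to an engine material. The table below is a hedged sketch; the material names and the fallback are illustrative assumptions, not the project's actual database, and only the IFC type names (e.g. IfcWindow) come from the IFC standard.

```python
# Hypothetical material template table: maps IFC entity types to engine
# material names. The names on the right are illustrative placeholders.
MATERIAL_TEMPLATES = {
    "IfcWindow": "Glass_Clear",
    "IfcWall": "Concrete_Rough",
    "IfcDoor": "Wood_Oak",
}

def assign_material(ifc_type, default="Default_Grey"):
    """Look up the engine material templated for an IFC element type,
    falling back to a default material for untemplated types."""
    return MATERIAL_TEMPLATES.get(ifc_type, default)
```

Keeping the mapping data-driven means planners can restyle an imported building (e.g. swap all wall materials) without touching the importer code.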

Phase Three

In phase three, different scenarios are evaluated for their feasibility in engineering and construction.

Project Data

Title: Interactive Immersive VR-Environment

Type: Internal project

Researcher: Thomas Hilfert, M.Sc.

Chair of Computing in Engineering
Bldg. IC, Room 6-59  
Universitätsstraße 150 
44780 Bochum