It's difficult to truly understand virtual reality without putting on a headset. Mixed reality video, which combines physical and virtual image capture, is the most effective way to communicate the VR experience to someone who isn't wearing one.

Pulse Design Group utilizes virtual reality as a design and communication tool. Visualizing an environment early in the design process allows users to understand the space and make real-time design decisions.

Emily Grotenhuis, Marketing Coordinator at Pulse Design Group, making her acting debut.

This video demonstrates how a user can walk through a virtual environment and manipulate objects in real-time with realistic image quality. The audience is able to visualize the space around the user and fully understand the environment. Producing the video required careful coordination of physical and virtual elements, along with an effective programming solution. Disclaimer: we are about to dive into some crazy technical information.


Credit: The Verge

HTC recently announced the Vive Tracker, a new device to be used in conjunction with the HTC Vive. With this tracker, users are able to track real physical objects in VR. By mounting the device on a real object (yes, that includes cats and children), users can increase the level of immersion and realism. The Vive Tracker will be released in late 2017 and will be available for $100 on the Vive website.

Pulse Design Group has had the opportunity to use this device to explore new possibilities in VR, not just for mixed reality video, but also for healthcare training simulation and architectural applications. The possibilities for this device extend far beyond gaming and entertainment.

Hooray, we're halfway there!

Using a 3D-printed hot shoe to 1/4" screw adapter, I mounted the Vive Tracker to a DSLR camera. In Unreal Engine, our chosen VR development platform, I then set the Vive Tracker's position and orientation to correspond to the real physical camera's position and orientation. Simple enough, right? I used to think so.

Let's talk about what happens in Unreal Engine (version 4.15.1) to get the desired result. Unreal Engine does not offer official support for mixed reality development (as of March 2017), although Epic plans to integrate this functionality in a later version of the engine. Unity does have a better workflow for mixed reality video production, but we use Unreal Engine for its photo-realistic capabilities and accessibility. Compared to what will eventually be possible, our solution is complicated but effective.

Caution: Unreal Engine mumbo-jumbo below.

The Vive Tracker integrates seamlessly into UE4. After pairing the device and activating it in SteamVR, I added a 3D model for the tracker to my player character. In blueprints, I set that tracker object's location and orientation to those of the tracked device (by default, our Vive Tracker registered as device ID 5). The simple solution at this point would be to attach a camera to that tracked device and designate its view as the second player's view. Unfortunately, a parented camera will still inherit the position and orientation of the Vive head-mounted display. Did that make sense? Because now it gets weird.

"If necessity is the mother of invention, then laziness is the father." I think desperation is a distant uncle.

"If necessity is the mother of invention, then laziness is the father." I think desperation is a distant uncle.

Instead, I parented a "Scene Capture Component" to the Vive Tracker. That component updates a texture on tick (90 times per second), which is then displayed on a cube outside the scene. That "canvas" cube is attached to a camera designated as the second player view. Because that camera still inherits the position and orientation of the first player's Vive headset, the canvas follows the camera wherever it goes, so the second player view always shows the constantly updating capture. The end result is that the first player can walk through and manipulate the VR environment with full functionality while a second person moves the Vive Tracker around that space. Complex, but effective.
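If you prefer code to node graphs, here is a minimal C++ sketch of that capture-to-canvas idea. We built ours entirely in blueprints (more on that below), so treat this as illustrative only: the render target, canvas material, and the "CaptureTexture" parameter name are placeholders, not pieces of our actual project.

```cpp
// Minimal sketch of the capture-to-canvas setup (our project builds this in blueprints).
// The render target, canvas material, and "CaptureTexture" parameter are placeholder assets.
#include "Components/SceneCaptureComponent2D.h"
#include "Components/StaticMeshComponent.h"
#include "Engine/TextureRenderTarget2D.h"
#include "Materials/MaterialInterface.h"
#include "Materials/MaterialInstanceDynamic.h"

void SetUpCaptureCanvas(USceneCaptureComponent2D* SceneCapture,
                        UStaticMeshComponent* CanvasCube,
                        UTextureRenderTarget2D* RenderTarget,
                        UMaterialInterface* CanvasMaterial)
{
    // The capture component (parented to the tracker) renders the scene into the render
    // target every frame, so the canvas updates at the same rate as the tracker pose.
    SceneCapture->TextureTarget = RenderTarget;
    SceneCapture->bCaptureEveryFrame = true;

    // The "canvas" cube sits outside the scene and simply displays that render target;
    // the second player's camera stares at the canvas, so player 2 sees the tracker's view.
    UMaterialInstanceDynamic* CanvasMID =
        UMaterialInstanceDynamic::Create(CanvasMaterial, CanvasCube);
    CanvasMID->SetTextureParameterValue(TEXT("CaptureTexture"), RenderTarget);
    CanvasCube->SetMaterial(0, CanvasMID);
}
```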

Blueprints

Let's walk through some of the blueprints we used to accomplish this goal. Below, you can see the nodes we set up to call and update the tracker position and orientation in the player controller. This happens "on tick" (90 times per second).
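For readers who would rather see it as text, the same tick logic looks roughly like the sketch below. It is a hedged equivalent rather than our actual setup: we do this in the player controller blueprint, while the sketch puts it in a character class for brevity, requires the SteamVR plugin, and uses placeholder class and component names.

```cpp
// Sketch of a C++ equivalent to the blueprint tick logic; our project does this in
// blueprints, and the class/component names here are illustrative placeholders.
// Requires the SteamVR plugin for USteamVRFunctionLibrary.
#include "GameFramework/Character.h"
#include "Components/StaticMeshComponent.h"
#include "SteamVRFunctionLibrary.h"
#include "MixedRealityCharacter.generated.h"

UCLASS()
class AMixedRealityCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    // 3D model representing the Vive Tracker; the Scene Capture Component is a child of this.
    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* TrackerMesh;

    AMixedRealityCharacter()
    {
        PrimaryActorTick.bCanEverTick = true;
        TrackerMesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("TrackerMesh"));
        TrackerMesh->SetupAttachment(RootComponent);
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        FVector Location;
        FRotator Rotation;

        // Our Vive Tracker registered as device ID 5 by default; check SteamVR for yours.
        if (USteamVRFunctionLibrary::GetTrackedDevicePositionAndOrientation(5, Location, Rotation))
        {
            // Tracked poses are reported relative to the tracking origin,
            // so apply them as the relative transform of the tracker mesh.
            TrackerMesh->SetRelativeLocationAndRotation(Location, Rotation);
        }
    }
};
```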

Remember that the "Scene Capture Component" must be a child of the tracker in the player controller. To account for the distance between the camera sensor and the base of the tracker, you may need to adjust the relative position of the child component (we moved our capture component down 7 cm).
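In code, that parenting and offset would look something like the constructor fragment below; the 7 cm value is just what our particular mount required, so measure your own rig.

```cpp
// Constructor fragment (sketch): parent the capture component to the tracker mesh, then
// drop it 7 cm so the virtual lens lines up with the DSLR sensor instead of the tracker base.
SceneCapture = CreateDefaultSubobject<USceneCaptureComponent2D>(TEXT("SceneCapture"));
SceneCapture->SetupAttachment(TrackerMesh);
SceneCapture->SetRelativeLocation(FVector(0.f, 0.f, -7.f)); // Unreal units are centimeters
```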

To the right, you see the hierarchy for the "player 2" camera object. The "projected image" object contains the texture being captured from the above "scene capture component." In order to avoid light spill or edge artifacts, I added a solid black backdrop. Be sure to remove post-process effects from the camera in this blueprint. If you want depth of field, motion blur, or other cinematic effects, consider adding these to the previously mentioned capture component.
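A rough code equivalent of that split: zero out post-processing on the "player 2" camera and put any cinematic effects on the capture component instead. The depth-of-field value below is only an example, not a setting from our project.

```cpp
// Sketch: keep the player 2 camera "clean" and put cinematic effects on the capture component.
PlayerTwoCamera->PostProcessBlendWeight = 0.0f;  // no post-processing on the canvas camera

SceneCapture->PostProcessBlendWeight = 1.0f;
SceneCapture->PostProcessSettings.bOverride_DepthOfFieldFocalDistance = true;
SceneCapture->PostProcessSettings.DepthOfFieldFocalDistance = 200.0f;  // ~2 m; tune to the real lens
```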

After setting the player count to two and starting the simulation, we configured Open Broadcaster Software (OBS) to capture the second player view while recording the physical image with our DSLR camera.

Phew.


Now that we had the ability to capture and synchronize both the virtual and physical images, the rest of the pieces began to fall into place. We had the opportunity to use the green screen studio at nearby FinditKC, a video and media production studio. We took care to match the lighting, camera field of view, shutter speed, and focus to their Unreal Engine equivalents. After recording both physical and virtual footage, I composited the different elements in Adobe After Effects to create the final product.

So what's next?

At the upcoming AIA Conference on Architecture 2017, we're leading a session called "Virtual and Augmented Reality as Impactful Communication Tools." We'll showcase this video in our presentation before inviting audience members to try the headset for themselves. We are confident that this mixed reality video will provide a better understanding of a VR experience without actually using a headset. Hopefully, Unreal Engine will soon have official support for mixed reality video production, which would enable architecture firms to easily create mixed reality architectural visualization applications.

We're also very excited about opportunities to use the Vive Tracker. We plan on using this device to augment healthcare simulations and applications. Over the next few months, we expect to hear about a wide array of uses for this device well beyond gaming. If you are interested in reading more about the Vive Tracker, RoadtoVR demonstrated several different applications for the device. If you are developing VR applications in Unity, Vive has released a helpful guide for getting started with integrating the device into your project.

If you would like to hear more and discuss the topic, contact Steve Biegun at sbiegun@pulsedesigngroup.com.