Pulse Design Group’s virtual reality development team is constantly researching new and innovative approaches to creating quality, dynamic content. One recent focus has been animation. The current method of producing animation is an often inefficient, step-by-step workflow known as “keyframing”: an animator poses a model, then records a keyframe (a point of reference) for that pose, and the software interpolates the frames in between. This method can be tedious for complex, realistic animations. Keyframing has been the industry standard for decades, dating back to the film Snow White, which was produced with a hand-drawn form of keyframing. Technology is now advancing toward real-time capture of an actor’s performance, or “mocap.” Mocap is a huge asset in the animation industry because it streamlines the workflow, produces realistic, high-quality results, and can run in real time with engines like Unreal or Unity3d.
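The keyframing idea described above can be sketched in a few lines: the animator records poses at specific times, and the software interpolates between them. The function names and the tiny two-joint "pose" below are illustrative only, not taken from any particular animation package.

```python
def lerp(a, b, t):
    """Linear interpolation between values a and b at parameter t in [0, 1]."""
    return a + (b - a) * t

def sample_pose(keyframes, time):
    """Return the interpolated pose at `time` from a time-sorted list of
    (time, pose) keyframes, where a pose maps joint name -> angle in degrees."""
    if time <= keyframes[0][0]:
        return keyframes[0][1]
    if time >= keyframes[-1][0]:
        return keyframes[-1][1]
    for (t0, p0), (t1, p1) in zip(keyframes, keyframes[1:]):
        if t0 <= time <= t1:
            t = (time - t0) / (t1 - t0)
            return {joint: lerp(p0[joint], p1[joint], t) for joint in p0}

# Two keyframes one second apart; the in-between frames are computed, not posed.
keys = [
    (0.0, {"shoulder": 0.0, "elbow": 0.0}),
    (1.0, {"shoulder": 45.0, "elbow": 90.0}),
]
print(sample_pose(keys, 0.5))  # halfway pose: {'shoulder': 22.5, 'elbow': 45.0}
```

Every pose the animator wants at a specific moment must be placed by hand, which is why the workflow grows tedious as motions get more complex; mocap replaces those hand-placed keys with recorded performance data.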
Several tools can be used to get started with mocap, one of which is the HTC Vive. A program called IKinema can track the Vive's headset and controllers to capture an actor’s performance. There are some limitations, as the only tracked points are the actor’s hands and head, but this tracking method produces smooth captured movements. Motion can also be captured using a Microsoft Kinect. The Kinect is a combination of several sensors: a wide-angle time-of-flight camera, an infrared sensor and emitter, a 1080p video camera, and a microphone. The device is capable of tracking the user's heart rate, facial expressions, the weight placed on each limb, and the position and orientation of 25 individual joints. Unlike the IKinema setup with the Vive's default configuration, the Kinect is able to track the user's feet in real time. The Vive can do this as well, but only with additional trackers.
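To make the difference concrete, a full-body sensor delivers a per-frame skeleton rather than just hand and head points. The sketch below is an illustrative data structure, not the Kinect SDK's actual API; the joint names and fields are assumptions that mirror the kind of data such sensors expose.

```python
from dataclasses import dataclass

@dataclass
class Joint:
    name: str
    position: tuple   # (x, y, z) in meters, sensor space
    tracked: bool     # False when the sensor only inferred the joint

@dataclass
class SkeletonFrame:
    timestamp: float  # seconds since capture start
    joints: dict      # joint name -> Joint

def feet_tracked(frame):
    """True when both feet have a confident track -- the capability that
    distinguishes a full-body sensor from a headset-and-controllers setup."""
    return all(frame.joints[j].tracked for j in ("FootLeft", "FootRight"))

# One frame with a handful of the skeleton's joints filled in.
frame = SkeletonFrame(
    timestamp=0.033,
    joints={
        "Head": Joint("Head", (0.0, 1.7, 2.0), True),
        "FootLeft": Joint("FootLeft", (-0.1, 0.0, 2.0), True),
        "FootRight": Joint("FootRight", (0.1, 0.0, 2.0), True),
    },
)
print(feet_tracked(frame))  # True
```

A headset-and-controllers rig would populate only three of these joints, leaving the lower body to be guessed by inverse kinematics.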
Through research and innovative design, we can now see how the full body integrates into virtual reality using a Microsoft Kinect. With this early development, we have the ability to bring this type of motion capture into our existing workflow. After setting up the environment, we were able to get a version of mocap running! The video below shows the working result.
Post-processing will include smoothing the movements and reducing jitter in real time to enhance the realism of the animation. Realistic character movement in virtual reality applications is a crucial step toward delivering a wider array of immersive experiences, and this exciting development will lead to further research and design.
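One common way to reduce jitter is to run each joint coordinate through a simple exponential moving average. This is a minimal sketch of that idea; the smoothing factor below is illustrative, and production pipelines often use adaptive filters rather than a fixed constant.

```python
def smooth(samples, alpha=0.3):
    """Exponentially smooth a sequence of scalar samples.
    Lower alpha -> smoother but laggier output; higher alpha tracks the
    raw signal more closely but passes through more jitter."""
    out = [samples[0]]
    for s in samples[1:]:
        out.append(alpha * s + (1 - alpha) * out[-1])
    return out

# A jittery joint coordinate (meters) over five frames, then its smoothed form.
noisy = [0.00, 0.10, -0.05, 0.12, 0.02]
print([round(v, 3) for v in smooth(noisy)])
```

The trade-off is latency: the stronger the smoothing, the more the animated character lags the live performer, which matters for real-time capture.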
For more information or questions, please contact Steve Biegun or Andrew London.