
Augmented Reality (AR) is coming quickly to your pocket. This year’s developer conferences for Google, Facebook, and Apple all placed a heavy focus on AR and VR technology; it is a safe assumption that augmented reality will be the next “mobile.”

At the Apple Worldwide Developers Conference (WWDC) in June 2017, Apple introduced ARKit, a software development kit that enables developers to turn iPhones and iPads into augmented reality devices without the burden of implementing a custom Simultaneous Localization and Mapping (SLAM) solution. In other words, Pokémon Go is just the beginning. There are approximately 300,000 iOS app developers in the United States market alone, and putting easy AR development in their hands will have a major impact on the Apple platform.

Pulse Design Group is focused on the comprehensive field of immersive computing, from Virtual Reality to Augmented Reality and Mixed Reality. We started building with the toolkit immediately to investigate potential use cases and to get a better understanding of feasibility and timelines. There are great possibilities for Augmented Reality on iOS devices, with remarkable use cases in training simulation and architectural applications. The following is one of our early proofs of concept, along with a brief rundown of the process.

[GIF: SteveSquat.gif]

In this gif you can see how the iPad is able to move freely around the room while presenting content that appears to persist in the virtual space. Apple accomplishes this through what it calls “visual-inertial odometry”: blending and cross-referencing input from the camera with input from the device’s internal motion sensors.

The current hardware and software requirements are any iPhone or iPad with an A9 processor or better, running iOS 11. We installed the update and got started! We used the Unity engine for this project due to its current support for ARKit. Below are the techie details of how we implemented one of our models in an ARKit project.

A current version of Unity needs to be installed to begin development for ARKit. The link to download can be found here.

After installing Unity, create a new project, then download the ARKit plugin.

Now the fun begins!

Initially, we took one of our existing architectural models of an operating room and imported the .fbx file from Autodesk 3ds Max into one of the example scenes provided with the ARKit plugin linked above, and it worked immediately! One difficult thing to account for is how ARKit will read the real-world space; it is therefore necessary to package your project, export it to a device, and run the software on compatible hardware, because it is hard to get an exact sense of what is happening in the app from the Unity viewport/simulator alone. The ARKit plugin for Unity performs most of its magic in relation to the camera viewport, which doesn’t fully come across in the in-editor viewport.

At this point, we built and ran the program on the device and found that our model was inhabiting a surface plane that ARKit had detected and was using properly! However, this was not enough, and we quickly realized that we needed to implement some sort of simple control scheme to navigate the model. We concluded that the best method was the tried-and-true “pinch to zoom” and “tap to place” control scheme. To do this, we implemented a C# library called “LeanTouch,” which allowed us to set up some of this behavior in Unity. One issue we ran into was separating a one-finger tap from a two-finger pinch: users typically do not remove both pinching fingers from the screen at exactly the same time, leaving a brief moment where one finger is still down. Unity registered that moment as a one-finger tap and moved the model to the last registered tap position. We are currently getting around this by requiring three fingers for tap-to-place, so it will not interfere with the pinch-to-scale implementation.
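Our actual implementation uses LeanTouch’s components, but a minimal sketch of the same control scheme written against Unity’s built-in Input API looks roughly like this. The class name and sensitivity value are illustrative, and it assumes the plugin’s detected planes carry colliders so a raycast can land on them:

```csharp
using UnityEngine;

// Illustrative sketch of the control scheme described above: two-finger
// pinch to scale, three-finger tap to place. Not our production code.
public class PlacementControls : MonoBehaviour
{
    [SerializeField] private Transform model;             // the architectural model
    [SerializeField] private float pinchSensitivity = 0.005f;

    void Update()
    {
        if (Input.touchCount == 2)
        {
            // Pinch to scale: compare finger spacing this frame vs. last frame.
            Touch a = Input.GetTouch(0);
            Touch b = Input.GetTouch(1);

            float current  = Vector2.Distance(a.position, b.position);
            float previous = Vector2.Distance(a.position - a.deltaPosition,
                                              b.position - b.deltaPosition);

            float delta = (current - previous) * pinchSensitivity;
            model.localScale = Vector3.Max(model.localScale + Vector3.one * delta,
                                           Vector3.one * 0.01f); // keep scale positive
        }
        else if (Input.touchCount == 3 && Input.GetTouch(2).phase == TouchPhase.Began)
        {
            // Requiring three fingers means a lingering pinch finger can no
            // longer be misread as a placement tap.
            Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(2).position);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit))
            {
                model.position = hit.point; // assumes detected planes have colliders
            }
        }
    }
}
```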

Special note time: Pulse Design Group has a branded RGB shade of green that is used for all promotional marketing and other visual elements. By simply finding and changing the RGB value of the particle emission system, we were able to customize the look of the application to fit our company’s branding.
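For context, the change amounts to a few lines of C#. This is a hedged sketch; the class name is made up and the RGB value is a placeholder, not our actual brand green:

```csharp
using UnityEngine;

// Illustrative component that recolors a particle system at startup.
[RequireComponent(typeof(ParticleSystem))]
public class BrandParticleColor : MonoBehaviour
{
    void Start()
    {
        // The main module exposes the start color of emitted particles.
        var main = GetComponent<ParticleSystem>().main;

        // Placeholder green; substitute the real brand RGB (0-1 range per channel).
        main.startColor = new Color(0f, 168f / 255f, 89f / 255f);
    }
}
```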

An ARKit demo that features a flyover map application.

The result is an impressive demo for showing and introducing ARKit to our clients. Additionally, we are able to showcase our work at 1:1 scale, or in a smaller-scale dollhouse view, as long as we are using an iOS device!
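Switching between those two views can be as simple as the following sketch; the component name and the dollhouse factor are illustrative assumptions:

```csharp
using UnityEngine;

// Hypothetical helper for switching between the two presentation modes
// mentioned above: full 1:1 scale and a shrunken "dollhouse" view.
public class ScaleModeToggle : MonoBehaviour
{
    [SerializeField] private Transform model;              // the showcased model
    [SerializeField] private float dollhouseScale = 0.02f; // assumed: 1 m -> 2 cm

    private bool isDollhouse;

    // Hook this up to a UI button or a gesture callback.
    public void Toggle()
    {
        isDollhouse = !isDollhouse;
        model.localScale = isDollhouse ? Vector3.one * dollhouseScale : Vector3.one;
    }
}
```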

At Pulse Design Group, we will continue to improve the ARKit demo, add functionality and interaction, investigate the best possible use cases for the software, and build internal concepts. ARKit is very promising and appears to make AR development much easier than it was just three months ago.


To learn more or to further discuss AR technology, contact Andrew London at alondon@pulsedesigngroup.com or Steve Biegun at sbiegun@pulsedesigngroup.com.

 
