Cardinal Health worked hand-in-hand with Pulse Design Group to launch a new Virtual Reality customer experience for Cardinal Health’s Inventory Management Sales Solution. This exciting new tool is specifically designed to increase sales, shorten the sales cycle, and further position Cardinal Health as an innovative leader in the healthcare industry.
Augmented Reality (AR) is coming quickly to your pocket. This year’s developer conferences for Google, Facebook, and Apple all had a very heavy focus on AR and VR technology; it is a safe assumption that augmented reality will be the next “mobile.”
At the Apple Worldwide Developers Conference (WWDC) 2017 in June, Apple introduced a new software development kit that enables developers to turn iPhones and iPads into augmented reality devices without the burden of implementing a custom Simultaneous Localization and Mapping (SLAM) solution. In other words, Pokemon Go is just the beginning. There are approximately 300,000 iOS app developers in the United States market alone, so making augmented reality easy to build for could have an enormous effect on the Apple platform.
Pulse Design Group is focused on the comprehensive field of immersive computing, from Virtual Reality to Augmented Reality and Mixed Reality. We started building with the toolkit immediately to investigate potential use cases and to get a better understanding of feasibility and timelines. There are great possibilities for Augmented Reality on iOS devices, and remarkable use cases for training simulation and architectural applications. The following is one of our early proofs of concept, along with a brief rundown of the process.
In this gif you can see how the iPad is able to move freely around the room while presenting content that appears to persist in the virtual space. Apple accomplishes this through what they call "Visual-Inertial Odometry," which blends and cross-references input from the camera with input from the device's internal motion sensors.
The current hardware and software requirement is any iPhone or iPad with an A9 processor or better, and iOS 11. We installed the update and got started! We used the Unity development engine in this project due to its current support for ARKit. Below are the techie details for how we implemented one of our models into an ARKit project.
A current version of Unity needed to be installed to begin development for ARKit. The link to download can be found here.
After installing Unity, create a new project, then download the plugin for ARKit.
Now the fun begins!
Initially, we took one of our existing architectural models of an operating room and imported the .fbx file from Autodesk 3DS Max into one of the example ARKit scenes provided in the asset store link above, and it worked immediately! One difficult thing to account for is how ARKit will read real-world space; you need to package and export your project to compatible hardware and run it on the device itself. It is hard to get an exact sense of what is happening in the app from the Unity viewport/simulator alone. The ARKit plugin for Unity performs most of its magic in relation to the camera viewport, which doesn't exactly come across in the in-editor viewport.
At this point, we built and ran the program on the device and found that our model was inhabiting a surface plane that ARKit had detected and was tracking properly! However, this was not enough, and we quickly realized that we needed a simple control scheme for navigating the model. We concluded that the best method was the tried and true "pinch to zoom" and "tap to place" control scheme. To do this, we implemented a C# library called "LeanTouch," which allowed us to set some of this behavior in Unity. One issue we ran into was separating a "one finger tap" from a "two finger pinch." Users typically do not lift both pinching fingers from the screen at the exact same time, leaving a brief moment where one finger is still down; Unity registers that as a "one finger tap" and moves the model to the last registered tap location. We are currently getting around this by requiring three fingers for tap to place, so it will not interfere with the pinch to scale implementation.
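Our actual implementation uses the LeanTouch C# library inside Unity, but the disambiguation rule itself is simple enough to sketch on its own. This illustrative Python sketch (not our production code; the function name and the per-frame sampling scheme are invented for the example) classifies a finished gesture by the peak number of simultaneous touches it ever reached:

```python
# Illustrative sketch of the gesture rule: a two-finger contact at any point
# marks the gesture as a pinch, and only a three-finger contact triggers
# "tap to place," so a sloppy pinch release (one finger lifting late) can
# never be misread as a placement tap.

def classify_gesture(touch_counts):
    """touch_counts: simultaneous-touch count sampled each frame of a gesture."""
    peak = max(touch_counts)
    if peak >= 3:
        return "tap_to_place"
    if peak == 2:
        return "pinch_to_scale"
    return "ignore"  # a lone finger does nothing, so stray touches are harmless

# A user ends a pinch by lifting one finger slightly before the other:
print(classify_gesture([2, 2, 2, 1]))  # pinch_to_scale, not a tap
print(classify_gesture([1, 2, 3, 3]))  # tap_to_place
print(classify_gesture([1, 1]))        # ignore
```

The key design choice is classifying by the gesture's peak touch count rather than its final one, which is exactly why the stray-single-finger moment at the end of a pinch stops causing accidental placements.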
Special note time: Pulse Design Group has a branded RGB shade of green that is used for all promotional marketing and other visual elements. By simply finding and changing the RGB value of the particle emission system, we were effectively able to customize the look of the application to fit our company’s branding.
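Unity's Color type expects each channel as a float from 0 to 1 rather than the 0-255 values designers usually work with, so the only real work in that re-branding step is a conversion. A minimal sketch (the green value below is a stand-in for illustration, not our actual brand color):

```python
def to_unity_color(r, g, b):
    """Convert 8-bit RGB (0-255) channels to the 0-1 floats Unity's Color expects."""
    return tuple(round(c / 255.0, 4) for c in (r, g, b))

# Hypothetical brand green, as you'd paste it into a particle system's color field:
print(to_unity_color(0, 168, 89))  # (0.0, 0.6588, 0.349)
```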
The result is an impressive demo to show and introduce ARKit to our clients. Additionally, we are able to showcase our work in a 1:1 scale, or in a smaller scale dollhouse view, as long as we are using an iOS device!
At Pulse Design Group, we will continue to improve the ARKit demo, add functionality and interaction, investigate the best possible use-case for the software, and build internal concepts. ARKit is very promising and appears to make AR development much easier than it was just three months ago.
It's difficult to really understand what virtual reality is like without putting a headset on your face. Mixed reality video, a combination of physical and virtual image capture, is probably the best means to explain what a VR experience is like without the use of a headset.
Let's talk about what that means and how we're utilizing exciting new technology like the Vive Tracker to improve communication with our audience.
Medical equipment planning is the strategic placement of equipment in healthcare facilities, from hand sanitizer dispensers to MRI machines. Planning and coordination of medical equipment can be one of the biggest challenges of healthcare design.
In this post, we discuss some of the ways that we utilize medical equipment planning to benefit our clients and to coordinate with teams to properly equip medical facilities.
Pulse Design Group's upcoming presentation at the AIA Conference on Architecture 2017, "Virtual and Augmented Reality as Impactful Communication Tools," was recently listed as one of the hottest "can't miss" sessions at this year's conference.
I am standing in the auditorium of an old abandoned German hospital. I can see dust across the room as it falls through a beam of light coming through a window. Above me, I can see the rafters of the abandoned architecture as if they were actually thirty feet above me. Scuff marks, upended tiles, and piles of dirt across the floor indicate that the building has been uninhabited for several decades. An old abandoned piano is situated in the center of the room. What I see around me is unequivocally real. Well, as real as photographic reality capture can get. The program created by Realities.io hosts recreations of several different environments from across the globe.
Simply put, "photogrammetry" is the science of making measurements from photographs. Over the past few years, photogrammetry has become a popular method for recreating real objects, locations, and people as 3D models. Recognizing that the lines between real and virtually real blur more and more every day, we set out to improve our photogrammetry capabilities to allow us to take real elements into less-than-real environments. Realistically captured elements could be used in architecture to show complicated designs in virtual reality or to preserve past creations as you would with a photograph.
Before discussing how we went through this process, let's talk about drawbacks. The photo capture process can take a significant amount of time. Some photos may need to be retouched or removed entirely from the series in order for the final photographic recreation to be realistic. Additionally, the 3D mesh constructed by the process can have significant errors, artifacts, and glitches. Some objects are easier to capture than others; reflective surfaces, people, and objects shot against moving backgrounds are notoriously difficult to recreate. That being said, the software used to stitch these objects together is rapidly improving.
Our goal was to use photogrammetry to recreate myself in VR as a 3D object that could be used as a stand-in character for realistic simulations. Rather than using mannequins, cut-out character silhouettes, or stylistic video game-y assets, we thought that realistically captured 3D figures would have a greater impact.
Don't worry, it gets weirder.
In this article by Tested, the crew assembles a series of flat lights that provide clean, even, and balanced lighting on the model's face. For reality capture, you want your subject to be lit as flat as possible; lights, shadows, and reflections are all calculated later in the VR simulation program (Unreal Engine, Unity, CryEngine, etc.).
Avoid reflections, bright lights, and moving elements. Any sort of movement in the scene can cause issues with the photo calculation. Basically, avoid everything that we did in our first attempt:
In our first attempt, Callum captured close to 200 pictures from varying angles around the room. The most difficult part of this attempt was getting me to keep my arms parallel to the floor for the duration of the shoot - a much more difficult feat than I had imagined. I did not initially understand how difficult it can be to remain perfectly still for several minutes. Optimally, the person being captured should not blink, talk, or turn their head. Breathing is acceptable though not encouraged. The model that was stitched together in Autodesk Remake had serious calculation issues caused by movement in our setting, so we gave it a second attempt.
In our second attempt, we propped my hands with tripods to ensure that they would be perfectly parallel to the floor. After the capture, we simply removed the tripods from the 3D file. The 3D model had a significantly higher level of quality and resolution than our first attempt.
After exporting the model as an FBX file to 3DS Max, we processed and rigged it for VR. The file is extremely high-poly (solid 3D models are composed of meshes of polygons; this one had hundreds of thousands of them). For optimization, we could manually reduce the polygon density in parts of the model. In a scene with multiple photogrammetry-captured characters, it would be essential to reduce the polygon resolution of as many models as possible. Additionally, I was able to process and touch up the generated photographic textures in Photoshop to ensure that lighting and shadows were even - especially across the front of the 3D model.
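To make that multi-character budgeting concrete: if several captured characters share a scene, each needs to shed polygons so the total stays within whatever the target hardware can comfortably render. A back-of-the-envelope sketch (all numbers invented for illustration):

```python
def decimation_ratio(poly_counts, budget):
    """Uniform reduction ratio so the scene's total polygon count fits the budget."""
    total = sum(poly_counts)
    return min(1.0, budget / total)  # never upscale past the original detail

# Three captured characters at roughly 300k polygons each, and a hypothetical
# 450k-polygon budget for characters in the scene:
ratio = decimation_ratio([300_000, 310_000, 290_000], 450_000)
print(round(ratio, 2))  # 0.5 - each model keeps half its polygons
```

In practice the reduction is done per-model in a tool like 3DS Max rather than uniformly, preserving detail in faces and hands while decimating clothing and backs, but the budget arithmetic is the same.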
In the future, photogrammetry will become easier as the means for positional calculation become more advanced. If we had captured our images with higher-resolution cameras and uncompressed file formats, we would have had fewer problems with compression and color depth. Perhaps new phones with multiple camera sensors will bring photogrammetry to the masses with easy-to-use tools for development and distribution.
You may have heard of a new app taking the smart phone gaming world by storm. Since Pokemon Go’s release, Nintendo’s valuation increased by over $9 billion. It quickly surpassed Twitter and Instagram in daily active users. On top of that, local businesses have capitalized on the growing trend by offering deals and discounts to users who play the game.
Everyone seems to be hopping on board the Pokemon Go train. What is it about this new game that is so engaging that it can have such a tremendous impact? If I had to venture a guess, it doesn't have anything to do with Nintendo's trademark yellow rodent.
At its core, "Pokemon Go" is an augmented reality smartphone game. The same technology that powers this wildly successful game has the capability to become an integral tool for architecture and construction over the next several years.
Augmented reality (AR) has become the hot new buzzword in the tech industry. Simply put, AR is a form of technology that superimposes computer-generated data on top of what a user would typically see in the real world. If AR turns out to be even half as exciting as I anticipate, it could revolutionize every industry from architecture and construction to retail and education.
Saying that Pokemon Go is an AR application is certainly not incorrect; the image that is displayed on the user’s smart phone is, in fact, computer generated and it is, in fact, overlaid on what the smart phone camera sees. However, I would argue that the whole idea of augmented reality is far more broad than what we have seen from this game. I was recently a guest on KCUR 89.3 to talk about AR, Pokemon Go, and how this new type of technology will change the world.
When we talk about AR as it can be used for practical (not gaming) applications, we are typically referring to hardware rather than software. Products like the Microsoft Hololens, Magic Leap, and the Meta headset are all exciting uses of AR technology that we will hear more about over the next few years. Companies like Apple, Google, and Samsung have been investing billions into new AR technology in anticipation of this new wave. AR has burst onto the tech scene and it is certainly here to stay.
The Microsoft Hololens is a head-mounted AR headset that overlays holograms onto what the user sees. It uses cameras to scan the user's surroundings so that it can overlay elements with careful precision. The image in the headset is projected onto a reflective transparent panel in front of the user's eyes. This enables the user to see through the image while also seeing the overlaid information.
In my opinion, the Hololens is a fantastic use of technology. The headset sits at a lofty $3,000 and the field of view is extremely small (imagine holding a deck of cards at arm’s length. That is roughly how large the overlay image is in the Hololens), but there is a massive amount of potential packed into this first generation headset. With the Hololens, a user can manipulate holograms and visual elements in their AR display by making gestures with their hands. NASA currently uses the Microsoft Hololens to communicate and direct the astronauts on the International Space Station.
The Meta 2 headset is another head-mounted AR system that is new to the scene. Boasting a 2560 x 1440 high-dpi display and a 90-degree field of view, the headset seems like a fantastic alternative to the Hololens.
While the Hololens is an untethered device that allows the wearer to roam around their room with near-perfect tracking, the Meta headset remains tethered to a PC. This allows for the Meta to have (arguably) greater flexibility with processing power, battery usage, display resolution, and cost. In my opinion, the future of AR won't involve cables.
There is very little public information about Magic Leap. After several rounds of investments, Magic Leap is currently valued at $4.5 billion. They are currently partnered with Lucasfilm’s ILMxLAB and they have received investments from giants such as Disney, Google, J.P. Morgan, and China’s e-commerce powerhouse Alibaba.
We can expect to hear more (or rather anything) about Magic Leap in the next year. Magic Leap is rumored to publicly unveil their new device at CES in early 2017. Expect a blog post either about how exciting the new hardware seems to be or about how disillusioned I was about Magic Leap.
What we know is that Magic Leap is creating an AR system that seems to overlay images with better quality than what is seen with the Microsoft Hololens or the Meta 2 headset. Some tech enthusiasts speculate that the Magic Leap system might literally project light onto the user’s eyes. It might also be as simple as a pair of glasses that the user wears in a non-obtrusive manner. Whatever the case may be, the visual quality seems to be miles beyond anything that we have seen from previous augmented reality headsets. We can expect to hear more about what exactly Magic Leap is and when it will be released over the next year. Surely, I won’t shut up about it any time soon.
Augmented reality will be the next big wave of technology after virtual reality. Over the next 5 years, we can reasonably expect to see a surge in investment and anticipation for new wearable computers. Honestly, I wouldn't be surprised if we replace computer monitors in the future with VR/AR systems that do a better job of providing real-time information. Before long, we may view "Pokemon Go" as an extremely primitive use of AR from a simpler time. Either that, or AR images will be projected directly onto our eyeballs - a concept that I find as intriguing as I do terrifying.
We launched our new website! Check it out at www.pulsedesigngroup.com.
The first time that you put a virtual reality headset on your face is a very memorable moment. The Matrix was released less than two decades ago, and already we're experiencing the same feeling that Neo had when he jacked into the Matrix for the first time. As I write this article, the first generation of consumer headsets by HTC and Oculus are being manufactured and delivered to eager enthusiasts and developers. An estimated 200 million VR headsets will be sold by 2020 (http://fortune.com/2016/01/21/200-million-vr-headsets-2020/). This year, millions of those consumers will experience their first "Neo moment."
For the next couple thousand words, I'll talk about where the virtual reality industry is, what to expect, and why I probably won't shut up about VR anytime soon. There are misconceptions and fears, and more speculation than both. I'll also tell you about the projects I've had the opportunity to work on at Pulse Design Group.
In 2011, Palmer Luckey created a head-mounted display (HMD) in his parents' garage using duct tape and cardboard. Soon after, he launched a Kickstarter campaign to begin developing a headset called "Oculus Rift." The Kickstarter campaign was wildly successful and reached ten times its original goal. In 2014, Facebook acquired Oculus for 2 billion dollars. Since then, Oculus has released two iterations of developer kit headsets.
In 2015, HTC announced that they were producing their own virtual reality headset. The news about their new product, the HTC Vive, was met with mixed reception. Why was HTC, a phone manufacturing company creating a VR headset? How would it compare to the Oculus Rift? Does anyone else think that 'Vive' is a silly name?
The HTC Vive, unlike the Oculus Rift, promised room-scale tracking, tracked hand controllers, and the "chaperone system," a system designed to prevent the user from walking into walls and objects in their real environment. Oculus will likely include each of these features in the future, but the Vive will ship with them when the first consumer version launches. The Oculus Rift currently costs $599; the HTC Vive costs $799.
As consumers begin receiving their new virtual reality headsets, we'll see more and more headlines about VR and new technology. This will be the first year that millions of people will experience virtual reality. Users will have the opportunity to walk through castle ruins in Germany, fly through a cityscape like an eagle, and experience space combat with a whole new level of immersion.
What could go wrong? These are the most common hesitations about virtual reality:
I get motion sick from roller coasters and airplanes. I'll probably get motion sick in virtual reality, so I won't try it.
I don't play video games, so I'm not interested in virtual reality.
How expensive!? This is cool stuff, but I won't buy in.
These are all very valid concerns. Let's go through each of them and dive into the details.
I get motion sick from roller coasters and airplanes. I'll probably get motion sick in virtual reality, so I won't try it.
The most common fear for first-time users of Virtual Reality is nausea. Most adults have experienced some sort of motion sickness from roller coasters and airplane turbulence, so wouldn't they also likely experience motion sickness from virtual roller coasters and airplanes?
When it comes to virtual reality, what we're concerned with is actually called "simulator sickness." Sim sickness was first discovered (what a miserable thing to discover, right?) when helicopter pilots would train in virtual simulations. Simply put, sim sickness comes from confusing movement rather than too much movement. This is definitely a valid concern. I've experienced sim sickness, and I understand that it is more than enough to prevent someone from ever trying VR a second time. If it makes you sick the first time, you'll probably never put a VR headset on again.
The most common cause of sim sickness is unanticipated movement. Sim sickness can also be caused by technical issues, such as low framerate, poor visual quality, and unnatural perception.
Most instances of sim sickness are easily preventable by the developer. If the user gets queasy, it is probably because of some disorienting factor in the program itself - and I say that as a developer. The proverbial Golden Rules for VR development are constantly changing, but here are some great resources for what not to do as a developer: 1, 2, 3.
I don't play video games, so I'm not interested in virtual reality.
Video games are certainly paving the way for virtual reality development. Unity and Unreal Engine are the two most common programs used to develop for virtual reality, and they're both video game engines. Aside from Oculus Rift and HTC Vive, Sony also recently opened pre-orders for their Playstation VR headset that exclusively targets gamers.
According to a report by Goldman Sachs, nearly 46% of virtual reality development by the year 2020 will be for video games. However, VR is expected to positively impact several other industries in new and exciting ways.
Virtual reality is being used to diagnose mental health conditions like PTSD and depression. Films are currently being produced exclusively for VR audiences. Athletes can train in new immersive ways. Architecture firms and real estate companies allow users to visualize buildings before construction begins. If we believe that VR will be limited to gaming, we'll be surpassed by our competitors in the industry who embrace the technology.
This is cool stuff, but I won't buy in to this.
You will need a PC capable enough to drive a VR headset, so between the PC and either the Oculus Rift or HTC Vive, you're looking at a starting price of nearly $1,500. A new VR-capable PC costs roughly $600, but you can expect those prices to drop over the next few years. Here are the listed recommended specs for the different headsets.
Moore's Law is the observation that, over the history of computing hardware, computing power has doubled roughly every two years. Additionally, improvements in rendering techniques such as foveated rendering and asynchronous timewarp will ultimately lower the hardware barrier to entry.
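As a rough illustration of why that hardware barrier should keep falling: a fixed doubling period means relative compute power grows exponentially with time. A quick sketch of the arithmetic (the two-year doubling is the classic rule of thumb, not a prediction):

```python
def projected_power(base_power, years, doubling_period=2.0):
    """Project relative compute power, assuming a doubling every `doubling_period` years."""
    return base_power * 2 ** (years / doubling_period)

# Relative power after 6 years at the classic two-year doubling period:
print(projected_power(1.0, 6))  # 8.0 - three doublings
```

If today's VR-ready PC is the baseline, the same rule of thumb says equivalent power should appear in far cheaper (and eventually mobile) hardware within a console generation or two.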
The first Apple Video iPod launched in 2005 for $399. If you've been waiting for that price to drop, congratulations! You can currently purchase one on eBay for as little as $50.
That was just for the first generation of Video iPods. Naturally, the price has dropped dramatically over time as manufacturing and hardware improved. We can reasonably expect the same thing to happen with virtual reality headsets.
These are all very valid concerns. If we can get over those three barriers, and if virtual reality reaches mass consumer adoption, then we can reasonably assume that VR will dramatically impact how we experience and interact with technology on a daily basis. If you're still with me, maybe you're already thinking about some great potential possibilities.
"Wouldn't it be great if…"
What if we could walk through buildings before they are constructed? At Pulse Design Group, we're able to virtually visualize exactly what a proposed space will look like with sub-millimeter precision and with an unprecedented level of interaction. Real estate companies are able to preview what a house will look like for potential buyers. It is not difficult to imagine that this will be the norm for architecture, real estate, and 3D design in the future.
With cameras and some clever photogrammetry trickery, developers can allow users to virtually explore real-world locations. This morning, I walked through Cluny Abbey in a program created by realities.io before taking a stroll across the surface of Mars.
VR can now be used in health care to help surgeons train for complex surgeries. Users can even walk through expanded views of internal organs - a concept that I find as exciting as I do icky.
With a Leap Motion sensor, a user is able to track their hands in virtual reality and interact with virtual objects and interfaces. Yes, just like Minority Report. We can imagine a future where employees will have virtual workspaces with floating monitors instead of typical cubicles with flat computer screens. This opens up new possibilities with shared remote workspaces and telepresence.
With programs like LectureVR and Altspace, users can even give virtual presentations and attend virtual seminars. Imagine attending a seminar about theoretical physics instructed by a virtual Albert Einstein or a presentation by Neil DeGrasse Tyson about astrophysics.
Films like Henry and Gary the Seagull are being created exclusively for VR audiences. The metaphorical book on VR filmmaking is still being written, but we can look forward to a greater level of interaction and presence in these new types of experiences. Imagine being in the scene standing next to the action stars as the story progresses around you.
"If that's all true, why couldn't we also…"
Couldn't we experience what it would be like to walk across the surface of alien planets? In some virtual reality programs, you can walk around on Mars and check out the International Space Station. Could we remotely work in offices across the world? Could we collaborate with other designers and developers in an entirely virtual environment?
Couldn't we also create 3D content in virtual reality? In Google's Tilt Brush, users can paint in 3D and share their art with other users. As a 3D artist, I would be far more productive if I could reach out my hand and control the position and orientation of every vertex on a geometric shape.
Perhaps we will be able to remotely control robots and machines; taking control of their arms and seeing what they see. Imagine the impact that this would have on the engineering, construction, health care, and military industries. Tele-presence and remote working would become common practices in our daily lives. This would have a truly profound impact on how companies operate in dangerous environments as companies are able to save time and money while completing hazardous tasks.
Naturally, there is a huge potential for investment in virtual reality. This technology could potentially disrupt every industry from gaming to architecture, and we can only speculate what it will look like in 10 years. The only things more common than VR startups are studies from economists that boast about how massive VR will be in the future.
I haven't even talked about augmented reality yet. AR is technology that superimposes computer generated information onto the user's vision. Seeing the potential for huge profit, there are several new products on the market and currently in production, like the Epson BT-300 smart glasses, Meta headset, Microsoft Hololens, and the Daqri Smart Helmet. We'll hear more about augmented reality in the next few years as production continues and the cost of production continues to lower. I'll follow up this post with a post about augmented reality and what it will mean for you and your work.
So where are we now?
We're seeing the rise of new entrepreneurs and new product ideas. The VR community is proving itself to be an invaluable resource to enthusiasts and developers, and I have had the opportunity to meet some of the most friendly and productive people in the industry.
I work as a virtual reality developer at the aforementioned Pulse Design Group, a healthcare architecture firm based in Kansas City. We develop VR experiences using Leap Motion, HTC Vive, and Oculus Rift. I also have the immense opportunity to introduce first-time users to virtual reality experiences. By using virtual reality as a communication and design tool, we're able to allow a user to walk through a proposed design before construction even begins. This allows us to make necessary changes and communicate design intent much earlier in the design process. As a result, clients have an unparalleled understanding of what their proposed space looks like, and there will be no chance for surprises when they enter a finished space that they previously viewed in VR.
What is currently going on behind the scenes in the industry?
Apple has acquired as many as 23 different VR and AR companies, all of which are now completely silent about what they are developing. I think it is fair to assume that they are developing something, and that we'll hear about it soon (also, that something will be white and chrome). Google is continuing to develop Tango, a system able to scan real spaces and recreate them in VR with immense clarity. Magic Leap, a promising company that has raised more than $2.5 billion to create a new augmented reality product, has released little to no information about it. We can expect to have a collective "whoa" moment as they begin releasing information in the coming years.
We're in the future. Buckle up.
Everyone in the virtual reality community is extremely friendly and welcoming. If you want to experience virtual reality for yourself and hear more about what is going on in your local community, find a local meetup group or get in touch with a local developer. Your best resource is the people around you. In Kansas City, I have the opportunity to co-organize the KCVR meetup group alongside Andrew London. As VR becomes the norm, I expect that communities like this will become breeding grounds for innovation, development, and collaboration. If you're ever in Kansas City, get in touch and check out our virtual reality lab. We would love to show you what we have been up to!