
Shawnee Mission Health Hybrid Operating Room Opens to Patients

The Shawnee Mission Health Hybrid Operating Room opened to patients on February 7th.  

The 3,550-square-foot surgical space, the largest hybrid operating room in Kansas City, was converted from an outdated clinical area and designed to enhance patient safety and outcomes. The multidisciplinary space allows healthcare professionals from different specialties to treat patients in the same location.

With the help of virtual reality, Pulse Design Group developed a flexible design for the Shawnee Mission Health Hybrid Operating Room that could be modified in the field to manage the equipment required for an operating room, along with the technology for a radiology room.

See MetroWire Media's coverage for more insight on the Shawnee Mission Health Hybrid Operating Room.

Implications for health systems regarding USP 797/800 compliance standards

Millions of prescriptions are compounded by pharmacists, nurses and doctors each year in the United States to meet the unique needs of patients who otherwise may not have access to the required medicine in the right concentration or dosage. Understanding of the inherent risks of compounding and incorporating established USP standards into everyday practices is essential for patient and staff safety.

The United States Pharmacopeia recently established new standards, USP 797 and USP 800, covering workroom air pressure requirements, specialized workflows, isolation measures and sterility conditions related to compounding. What do these new standards involve, and what do they mean for healthcare organizations?

USP 797 helps to ensure patients receive quality preparations that are free from contaminants and are consistent in intended identity, strength and potency. It describes a number of requirements including responsibilities of compounding personnel, training, environmental monitoring, storage and testing of finished preparation.

USP 800 provides standards for safe handling of hazardous drugs to minimize the risk of exposure to healthcare personnel, patients and the environment. USP 800 deals with product transport, product storage, compounding, preparation, and administration of products.

The regulations' exact enforcement varies by state; however, current USP mandates require that compliant facilities and practices be implemented by December 31, 2019. These new standards affect public- and private-sector pharmacies and will, in many cases, require substantial capital investment in infrastructure and personnel.

The University of Kansas Health System Pharmacy 


USP 797 and 800 standards have pushed architectural firms to devise creative design solutions that help health systems meet compliance while remaining mindful of the organization's bottom line. One effective design solution for clean room pharmacy compliance and upgrades is the specialized pass-through, commonly used in pharmacies for drug preparation. Clean rooms are pressurized and sterile to ensure that drugs are safe. Pass-throughs are two-sided cabinets built into a wall to connect the pharmacy and the clean room, allowing the transfer of supplies and drugs without jeopardizing sterility. When done properly, only one door of the pass-through can be open at a time, which keeps the pressurized system intact and prevents contaminants from entering the clean room.
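The one-door-at-a-time rule is essentially a small interlock state machine. Here is a minimal sketch in Python; the class and method names are illustrative only and are not taken from any real pass-through controller:

```python
class PassThrough:
    """Toy model of an interlocked two-door pass-through cabinet."""

    def __init__(self):
        # Which side is currently open: None, "pharmacy", or "cleanroom".
        self.open_side = None

    def open_door(self, side):
        # The interlock refuses to open a door while the opposite door
        # is open, keeping the pressure differential between rooms intact.
        if self.open_side is not None and self.open_side != side:
            return False
        self.open_side = side
        return True

    def close_door(self, side):
        if self.open_side == side:
            self.open_side = None


pt = PassThrough()
assert pt.open_door("pharmacy")        # pharmacy side opens
assert not pt.open_door("cleanroom")   # interlock blocks the other side
pt.close_door("pharmacy")
assert pt.open_door("cleanroom")       # now the clean-room side can open
```

A real interlock is, of course, implemented mechanically or in the cabinet's control electronics; the sketch only illustrates the rule that protects the pressurized envelope.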

In the past, most pass-throughs were constructed of stainless steel and relied on mechanical interlocks, but newly developed systems effectively utilize solid surface. The advantages of solid surface include its nonporous, bacteria-resistant, stain-resistant and durable properties, which make it well-suited for sterile environments. It can also be fabricated fully seamless for easier cleaning and is typically less expensive to fabricate than stainless steel. Moreover, the material's flexibility allows electrical fixtures to be integrated more easily than with other products.

The Joint Commission will enforce penalties on health systems that do not meet compliance standards by the established date. See the Joint Commission's article for more information regarding USP 797 and 800 compliance and the benefits of obtaining Medication Compounding Certification.

The extent of pharmacy design modifications depends heavily on how compliant the current space is. Not only will the pharmacy's design and operations need to be modified to adhere to current regulations; in many cases the footprint must also grow significantly. Implementing these measures can take considerable time, and procedures and operations will likely be affected. Health systems should prepare compliance plans early to ensure a smooth transition.

The University of Kansas Health System Pharmacy 


Experimenting with Apple's ARKit



Augmented Reality (AR) is coming quickly to your pocket. This year's developer conferences for Google, Facebook, and Apple all had a heavy focus on AR and VR technology; it is a safe assumption that augmented reality will be the next "mobile."

At the Apple Worldwide Developers Conference (WWDC) 2017 in June, Apple introduced a new software development kit that enables developers to turn iPhones and iPads into augmented reality devices without the burden of implementing a custom Simultaneous Localization and Mapping (SLAM) solution. In other words, Pokemon Go is just the beginning. There are approximately 300,000 iOS app developers in the United States market alone, and making augmented reality easy to build for will greatly expand the Apple platform.

Pulse Design Group is focused on the comprehensive field of immersive computing, from Virtual Reality to Augmented Reality and Mixed Reality. We started building with the toolkit immediately to investigate potential use cases and to get a better understanding of feasibility and timelines. There are great possibilities for Augmented Reality on iOS devices, and remarkable use cases for training simulation and architectural applications. The following is one of our early proofs of concept, along with a brief rundown of the process.


In this GIF you can see how the iPad is able to move freely around the room while presenting content that appears to persist in the virtual space. Apple accomplishes this through what they call "visual-inertial odometry," which blends and cross-references input from the camera with the internal motion sensors in the mobile device.
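For intuition, visual-inertial blending can be thought of as weighting two noisy position estimates against each other. The toy filter below is a deliberate simplification, not Apple's implementation (real VIO systems are closer to an extended Kalman filter); the function name and weights are illustrative:

```python
def fuse_position(camera_pos, imu_pos, camera_weight=0.7):
    """Blend a camera-derived position estimate with an IMU
    dead-reckoning estimate, axis by axis. The camera estimate is
    accurate but intermittent; the IMU estimate is fast but drifts.
    A weighted average conveys the basic idea of cross-referencing
    the two noisy sources."""
    return [camera_weight * c + (1 - camera_weight) * i
            for c, i in zip(camera_pos, imu_pos)]


# Camera says the device is at (1.0, 0.0, 2.0); the IMU's integrated
# estimate has drifted slightly to (1.2, 0.0, 2.1).
fused = fuse_position([1.0, 0.0, 2.0], [1.2, 0.0, 2.1])
print(fused)  # blended estimate, weighted toward the camera
```

The design point is simply that neither sensor alone suffices: the camera anchors the estimate to the world, and the IMU fills in between camera frames.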

The current hardware and software requirement is any iPhone or iPad with an A9 processor or better, and iOS 11. We installed the update and got started! We used the Unity development engine in this project due to its current support for ARKit. Below are the techie details for how we implemented one of our models into an ARKit project.

A current version of Unity must be installed to begin ARKit development. After installing Unity, create a new project, then download the ARKit plugin.


Now the fun begins!

Initially, we took one of our existing architectural models of an operating room and imported the .fbx file from Autodesk 3ds Max into one of the example ARKit scenes provided with the plugin, which worked immediately! One difficult thing to account for is how ARKit will read real-world space, so it is necessary to package and export your project and run the software on compatible hardware. It is hard to get an exact sense of what is happening in the app from the Unity viewport/simulator alone. The ARKit plugin for Unity performs most of its magic in relation to the camera viewport, which doesn't exactly come across in the editor viewport.

At this point, we built and ran the program on the device and found that our model was inhabiting a surface plane that had been detected and properly used by ARKit! However, this was not enough, and we quickly realized that we needed a simple control scheme to navigate the model. We concluded that the best method was the tried-and-true "pinch to zoom" and "tap to place" control scheme. To do this, we used a C# library called "LeanTouch," which allowed us to define this behavior in Unity. One issue we ran into was separating a "one-finger tap" from a "two-finger pinch." Users typically do not lift both pinching fingers from the screen at exactly the same time, leaving a brief moment where one finger is still down; Unity registers that as a "one-finger tap" and moves the model to the last registered tap. We currently get around this by requiring three fingers for tap-to-place, so it does not interfere with the pinch-to-scale gesture.
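The routing logic described above is simple enough to sketch outside the engine. Our actual implementation lives in C# with LeanTouch inside Unity; the Python below is only an illustration of the decision rule, and the function name and return labels are invented for this sketch:

```python
def classify_gesture(max_fingers):
    """Route a completed touch interaction to an action, based on the
    maximum number of simultaneous fingers seen during the interaction.

    Requiring three fingers for 'place' sidesteps the ambiguity
    described above: when a two-finger pinch ends, one finger usually
    lifts slightly later than the other, which would otherwise register
    as a one-finger tap and teleport the model."""
    if max_fingers >= 3:
        return "place_model"
    if max_fingers == 2:
        return "pinch_scale"
    return "ignore"  # stray single-finger touches do nothing


assert classify_gesture(2) == "pinch_scale"
assert classify_gesture(1) == "ignore"   # the lingering finger after a pinch
assert classify_gesture(3) == "place_model"
```

The design choice is to make the two gestures disjoint by finger count, so no timing heuristics are needed to tell them apart.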

Special note: Pulse Design Group has a branded RGB shade of green used for all promotional marketing and other visual elements. By simply finding and changing the RGB value of the particle emission system, we were able to customize the look of the application to fit our company's branding.

An ARKit demo that features a flyover map application.


The result is an impressive demo to show and introduce ARKit to our clients. Additionally, we are able to showcase our work in a 1:1 scale, or in a smaller scale dollhouse view, as long as we are using an iOS device!

At Pulse Design Group, we will continue to improve the ARKit demo, add functionality and interaction, investigate the best possible use-case for the software, and build internal concepts. ARKit is very promising and appears to make AR development much easier than it was just three months ago.  

To learn more or further discuss AR technology, contact Andrew London or Steve Biegun.



Mixed Reality


It's difficult to really understand what virtual reality is like without putting a headset on your face. Mixed reality video, a combination of physical and virtual image capture, is probably the best means to explain what a VR experience is like without the use of a headset.

Let's talk about what that means and how we're utilizing exciting new technology like the Vive Tracker to improve communication with our audience.


Photogrammetry: About As Real As It Gets

I am standing in the auditorium of an old abandoned German hospital. I can see dust across the room as it falls through a beam of light coming through a window. Above me, I can see the rafters of the abandoned architecture as if they were actually thirty feet above me. Scuff marks, upended tiles, and piles of dirt across the floor indicate that the building has been uninhabited for several decades. An old abandoned piano sits in the center of the room. What I see around me is unequivocally real. Well, as real as photographic reality capture can get. The program hosts recreations of several different environments from across the globe.


Simply put, "photogrammetry" is the science of making measurements from photographs. Over the past few years, photogrammetry has become a popular method for recreating real objects, locations, and people as 3D models. Recognizing that the lines between real and virtually real blur more every day, we set out to improve our photogrammetry capabilities so we could bring real elements into less-than-real environments. Realistically captured elements could be used in architecture to show complicated designs in virtual reality, or to preserve past creations as you would with a photograph.
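As a toy illustration of "measurements from photographs": with a reference object of known size in frame, pixel lengths convert to real lengths by a simple scale factor. All numbers and names below are made up for the example, and real photogrammetry pipelines triangulate across many photos rather than using one flat image:

```python
def measure(pixel_length, reference_pixel_length, reference_real_length):
    """Estimate an object's real size from a photo that also contains
    a reference object of known size, assuming both objects lie in the
    same plane at the same distance from the camera."""
    scale = reference_real_length / reference_pixel_length  # meters per pixel
    return pixel_length * scale


# A doorway known to be 2.0 m tall spans 400 px in the photo;
# a piano next to it spans 260 px.
piano_height = measure(260, 400, 2.0)
print(round(piano_height, 2))  # 1.3
```

Modern software generalizes this idea to thousands of correspondences across overlapping photos, recovering full 3D geometry instead of a single dimension.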

Before discussing how we went through this process, let's talk about drawbacks. The photo capture process can take a significant amount of time. Some photos may need to be eliminated, retouched, or entirely removed from the series of photos in order for the final photographic recreation to be realistic. Additionally, the 3D mesh that is constructed by the process can have significant errors, artifacts, and glitches. Some objects are easier to capture than others; reflective surfaces, people, and objects shot with moving backgrounds are notoriously difficult to recreate. That being said, the software used to stitch these objects together is rapidly improving.

Our goal was to use photogrammetry to recreate myself in VR as a 3D object, so that the model could be used as a stand-in character for realistic simulations. Rather than using mannequins, cut-out character silhouettes, or stylized video game-y assets, we thought that realistically captured 3D figures would have a greater impact.

Don't worry, it gets weirder.



In this article by Tested, the crew assembles a series of flat lights that provide clean, even, and balanced lighting on the model's face. For reality capture, you want your original 3D model to be lit as flat as possible. Lights, shadows, and reflections are all calculated in the VR simulation program (Unreal Engine, Unity, Cryengine, etc.)

Avoid reflections, bright lights, and moving elements. Any sort of movement in the scene can cause issues with the photo calculation. Basically, avoid everything that we did in our first attempt:

In our first attempt, Callum captured close to 200 pictures from varying angles around the room. The most difficult part of this attempt was getting me to keep my arms parallel to the floor for the duration of the shoot - a much more difficult feat than I had imagined. I did not initially understand how difficult it can be to remain perfectly still for several minutes. Optimally, the person being captured should not blink, talk, or turn their head. Breathing is acceptable though not encouraged. The model that was stitched together in Autodesk Remake had serious calculation issues caused by movement in our setting, so we gave it a second attempt. 


In our second attempt, we propped my hands with tripods to ensure that they would be perfectly parallel to the floor. After the capture, we simply removed the tripods from the 3D file. The 3D model had a significantly higher level of quality and resolution than our first attempt.

After exporting the model as an FBX file into 3ds Max, we processed and rigged it for VR. The file is extremely high-poly (3D models are composed of polygon meshes; this one had hundreds of thousands of polygons). For optimization, we could manually reduce quality in parts of the model. In a scene with multiple photogrammetry-captured characters, it would be essential to reduce the polygon resolution of as many models as possible. Additionally, I was able to process and touch up the captured photographic textures in Photoshop to ensure that lighting and shadows were even, especially across the front of the 3D model.
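When several captured characters share a scene, it helps to think in terms of an explicit triangle budget before deciding how aggressively to decimate each model. A back-of-the-envelope sketch, with purely illustrative numbers rather than real engine limits:

```python
def per_model_budget(total_budget, num_models, reserved_for_environment):
    """Split a scene's triangle budget evenly across character models
    after reserving a share for the environment geometry. Any model
    above this figure is a candidate for polygon reduction."""
    remaining = total_budget - reserved_for_environment
    return remaining // num_models


# e.g., a 2,000,000-triangle frame budget, 800,000 reserved for the
# room itself, shared among 4 photogrammetry-captured characters:
print(per_model_budget(2_000_000, 4, 800_000))  # 300000
```

Against a budget like this, a raw capture with hundreds of thousands of triangles clearly needs decimating before more than one or two can share a frame.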

In the future, photogrammetry will become easier as the means for positional calculation become more advanced. If we had captured our images with higher-resolution cameras and uncompressed file formats, we would have had fewer problems with compression and color depth. Perhaps new phones with multiple camera sensors will bring photogrammetry to the masses with easy-to-use tools for development and distribution.


Website Launch

We launched our new website! Check it out.

Virtual Reality?


Virtual Reality?

The first time you put a virtual reality headset on your face is a very memorable moment. The Matrix was released less than two decades ago, and already we're experiencing the same feeling Neo had when he jacked into the Matrix for the first time. As I write this article, the first generation of consumer headsets by HTC and Oculus are being manufactured and delivered to eager enthusiasts and developers. An estimated 200 million VR headsets will be sold by 2020. This year, millions of those consumers will experience their first "Neo moment."

For the next couple thousand words, I'll talk about where the virtual reality industry is, what to expect, and why I probably won't shut up about VR anytime soon. There are misconceptions and fears, and even more speculation than both. I'll also tell you about the projects I've had the opportunity to work on at Pulse Design Group.


In 2011, Palmer Luckey created a head-mounted display (HMD) in his parents' garage using duct tape and cardboard. Soon after, he launched a Kickstarter campaign to develop a headset called "Oculus Rift." The campaign was wildly successful, reaching ten times its original goal. In 2014, Facebook acquired Oculus for $2 billion. Since then, Oculus has released two iterations of developer-kit headsets.

In 2015, HTC announced that they were producing their own virtual reality headset. The news about their new product, the HTC Vive, was met with mixed reception. Why was HTC, a phone manufacturer, creating a VR headset? How would it compare to the Oculus Rift? Does anyone else think "Vive" is a silly name?

To be fair, the tech industry certainly has no shortage of silly names.


The Oculus Touch hand controllers, slated for a 2016 release, are extremely comfortable and easy to use.


The HTC Vive, unlike the Oculus Rift, promised room-scale tracking, tracked hand controllers, and the "chaperone system," which is designed to prevent the user from walking into walls and objects in their real environment. Oculus will likely add each of these features in the future, but the Vive will include them when its first consumer version launches. The Oculus Rift currently costs $599; the HTC Vive costs $799.

As consumers begin receiving their new virtual reality headsets, we'll see more and more headlines about VR and new technology. This will be the first year that millions of people experience virtual reality. Users will have the opportunity to walk through castle ruins in Germany, fly through a cityscape like an eagle, and experience space combat with a whole new level of immersion.

What could go wrong? These are the most common hesitations about virtual reality:

I get motion sick from roller coasters and airplanes. I'll probably get motion sick in virtual reality, so I won't try it.

I don't play video games, so I'm not interested in virtual reality.

How expensive!? This is cool stuff, but I won't buy in.

These are all very valid concerns. Let's go through each of them and dive into the details.

I get motion sick from roller coasters and airplanes. I'll probably get motion sick in virtual reality, so I won't try it.

The most common fear for first-time users of Virtual Reality is nausea. Most adults have experienced some sort of motion sickness from roller coasters and airplane turbulence, so wouldn't they also likely experience motion sickness from virtual roller coasters and airplanes?

When it comes to virtual reality, what we're concerned with is actually called "simulator sickness." Sim sickness was first discovered (what a miserable thing to discover, right?) when helicopter pilots trained in virtual simulations. Simply put, sim sickness comes from confusing movement rather than too much movement. This is definitely a valid concern. I've experienced sim sickness myself, and I understand why it matters: if VR makes you sick the first time, you'll probably never put a headset on again.

The most common cause of sim sickness is unanticipated movement. Sim sickness can also be caused by technical issues, such as low framerate, poor visual quality, and unnatural perception. 

Most instances of sim sickness are easily preventable by the developer. If the user gets queasy, it is probably because of some disorienting factor in the program itself - and I say that as a developer. The proverbial Golden Rules for VR development are constantly changing, but here are some great resources for what not to do as a developer: 1, 2, 3.

I don't play video games, so I'm not interested in virtual reality.

Eva Hoerth, VR Design Researcher and Community Builder at Ratlab LLC.


Video games are certainly paving the way for virtual reality development. Unity and Unreal Engine are the two most common programs used to develop for virtual reality, and they're both video game engines. Aside from Oculus Rift and HTC Vive, Sony also recently opened pre-orders for their Playstation VR headset that exclusively targets gamers.

According to a report by Goldman Sachs, nearly 46% of virtual reality development by the year 2020 will be for video games. However, VR is expected to positively impact several other industries in new and exciting ways.

Virtual reality is being used to diagnose mental health conditions like PTSD and depression. Films are currently being produced exclusively for VR audiences. Athletes can train in new immersive ways. Architecture firms and real estate companies allow users to visualize buildings before construction begins. If we believe that VR will be limited to gaming, we'll be surpassed by our competitors in the industry who embrace the technology.

This is an image of a Hybrid Operating Room that we created in virtual reality. Users are able to interact with their environment and use communication and design tools in the virtual space.


How expensive!? This is cool stuff, but I won't buy in.

From left to right: Sony's Playstation VR, Oculus Rift, and HTC Vive.


You will need a PC capable enough to drive a VR headset, so you're looking at a starting price of nearly $1,500 for either the Oculus Rift or HTC Vive. A new VR-capable PC costs roughly $600 and up, but you can expect those prices to drop over the next few years. Here are the listed recommended specs for the different headsets.

Moore's Law is the observation that, over the history of computing hardware, computing power has doubled roughly every two years. Additionally, improvements in rendering techniques such as foveated rendering and asynchronous timewarp will ultimately lower the bar for hardware requirements.
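That doubling compounds quickly; a one-line calculation makes it concrete:

```python
def compute_growth(years, doubling_period=2):
    """Projected growth factor if capability doubles every
    `doubling_period` years (the classic Moore's Law cadence)."""
    return 2 ** (years / doubling_period)


# Over a decade, doubling every two years compounds to a 32x increase:
print(compute_growth(10))  # 32.0
```

So even if today's VR-ready PC seems expensive, the hardware needed to hit the same performance bar gets dramatically cheaper within a single headset generation.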



The first Apple Video iPod launched in 2005 for $399. If you've been waiting for that price to drop, congratulations! You can currently purchase one on eBay for as little as $50.

That was just for the first generation of Video iPods. Naturally, the price has dropped dramatically over time as manufacturing and hardware improved. We can reasonably expect the same thing to happen with virtual reality headsets.

These are all very valid concerns. If we can get over those three barriers, and if virtual reality reaches mass consumer adoption, then we can reasonably assume that VR will dramatically impact how we experience and interact with technology on a daily basis. If you're still with me, maybe you're already thinking about some great potential possibilities.

"Wouldn't it be great if…"

What if we could walk through buildings before they are constructed? At Pulse Design Group, we're able to virtually visualize exactly what a proposed space will look like with sub-millimeter precision and with an unprecedented level of interaction. Real estate companies are able to preview what a house will look like for potential buyers. It is not difficult to imagine that this will be the norm for architecture, real estate, and 3D design in the future.

With cameras and some clever photogrammetry trickery, developers can allow users to virtually explore real-world locations. This morning, I walked through Cluny Abbey before taking a stroll across the surface of Mars.

VR can now be used in health care to help surgeons train for complex surgeries. Users can even walk through expanded views of internal organs - a concept that I find as exciting as I do icky.

With a Leap Motion sensor, a user is able to track their hands in virtual reality and interact with virtual objects and interfaces. Yes, just like Minority Report. We can imagine a future where employees will have virtual workspaces with floating monitors instead of typical cubicles with flat computer screens. This opens up new possibilities with shared remote workspaces and telepresence.

With programs like LectureVR and Altspace, users can even give virtual presentations and attend virtual seminars. Imagine attending a seminar about theoretical physics instructed by a virtual Albert Einstein, or a presentation by Neil deGrasse Tyson about astrophysics.

Films like Henry and Gary the Seagull are being created exclusively for VR audiences. The metaphorical book on VR filmmaking is still being written, but we can look forward to a greater level of interaction and presence in these new types of experiences. Imagine being in the scene standing next to the action stars as the story progresses around you. 

"If that's all true, why couldn't we also…"

Couldn't we experience what it would be like to walk across the surface of alien planets? In some virtual reality programs, you can walk around on Mars and check out the International Space Station. Could we remotely work in offices across the world? Could we collaborate with other designers and developers in an entirely virtual environment?

Couldn't we also create 3D content in virtual reality? In Google's Tilt Brush, users can paint in 3D and share their art with other users. As a 3D artist, I would be far more productive if I could reach out my hand and control the position and orientation of every vertex on a geometric shape.

Personally, I felt like 3D painting was more akin to sculpting than the method of traditional painting.


Life continues to imitate sci-fi.


Perhaps we will be able to remotely control robots and machines, taking control of their arms and seeing what they see. Imagine the impact this would have on the engineering, construction, health care, and military industries. Telepresence and remote working would become common practices in our daily lives. This would have a truly profound impact on how companies operate in dangerous environments, saving time and money while completing hazardous tasks.


Naturally, there is a huge potential for investment in virtual reality. This technology could potentially disrupt every industry from gaming to architecture, and we can only speculate what it will look like in 10 years. The only things more common than VR startups are studies from economists that boast about how massive VR will be in the future.

I haven't even talked about augmented reality yet. AR is technology that superimposes computer-generated information onto the user's vision. Seeing the potential for huge profit, several new products are on the market or in production, like the Epson BT-300 smart glasses, Meta headset, Microsoft HoloLens, and the Daqri Smart Helmet. We'll hear more about augmented reality in the next few years as production scales up and costs continue to fall. I'll follow up this post with one about augmented reality and what it will mean for you and your work.

So where are we now?

We're seeing the rise of new entrepreneurs and new product ideas. The VR community is proving itself to be an invaluable resource to enthusiasts and developers, and I have had the opportunity to meet some of the most friendly and productive people in the industry.

I work as a virtual reality developer at the aforementioned Pulse Design Group, a healthcare architecture firm based in Kansas City. We develop VR experiences using Leap Motion, HTC Vive, and Oculus Rift, and I have the opportunity to introduce first-time users to virtual reality. By using VR as a communication and design tool, we can let a user walk through a proposed design before construction even begins. This allows us to make necessary changes and communicate design intent much earlier in the design process. As a result, clients have an unparalleled understanding of what their proposed space will look like, and there are no surprises when they enter a finished space they previously viewed in VR.

What is currently going on behind the scenes in the industry?

Apple has acquired as many as 23 different VR and AR companies, all of which are now completely silent about what they are developing. I think it is fair to assume that they are developing something, and that we'll hear about it soon (also, that something will be white and chrome). Google continues to develop its Tango system, which can track spaces and recreate them in VR with immense clarity. Magic Leap, a promising company that has raised more than $2.5 billion to create a new augmented reality product, has released little to no information about it. We can expect a collective "whoa" moment as they begin releasing details in the coming years.

We're in the future. Buckle up.

Everyone in the virtual reality community is extremely friendly and welcoming. If you want to experience virtual reality for yourself and hear more about what is going on in your local community, find a local meetup group or get in touch with a local developer. Your best resource is the people around you. In Kansas City, I have the opportunity to co-organize the KCVR meetup group alongside Andrew London. As VR becomes the norm, I expect that communities like this will become breeding grounds for innovation, development, and collaboration.  If you're ever in Kansas City, get in touch and check out our virtual reality lab. We would love to show you what we have been up to!