We have all heard of a CT or an MRI scan at some point in our lives, whether we have personally been scanned by the donut-shaped contraption or have simply heard about it in casual conversation with family or friends. We may have a vague idea of what it does, but it usually stops there. I mean… why would you want to know more? Unless you're a doctor who needs to know how and where to use it, or a technician trying to figure out how this strange machine actually works. Even then, you would probably never have gone through a simulation of the workings of this healthcare diagnostic equipment.

This is where we come to the problem statement we had to address. 

Be it a doctor or a technician trying to find their way around the usability and intricacies of the machine, most of the time they are limited to books, printed documentation, and perhaps a few videos. Rarely do they get a chance to practice on an actual machine. But this behemoth is not an average piece of hardware that colleges or professional institutes can afford for their students to train on, and hospitals can't spare it from active billable usage. So when these novice users get their hands on the actual machine at their first job or internship, having to do their job on such an expensive piece of hardware is an overwhelming experience. It's akin to a friend handing you the keys to their new Ferrari and asking you to take it for a test drive. Excitement, fear, and anxiety would probably top the list of emotions when you first get the keys. "Over 9000!!!", to sum it all up (to all the DBZ fans out there).

The solution to this problem was staring us right in the face:

Virtual Reality! 

VR = Complete immersion + Unlimited freedom of expression + No breakage charges. 


That’s what we built.

Challenges were plenty! Given an aggressive timeline, we had to come up with something that looked good and was also intuitive to use. Just as Blizzard Entertainment's products ooze their core principle, "Gameplay first", Xarpie Labs strives to uphold its core values: "Decisive, Concise, Clear."

Having said that, we had to crunch those values into a small, tangible POC that demonstrated our capability and enabled institutions to onboard field-ready technicians onto critical, expensive equipment.

We broke down the experience of simulation into two main categories:

  1. An operator module, for getting oriented with how to use the CT scan machine, and 
  2. A maintenance module, for technicians who wanted or needed practice. 

Both main categories had two sub-categories: "Training" and "Evaluation".

We followed our standard experience design template, “Introduction – Tutorial – Module – Experience – Result – Exit”.
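The template above can be sketched as a simple linear state progression. A minimal Python sketch, using the stage names from the template (the transition logic itself is an assumption for illustration):

```python
# Minimal sketch of the experience-design template as a linear state machine.
STAGES = ["Introduction", "Tutorial", "Module", "Experience", "Result", "Exit"]

def next_stage(current: str) -> str:
    """Advance to the next stage in the template; 'Exit' is terminal."""
    i = STAGES.index(current)
    return STAGES[min(i + 1, len(STAGES) - 1)]
```

In practice each stage would carry its own scene logic, but the flow itself stays a strict forward sequence ending at Exit.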

Recipe to our secret sauce

Immersion = What you see (visual fidelity) + what you hear (3D spatial audio, hopefully raytraced in the future) + how you interact (input: try to integrate natural form factors) + what you feel (feedback).

Here’s where things get technical

The environment and props were built on Unity's High Definition Render Pipeline, using standard and custom shaders along with extensive post-processing.

Hitting 90+ fps, rendered per eye, is no joke, even with John Carmack's brilliant asynchronous timewarp.

In my view, the key to success in VR = high-fidelity visuals on lower-cost hardware + a high frame rate.
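To make the frame-rate pressure concrete, a quick budget calculation (illustrative only): at a given target refresh rate, the renderer has a fixed slice of time to produce the entire frame, both eyes included.

```python
def frame_budget_ms(target_fps: int) -> float:
    """Time available to render one full frame (both eye views) at a target rate."""
    return 1000.0 / target_fps

# At 90 fps there are only ~11.1 ms per frame, shared across both eye renders,
# which is why draw-call and fill-rate optimization matter so much in VR.
```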

Material optimization to reduce draw calls, by maximizing texture atlasing and limiting unique shaders, is something we strongly follow for non-tileable textures. We also intend to optimize further for tileable textures by indexing them. Favoring mesh detail over alpha cards, wherever possible, cuts overdraw and fill-rate cost further. For speed and quality, we use baked raytraced lightmaps combined with light and reflection probes.
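Texture atlasing earns its draw-call savings by packing many small textures into one sheet so that meshes can share a single material. The core of it is just UV remapping; a minimal sketch, assuming a uniform grid atlas (the grid layout and function names are illustrative, not our actual pipeline):

```python
def remap_uv(u: float, v: float, tile_index: int,
             tiles_per_row: int, tile_size: float) -> tuple:
    """Map a local (u, v) in [0, 1] into the sub-rectangle of one atlas tile.

    tile_size is the tile's fraction of the atlas width/height, e.g. 0.25
    for a 4x4 grid. Meshes remapped this way can share one material,
    letting the engine batch them into fewer draw calls.
    """
    col = tile_index % tiles_per_row
    row = tile_index // tiles_per_row
    return ((col + u) * tile_size, (row + v) * tile_size)
```

Real atlas packers use variable-size rectangles and padding to avoid bleeding between tiles, but the remapping idea is the same.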

Alpha sorting was a challenge early on while we were building 3D VR menus, an issue we later sorted out by setting the render queue (draw order) in the shader. But if it's not planned properly, you end up overdrawing things on top of objects, something you did not intend to do.
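The classic remedy behind that draw-order fix is rendering alpha-blended objects back-to-front relative to the camera, so that blending accumulates correctly. A minimal sketch (the object tuples and camera representation are assumptions for illustration):

```python
def sort_transparent(objects, camera_pos):
    """Sort alpha-blended objects back-to-front (farthest first).

    objects: list of (name, (x, y, z)) tuples; camera_pos: (x, y, z).
    Squared distance is enough for ordering, so no sqrt is needed.
    """
    def dist_sq(obj):
        _, (x, y, z) = obj
        cx, cy, cz = camera_pos
        return (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2
    return sorted(objects, key=dist_sq, reverse=True)
```

Engines typically do this per frame for the transparent queue; pinning a draw order in the shader, as we did for the menus, works when the layering is known ahead of time.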

We used neural text-to-speech voices to guide users through the experience. This added another dimension to the immersion, on top of the realistic visuals we were able to generate.

The Xarpie Labs’ value = Post gimmick takeaway + tangible improvement in performance + replay value.

The business value

An average MRI machine can be down for 30-60 hours every year. Each scan takes about an hour and bills approximately USD 2,000. Over a year, a hospital could therefore lose about USD 60,000-120,000 in revenue per machine due to downtime. If we conservatively assume 10% of that downtime is attributable to training (on-machine sessions plus related repairs and maintenance, excluding the cost of the repairs themselves), the addressable loss comes to roughly USD 6,000-12,000, or about USD 9,000 per machine per year. And that is the estimate for a single machine. Multiply it by the number of machines and technicians hospitals invest in across the growing healthcare industry, and the market opportunity for VR training for operators and maintenance teams suddenly grows to staggering numbers.
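The arithmetic behind those figures, spelled out (the 10% training share is the assumption stated above):

```python
def training_opportunity(down_hours_low, down_hours_high,
                         cost_per_hour, training_share):
    """Per-machine annual revenue loss from downtime, and the slice
    attributable to training (evaluated at the midpoint of the range)."""
    loss_low = down_hours_low * cost_per_hour    # e.g. 30 h * $2,000
    loss_high = down_hours_high * cost_per_hour  # e.g. 60 h * $2,000
    midpoint = (loss_low + loss_high) / 2
    return loss_low, loss_high, midpoint * training_share

# 30-60 h/yr downtime at ~$2,000 per scan-hour -> $60k-$120k lost per year;
# 10% attributed to training -> ~$9,000 opportunity per machine per year.
```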

In our simulation program, user-generated actions and other variables were tracked and recorded in the backend. The idea was to run analytics on the recorded data to generate user-centric reports for trainees and a comprehensive report for the manager. This should improve efficiency in operator onboarding and in assessing whether a technician is machine-ready. Immersive knowledge transfer and timely operator onboarding can reduce the loss on each machine. We are optimistic it has the potential to improve such processes by at least 30% in terms of savings from damage, mishandling, and loss of business due to attrition of experienced operators, field representatives, etc. This number could increase significantly if we were to factor in the elimination of machine downtime for training and repairs, allowing more billable use of the diagnostic machines. Some breakage, maintenance, and learning in the field is expected anyway.
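The tracking described above can be sketched as a simple event log with per-user aggregation. A minimal sketch; the event shape and report fields are illustrative, not our actual backend schema:

```python
from collections import defaultdict

def build_report(events):
    """Aggregate recorded (user, action, success) events into per-user stats,
    the raw material for trainee reports and the manager's overview."""
    stats = defaultdict(lambda: {"actions": 0, "errors": 0})
    for user, action, success in events:
        stats[user]["actions"] += 1
        if not success:
            stats[user]["errors"] += 1
    return {u: dict(s) for u, s in stats.items()}
```

A real pipeline would also record timestamps and module context so that time-on-task and retry patterns feed into the machine-readiness assessment.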

The above is just a teaser of what can be. The possibilities are endless. Real-time raytracing, ECS workflows, and AI/ML assistance are on our list of capabilities 'under construction', all geared to address enterprise needs of efficiency and net value addition.

So, are you ready to go?

Co-author: Alpa Bansal


  • https://www.glassbeam.com/blog/how-much-does-medical-equipment-downtime-cost-hospitals
  • https://www.iitsec.org/-/media/sites/iitsec/link-attachments/agenda-details/10abstracts2.ashx?la=en
  • https://www.interplaylearning.com/hubfs/Blog/Case%20Studies/Development%20and%20Analysis%20of%20VR%20Technician%20Training%20Platform%20and%20Methods.pdf