Dr. Joshua Bederson

Augmented reality microscope gives neurosurgeon the accuracy of a top gun pilot

November 15, 2016
by David Dennis, Contributing Reporter
Last week Mount Sinai Health System announced the first use of new imaging technology that combines virtual reality with the eyepiece of a microscope for neurosurgery. The CaptiView image system, integrated with Brainlab Cranial 3.1 Navigation Software and a Leica M530 OH6 microscope, allows the surgeon to view live and pre-operative anatomical images simultaneously, or in any combination the surgeon selects by toggling controls with the hands or feet.

Dr. Joshua Bederson, an expert in skull-base and cerebrovascular surgery who has performed more than 3,600 neurosurgical operations at Mount Sinai, was involved in the development of this technology and was the first surgeon to deploy it in the operating room. HCB News spoke with Dr. Bederson about the impact of this advance in surgical navigation technology.

HCB News: These developments have been described as GPS or Pokémon Go for brain surgery. Do you consider these to be apt comparisons?

Joshua Bederson: What we first do is create a virtual world. Think of the movie Avatar. If you combine a CT scan, an angiogram, and different sequences of an MRI scan, you will create a virtual rendition of the patient in which you can vary the transparency of the brain structures.

Then we link that virtual map to the patient’s actual anatomy. So we have an Avatar-like 3D environment that is linked in real time to the anatomy of the patient, sort of like a GPS read-out superimposed onto the “geography” of the brain.
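To make that linking step concrete, here is a minimal, hypothetical sketch of how points from a pre-operative scan might be mapped into tracked patient space with a rigid transform. It is not the Brainlab or CaptiView implementation; the rotation and translation values and the function name are invented for illustration.

```python
import numpy as np

# Hypothetical registration result: a rotation R and translation t (in mm)
# that map pre-operative scan coordinates into the tracked patient space.
R = np.eye(3)                       # identity rotation, for simplicity
t = np.array([12.0, -4.5, 30.0])    # made-up translation, in millimetres

def to_patient_space(points_preop):
    """Map an N x 3 array of pre-operative points (mm) into patient space."""
    return points_preop @ R.T + t

# Example: two points outlining a structure in the pre-operative scan.
outline_preop = np.array([[10.0, 20.0, 30.0],
                          [11.0, 21.0, 29.0]])
outline_patient = to_patient_space(outline_preop)
```

In a real navigation system the transform comes from registering the scans to the patient and is updated as the microscope and patient are tracked; the sketch only shows the coordinate mapping itself.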

Then, we “inject” all of these elements into a heads-up display that is superimposed over what we are actually looking at through the microscope. You can think of this as similar to the heads-up display a fighter pilot sees in the glass of his windshield as he is trying to land a fighter jet on an aircraft carrier. All he really sees is a dot in the distance, but given the heads-up display he can see the approach pathways through the clouds, and so on.

So, the GPS is tied together with an Avatar-like virtual reality environment and all that is projected, on command, into the eyepiece, layered over the surgeon’s vision of the patient’s anatomy.
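As a rough illustration of the “injection” step, the sketch below alpha-blends a registered virtual overlay onto a live image, with the transparency standing in for the surgeon toggling how much of the virtual layer is shown. It is a conceptual stand-in, not the actual microscope pipeline, and the data are placeholders.

```python
import numpy as np

def inject_overlay(live_frame, overlay, alpha):
    """Blend a co-registered virtual overlay onto the live view.

    live_frame, overlay: H x W x 3 arrays with values in [0, 1].
    alpha: 0.0 shows only the live view, 1.0 shows only the overlay.
    """
    return (1.0 - alpha) * live_frame + alpha * overlay

# Hypothetical usage with placeholder data.
live = np.random.rand(480, 640, 3)      # stand-in for the live optical image
vessels = np.zeros_like(live)
vessels[200:220, :, 0] = 1.0            # a fake "vessel" drawn in red
blended = inject_overlay(live, vessels, alpha=0.3)
```

Changing the blend on command is the software analogue of the hand and foot controls Dr. Bederson describes for selecting what appears in the eyepiece.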

HCB News: Does the “heads-up display” make that much of a difference to the surgeon and the outcome of the procedure?

JB: It does! Getting back to the pilot analogy: as a pilot, you learn very early on that the first thing you have to do is “fly the plane.” All of the instrumentation brings critical information into the pilot’s awareness, but you have to shift your attention away from “flying the plane” to get that information.

When trained to scan instruments properly, you maintain a comprehensive understanding of where you are and what is happening. The same is true of this new surgical technology. It allows you to keep “flying the plane,” or operating, instead of having to stop the surgery, look over at your scans and simulations, and then start operating again.

In the following video Dr. Bederson describes the advantages of the technology and how it may benefit both physicians and their patients:



HCB News: So anything you can do to eliminate unnecessary movement is pivotal?

JB: That’s right. When you settle in on the armrest pad for a procedure, you are required to remain in a “yogic” position for three to four hours at a time.



So the foot pedals and the toggle switches on the new system contribute to that. The left foot pedal has about 16 different switches and a joystick that controls the direction, zoom, and depth of the microscope. Your right foot operates four or five different controls. Of course, each hand has its own controls, and even your mouth controls something.

You have to be ready to activate all of those, with all of your limbs, at any moment; otherwise you lose valuable seconds. This is why access to information through the augmented eyepiece is important. You are already firing on all four burners, so if you have to stop to consider “outside” information and then return and regain situational awareness, going back and forth between the “GPS map” and what you are doing, it is exhausting and distracting.

HCB News: Since you have been involved in the development of this technology, what can you say about its health care business prospects?

JB: With the craze of Pokémon Go and platforms like Oculus, we are seeing a growing awareness of the practical applications of virtual and augmented reality. What we are doing in the operating room is augmentation. That is clearly becoming mainstream.

Ten or fifteen years ago, when surgical navigation was just coming into the field, you had to be a devotee and a real believer to make it work. But it rapidly permeated the field, and now it is a standard of care; we cannot imagine doing many of our procedures without it.

Augmented reality, automation, and the digital operating room are with us. Not everyone is going to be able to have it initially, but in my opinion, everyone is going to have it in a very short time. This is quickly going to become a standard of care.