Augmented Reality Goggles with an Integrated Tracking System for Navigation in Neurosurgery

Ehsan Azimi (1), Jayfus Doswell (2), Peter Kazanzides (1)

(1) Dept. of Computer Science, Johns Hopkins University
(2) Juxtopia, LLC

ABSTRACT

Precise tumor identification is crucial in image-guided neurosurgical procedures. With existing navigation systems, the surgeon must turn away from the patient to view the imaging data on a separate monitor. In this study, an innovative system is introduced that displays the tumor boundaries precisely augmented on the location of the tumor with respect to the patient. Additionally, it allows the surgeon to track the distal end of the tools contextually, where direct visualization is not possible. In this approach, the tracking system is compact and worn by the surgeon, eliminating the need for additional devices that are bulky and typically limited by line-of-sight constraints.

KEYWORDS: Augmented reality, HMD, neurosurgery, surgical navigation

`1
`
`INTRODUCTION
`
Surgical resection is one of the most common treatments for brain tumors. The treatment goal is to remove as much of the tumor as possible, while sparing the healthy tissue. Image guidance (e.g., with preoperative CT or MRI) is frequently used because it can more clearly differentiate diseased tissue from healthy tissue. Most image guidance devices contain special markers that can be easily detected by the tracker. Registering the tracker coordinate system to the preoperative image coordinate system gives the surgeon “x-ray vision.”

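The registration that links the tracker and image coordinate systems is typically computed from paired points. As a sketch of the general technique (not necessarily the authors' implementation), the closed-form quaternion method of Horn recovers the rigid transform from corresponding 3-D points; here a shifted power iteration stands in for a general eigensolver:

```cpp
#include <array>
#include <cmath>
#include <vector>

using Vec3 = std::array<double, 3>;
using Mat3 = std::array<Vec3, 3>;

struct RigidTransform { Mat3 R; Vec3 t; };  // b ~= R*a + t

// Paired-point rigid registration (Horn's quaternion method):
// finds R, t minimizing sum_i ||R*a[i] + t - b[i]||^2.
RigidTransform registerPoints(const std::vector<Vec3>& a,
                              const std::vector<Vec3>& b) {
    const double n = static_cast<double>(a.size());
    Vec3 ca{0, 0, 0}, cb{0, 0, 0};  // centroids of the two point sets
    for (size_t i = 0; i < a.size(); ++i)
        for (int k = 0; k < 3; ++k) { ca[k] += a[i][k] / n; cb[k] += b[i][k] / n; }

    double S[3][3] = {};  // cross-covariance of the centered point sets
    for (size_t i = 0; i < a.size(); ++i)
        for (int r = 0; r < 3; ++r)
            for (int c = 0; c < 3; ++c)
                S[r][c] += (a[i][r] - ca[r]) * (b[i][c] - cb[c]);

    // Horn's symmetric 4x4 matrix; its top eigenvector is the rotation quaternion.
    double N[4][4] = {
        {S[0][0]+S[1][1]+S[2][2], S[1][2]-S[2][1],          S[2][0]-S[0][2],          S[0][1]-S[1][0]},
        {S[1][2]-S[2][1],         S[0][0]-S[1][1]-S[2][2],  S[0][1]+S[1][0],          S[2][0]+S[0][2]},
        {S[2][0]-S[0][2],         S[0][1]+S[1][0],         -S[0][0]+S[1][1]-S[2][2],  S[1][2]+S[2][1]},
        {S[0][1]-S[1][0],         S[2][0]+S[0][2],          S[1][2]+S[2][1],         -S[0][0]-S[1][1]+S[2][2]}};

    // Shifted power iteration: adding shift*I makes the most positive
    // eigenvalue dominant, so the iteration converges to its eigenvector.
    double shift = 0;
    for (auto& row : N) for (double v : row) shift += v * v;
    shift = std::sqrt(shift);
    double q[4] = {1, 0, 0, 0};
    for (int it = 0; it < 1000; ++it) {
        double y[4] = {};
        for (int r = 0; r < 4; ++r) {
            for (int k = 0; k < 4; ++k) y[r] += N[r][k] * q[k];
            y[r] += shift * q[r];
        }
        double norm = std::sqrt(y[0]*y[0] + y[1]*y[1] + y[2]*y[2] + y[3]*y[3]);
        for (int r = 0; r < 4; ++r) q[r] = y[r] / norm;
    }

    // Convert the unit quaternion (w, x, y, z) to a rotation matrix.
    const double w = q[0], x = q[1], y = q[2], z = q[3];
    Mat3 R = {{{1-2*(y*y+z*z), 2*(x*y-w*z),   2*(x*z+w*y)},
               {2*(x*y+w*z),   1-2*(x*x+z*z), 2*(y*z-w*x)},
               {2*(x*z-w*y),   2*(y*z+w*x),   1-2*(x*x+y*y)}}};
    Vec3 t;  // translation aligns the rotated centroid of a with that of b
    for (int r = 0; r < 3; ++r)
        t[r] = cb[r] - (R[r][0]*ca[0] + R[r][1]*ca[1] + R[r][2]*ca[2]);
    return {R, t};
}
```

With noise-free correspondences this recovers the transform exactly; with noisy touched points it gives the least-squares fit.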
It can, however, be challenging to effectively use a navigation system because the presented information is not physically co-located with the operative field, requiring the surgeon to look at a computer monitor rather than at the patient. This is especially awkward when the surgeon wishes to move an instrument within the patient while observing the display. Such ergonomic issues may increase operating times, fatigue, and the risk of errors. Furthermore, most navigation systems employ optical tracking, due to its high accuracy, but this requires line-of-sight between the cameras and the markers in the operative field, which can be difficult to maintain during the surgery.

(1) 3400 N. Charles St., Baltimore MD, {pkaz, eazimi1}@jhu.edu
(2) 1101 E 33rd St, B304, Baltimore MD, jayfus@juxtopia.com

IEEE Virtual Reality 2012, 4-8 March, Orange County, CA, USA
978-1-4673-1246-2/12/$31.00 ©2012 IEEE

After observations of surgeries, particularly neurosurgeries, and discussions with surgeons, we identified a need to overlay a tumor margin (boundary) on the surgeon’s view of the anatomy. It was also desired to correctly track and align the distal end of the surgical instruments with the preoperative medical images. The aim of this paper is to investigate the feasibility of implementing a head-mounted tracking system with an augmented reality environment that provides the surgeon with visualization of both the tumor margin and the surgical instrument, in order to create a more accurate and natural overlay of the affected tissue versus healthy tissue. It allows the surgeon to see the precise boundaries of the tumor during neurosurgical procedures, while at the same time providing a contextual overlay of the surgical tools intraoperatively, displayed on optical see-through goggles worn by the surgeon. This application is well suited to augmented reality, as the overlay provides the most pertinent information without unduly cluttering the visual field. The system provides the benefits of navigation and visualization, along with the capabilities of the existing modalities, and is expected to be comfortable and intuitive for the surgeon.

The majority of related research has focused on augmented reality visualization with HMDs, usually adopting video see-through designs. Many of these systems have integrated one or more on-board camera subsystems to help determine head pose [1,2,3], and some have added inertial sensing to improve this estimate via sensor fusion [4,5,6]. None of these systems, however, attempts to provide a complete tracking system; they continue to rely on external trackers. Other researchers [7] have implemented a video see-through augmented reality system that also includes a magnification capability, using both internal and external tracking systems. The internal tracking system uses a single camera to provide orientation. The external tracking system has four cameras and tracks a pointer, a reference frame, and the display position.

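As a rough sketch of why such inertial/optical fusion helps, a one-dimensional constant-velocity Kalman filter can propagate the pose estimate from high-rate inertial data between low-rate optical fixes. All noise values and rates here are illustrative, not taken from the cited systems:

```cpp
#include <cmath>

// 1-D constant-velocity Kalman filter: inertial (acceleration) input drives
// the high-rate prediction; low-rate optical position fixes correct drift.
struct Kalman1D {
    double x = 0.0, v = 0.0;                    // position and velocity estimate
    double P[2][2] = {{1.0, 0.0}, {0.0, 1.0}};  // estimate covariance
    double q = 0.1;                             // process noise density (illustrative)
    double r = 1e-4;                            // optical measurement variance (illustrative)

    void predict(double accel, double dt) {     // e.g., 100 Hz inertial samples
        x += v * dt + 0.5 * accel * dt * dt;
        v += accel * dt;
        // P = F P F^T + Q, with F = [1 dt; 0 1] and simple Q = q*dt*I.
        const double p00 = P[0][0], p01 = P[0][1], p10 = P[1][0], p11 = P[1][1];
        P[0][0] = p00 + dt * (p01 + p10) + dt * dt * p11 + q * dt;
        P[0][1] = p01 + dt * p11;
        P[1][0] = p10 + dt * p11;
        P[1][1] = p11 + q * dt;
    }

    void correct(double zPos) {                 // e.g., 20 Hz optical fixes
        const double S = P[0][0] + r;           // innovation variance (H = [1 0])
        const double K0 = P[0][0] / S, K1 = P[1][0] / S;  // Kalman gain
        const double innov = zPos - x;
        x += K0 * innov;
        v += K1 * innov;
        // P = (I - K H) P
        const double p00 = P[0][0], p01 = P[0][1];
        P[0][0] -= K0 * p00;  P[0][1] -= K0 * p01;
        P[1][0] -= K1 * p00;  P[1][1] -= K1 * p01;
    }
};
```

Between optical frames the filter coasts on the inertial prediction, which is what masks a low optical update rate and brief marker occlusions.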
2 SYSTEM DESCRIPTION AND METHODS

2.1 System Description
This section provides an overview of the prototype setup. As illustrated in Figure 1, the user wears the optical see-through goggles (Juxtopia LLC, Baltimore, MD) and a helmet that supports a compact optical tracking system (MicronTracker, Claron Technology, Toronto, Canada). The first step is a registration procedure (Figure 1), in which the surgeon uses a tracked probe to touch markers that were affixed to the patient prior to the preoperative imaging. A paired-point rigid registration technique computes the transformation that aligns the preoperative data (i.e., the tumor outline) to the real world.

Figure 1. User wearing the optical see-through goggles and head-mounted tracker. The figure shows the registration procedure, in which the surgeon touches markers using a tracked probe.

After registration, the surgeon can see a registered preoperative model overlaid on the real anatomy using the optical see-through goggles (Figure 2, which illustrates the concept with a skull model rather than the tumor margin that would be used in an actual surgery).

2.2 Software Design

The software is written in C++, using a component-based architecture. It has three main interconnected components: 1) tracking module, 2) registration module, and 3) 3D graphical rendering. The 3D graphical rendering component uses the Visualization Toolkit (VTK). The transformation matrix obtained from the registration step is applied to the preoperative model in order to represent a correctly-posed scene in the tracker reference frame (Figure 2). The model is rendered after defining a VTK actor and camera. Based on the tracker information, the relative position and orientation between the MicronTracker and the tracked object (i.e., the skull) is known. We then reorient the VTK camera by adjusting its parameters to put it in the same relative position to the actor as the tracker is with respect to the skull. This is achieved by calculating the focal point, position, view angle, and view-up direction of the camera. The resulting image is illustrated in Figure 2.

Figure 2. Graphical display. The right image shows real video output from one MicronTracker camera; the overlaid model of the skull, after registration, is depicted on the left.

2.3 Calibration

Since the position, orientation, and view angle of the user's eyes differ from those of the tracker, an additional calibration step is necessary to correctly reorient the image on the goggles' display. This step differs from user to user, and therefore a modified version of the SPAAM technique [8] is implemented so that the user can complete this subjective task.

3 CONCLUSIONS AND FUTURE WORK

To the best of our knowledge, this is the first time that a head-mounted tracking, registration, and display system has been integrated for surgical navigation. This reduces the line-of-sight problem (because the tracker’s line of sight is the same as the surgeon’s), the difficulty of associating preoperative images, and the bulkiness of current systems.

This was a limited study intended to demonstrate the feasibility of the approach. The tracking system, which works by processing the markers, currently provides 20 Hz update rates and cannot keep up with sudden head movements. To overcome this problem, we plan to add inertial sensing (e.g., accelerometers, gyroscopes) to the tracking system, both as a backup and as concurrent sensors, using a Kalman filter for sensor fusion to improve robustness with respect to marker occlusions and to compensate for the delay of the optical tracking system.

In the future, the system should include magnification to add the capability of existing pure optical systems (surgical loupes). We are also considering an eye tracking system for more accurate overlay positioning.

ACKNOWLEDGMENTS

We thank Dr. George Jallo for his clinical guidance and Kamini Balaji and David A. Bosset for their technical contributions. This work was supported by NSF IIP-0646587.

References

[1] H. Fuchs, M. Livingston, R. Raskar, D. Colucci, K. Keller, A. State, J. Crawford, P. Rademacher, S. Drake, A. Meyer, “Augmented reality visualization for laparoscopic surgery,” MICCAI, pp. 934-943, 1998.
[2] W. Hoff, T. Vincent, “Analysis of head pose accuracy in augmented reality,” IEEE Trans. Visualization and Computer Graphics, Vol. 6, 2000.
[3] F. Sauer, F. Wenzel, S. Vogt, Y. Tao, Y. Genc, A. Bani-Hashemi, “Augmented Workspace: designing an AR testbed,” Proc. IEEE Intl. Symp. on Augmented Reality (ISAR), 2000.
[4] R. Azuma, G. Bishop, “Improving Static and Dynamic Registration in an Optical See-through HMD,” SIGGRAPH, 1994.
[5] S. You, U. Neumann, R. Azuma, “Hybrid Inertial and Vision Tracking for Augmented Reality Registration,” Proc. IEEE Conf. on Virtual Reality, pp. 260-267, Houston, TX, March 1999.
[6] L. Chai, W. A. Hoff, T. Vincent, “Three-Dimensional Motion and Structure Estimation Using Inertial Sensors and Computer Vision for Augmented Reality,” Presence, 11(5):474-492, Oct 2002.
[7] A. Martin-Gonzalez, S. Heining, N. Navab, “Head-mounted virtual loupe with sight-based activation for surgical applications,” ISMAR, pp. 207-208, 2009.
[8] M. Tuceryan, N. Navab, “Single-Point Active Alignment Method (SPAAM) for Optical See-Through HMD Calibration for Augmented Reality,” Presence: Teleoperators and Virtual Environments, Vol. 11, No. 3, pp. 259-276, June 2002.