Journal of Biomedical Informatics 55 (2015) 124–131
Development of a surgical navigation system based on augmented reality using an optical see-through head-mounted display

Xiaojun Chen a,*, Lu Xu a, Yiping Wang a, Huixiang Wang b, Fang Wang b, Xiangsen Zeng b, Qiugen Wang b, Jan Egger c

a Institute of Biomedical Manufacturing and Life Quality Engineering, State Key Laboratory of Mechanical System and Vibration, School of Mechanical Engineering, Shanghai Jiao Tong University, Shanghai, China
b Shanghai First People's Hospital Affiliated to Shanghai Jiao Tong University School of Medicine, Shanghai, China
c Faculty of Computer Science and Biomedical Engineering, Institute for Computer Graphics and Vision, Graz University of Technology, Graz, Austria
Article history:
Received 24 August 2014
Revised 20 March 2015
Accepted 9 April 2015
Available online 13 April 2015

Keywords:
Surgical navigation
Augmented reality
Optical see-through HMD
Intra-operative motion tracking
Abstract

Surgical navigation systems have experienced tremendous development over the past decades, minimizing the risks and improving the precision of surgery. Nowadays, Augmented Reality (AR)-based surgical navigation is a promising technology for clinical applications. In an AR system, virtual and actual reality are mixed, offering real-time, high-quality visualization of an extensive variety of information to the users (Moussa et al., 2012) [1]. For example, virtual anatomical structures such as soft tissues, blood vessels and nerves can be integrated with the real-world scenario in real time. In this study, an AR-based surgical navigation system (AR-SNS) is developed using an optical see-through HMD (head-mounted display), aiming at improving the safety and reliability of surgery. With the use of this system, including the calibration of instruments, registration, and the calibration of the HMD, the 3D virtual critical anatomical structures in the head-mounted display are aligned with the actual structures of the patient in the real-world scenario during the intra-operative motion-tracking process. The accuracy verification experiment demonstrated that the mean distance and angular errors were 0.809 ± 0.05 mm and 1.038° ± 0.05°, respectively, which is sufficient to meet clinical requirements.

© 2015 Elsevier Inc. All rights reserved.
1. Introduction

During the past decades, computer-aided navigation systems have experienced tremendous development for minimizing the risks and improving the precision of surgery [2]. Nowadays, some commercially available surgical navigation systems have already been tested and approved for clinical applications, such as eNLight and NavSuite (Stryker Corporation, USA), Portable Nanostation (Praxim, France), and MATRIX POLAR (Scopis medical/XION, Germany). Meanwhile, many research groups have also presented their systems in the literature, for example, TUSS (Queen's University, Canada), VISIT (University of Vienna, Austria), IGOIS (Shanghai Jiao Tong University, China), etc. [3–7]. However, all of these systems use a computer screen to render the navigation information, such as the real-time position and orientation of the surgical instrument and the virtual path from the preoperative surgical planning, so that the surgeon has to switch between the actual operation site and the computer screen, which is inconvenient and impairs the continuity of surgery.

* Corresponding author at: Room 805, School of Mechanical Engineering, Shanghai Jiao Tong University, Dongchuan Road 800, Minhang District, Shanghai 200240, China. Tel.: +86 13472889728, +86 21 34204851; fax: +86 21 34206847. E-mail address: xiaojunchen@163.com (X. Chen).

http://dx.doi.org/10.1016/j.jbi.2015.04.003
1532-0464/© 2015 Elsevier Inc. All rights reserved.
In recent years, owing to the great development of Augmented Reality (AR) technology, more and more wearable AR devices have appeared, such as Google Glass and the Skully AR-1 (an AR motorcycle helmet) [8]. AR is an integrated technique of image processing: in an AR system, real objects and virtual (computer-generated) objects are combined in a real environment; furthermore, the real and virtual objects are aligned with each other and run interactively in real time [1,9,10]. Owing to the advantages of AR visualization, developing a surgical navigation system based on AR is a significant challenge for the next generation. For example, after the registration of the preoperative CT in relation to the intra-operative realistic scene, surgeons can superimpose the virtual CT data onto the patient's anatomy [11]. In 2010, Liao et al. [12,13] developed a 3-D augmented reality navigation system for MRI-guided surgery using auto-stereoscopic images; the system creates a 3D image, fixed in space, which is independent of viewer pose. In addition, Navab et al. [14] from the Technical University of Munich have demonstrated a very practical
application of AR: an X-ray C-arm system equipped with a video camera, which produces a fused image combining a direct video view of a patient's elbow with the registered X-ray image of the humerus, radius, and ulna.

This study presents an AR-based surgical navigation system (AR-SNS) using an optical see-through HMD (head-mounted display), which encompasses preoperative surgical planning, registration, and intraoperative tracking. With the aid of the AR-SNS, a surgeon wearing the HMD can obtain a fused image in which virtual anatomical structures such as soft tissues, blood vessels and nerves are integrated with the intra-operative real-world scenario, so that the safety and reliability of the surgery can be improved.
2. Materials and methods

2.1. The hardware architecture of AR-SNS

The AR-SNS is built around a high-performance graphical workstation (HP), a 2D LCD monitor (G2200W, BenQ), an optical tracking device (Polaris Vicra, NDI Inc., Canada) and an optical see-through HMD (nVisor ST60, NVIS, United States), as shown in Fig. 1. The workstation is equipped with 4 GB of memory, a Core i7 CPU and an NVIDIA Quadro FX4800 graphics card, running the Windows 7 operating system. The HMD uses high-resolution microdisplays featuring 1280 × 1024 24-bit color pixels per eye, for vivid visual rendering and integration with reality.
2.2. The software framework of AR-SNS

The AR-SNS is developed in the Integrated Development Environment (IDE) of Visual Studio 2008. All of the functions are programmed in Microsoft Visual C++, and several well-known toolkits are also involved and integrated into the AR-SNS, such as the Visualization Toolkit (VTK, an open-source, freely available software system for 3D computer graphics, image processing, and visualization, http://www.vtk.org/), CTK, ITK, IGSTK and Qt.
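To illustrate how these toolkits fit together, the following is a minimal sketch of the typical Qt/VTK glue of that era, embedding a VTK render window in a Qt application through the QVTKWidget class that shipped with VTK at the time. It is an illustrative assumption about the integration pattern, not the authors' actual code.

```cpp
// Minimal sketch (assumed, not from the paper): a Qt application hosting a
// VTK 3D view, the usual glue between Qt and VTK in a system like AR-SNS.
#include <QApplication>
#include <QVTKWidget.h>
#include <vtkSmartPointer.h>
#include <vtkRenderer.h>
#include <vtkRenderWindow.h>

int main(int argc, char** argv)
{
    QApplication app(argc, argv);

    // QVTKWidget wraps a vtkRenderWindow inside a Qt widget.
    QVTKWidget widget;

    // A renderer draws the 3D scene (e.g., the reconstructed anatomical models).
    vtkSmartPointer<vtkRenderer> renderer = vtkSmartPointer<vtkRenderer>::New();
    renderer->SetBackground(0.0, 0.0, 0.0); // black background, as rendered in the HMD

    widget.GetRenderWindow()->AddRenderer(renderer);
    widget.resize(800, 600);
    widget.show();

    return app.exec();
}
```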
Fig. 2 shows the framework of the AR-SNS, which is described as follows: on the basis of the preoperative CT data of a patient, image segmentation is conducted so that 3D models including hard and soft tissues, and especially critical anatomical structures such as blood vessels and nerves, can be reconstructed. After 3D reconstruction, preoperative planning is implemented so that an optimized osteotomy trajectory can be obtained. Then, with the support of the optical tracking device, the calibration of the surgical instruments is performed, and the point-to-point registration [15–17] and surface matching [18,19] methods are used to determine the spatial relationship between the virtual coordinate system (VCS, referring to the computer screen coordinate system) and the real coordinate system (RCS, referring to the patient coordinate system) [2]. In addition, an optical see-through head-mounted display is adopted so that an immersive augmented reality environment can be obtained and the virtual tissue can be integrated with the direct view. Finally, after calibration of the patient's position in relation to the HMD, the position and orientation of the virtual model change in correspondence with the movements of the HMD and the patient, and match the real anatomical structures during the intra-operative navigation process, so that the preoperative plan rendered in the HMD can be transferred to the real operation site.
2.3. 3D-reconstruction and preoperative surgical planning

Based on the original CT data, the segmentation of the hard tissue is conducted using a combined thresholding and region-growing method; for the soft tissue in each image, a semi-automatic region-growing method is adopted, and if a region is over-segmented or under-segmented, manual modification is also applied. Then, 3D surface models can be reconstructed through the marching cubes algorithm [20]. Fig. 3 shows a 3D pelvis model and a bladder imported into the AR-SNS after the 3D-reconstruction. All of this work, including the image segmentation and 3D modeling, is realized within the AR-SNS itself.
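The reconstruction steps listed in Fig. 2 (thresholding, marching cubes, mesh decimation and low-pass smoothing) map directly onto standard VTK filters. The following is a minimal sketch under the assumption that VTK's stock filters stand in for the authors' implementation; the file names and threshold values are placeholders.

```cpp
// Hedged sketch of the reconstruction pipeline from Fig. 2 using stock VTK
// filters. File names and threshold values are illustrative placeholders.
#include <vtkSmartPointer.h>
#include <vtkMetaImageReader.h>
#include <vtkImageThreshold.h>
#include <vtkMarchingCubes.h>
#include <vtkWindowedSincPolyDataFilter.h>
#include <vtkDecimatePro.h>
#include <vtkPolyDataWriter.h>

int main()
{
    // Load the CT volume (placeholder file name).
    vtkSmartPointer<vtkMetaImageReader> reader =
        vtkSmartPointer<vtkMetaImageReader>::New();
    reader->SetFileName("pelvis_ct.mhd");

    // Binarize hard tissue with a Hounsfield-unit threshold (placeholder range).
    vtkSmartPointer<vtkImageThreshold> threshold =
        vtkSmartPointer<vtkImageThreshold>::New();
    threshold->SetInputConnection(reader->GetOutputPort());
    threshold->ThresholdBetween(300, 3000);
    threshold->SetInValue(1);
    threshold->SetOutValue(0);

    // Extract the iso-surface with the marching cubes algorithm [20].
    vtkSmartPointer<vtkMarchingCubes> cubes =
        vtkSmartPointer<vtkMarchingCubes>::New();
    cubes->SetInputConnection(threshold->GetOutputPort());
    cubes->SetValue(0, 0.5);

    // Low-pass (windowed sinc) smoothing of the triangle mesh.
    vtkSmartPointer<vtkWindowedSincPolyDataFilter> smoother =
        vtkSmartPointer<vtkWindowedSincPolyDataFilter>::New();
    smoother->SetInputConnection(cubes->GetOutputPort());
    smoother->SetNumberOfIterations(20);

    // Decimate the mesh to reduce the triangle count for real-time rendering.
    vtkSmartPointer<vtkDecimatePro> decimator =
        vtkSmartPointer<vtkDecimatePro>::New();
    decimator->SetInputConnection(smoother->GetOutputPort());
    decimator->SetTargetReduction(0.5); // keep roughly half of the triangles

    vtkSmartPointer<vtkPolyDataWriter> writer =
        vtkSmartPointer<vtkPolyDataWriter>::New();
    writer->SetInputConnection(decimator->GetOutputPort());
    writer->SetFileName("pelvis_surface.vtk");
    writer->Write();
    return 0;
}
```

In such a pipeline, the decimation ratio trades mesh fidelity against real-time rendering performance in the HMD.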
Fig. 1. The hardware of the AR-SNS (optical tracking device and optical see-through head-mounted display).
Fig. 2. The framework of the AR-SNS: load preoperative CT data → image segmentation (thresholding, region growing, manual modification) → 3D-reconstruction (marching cubes, decimation of meshes, standard signal processing with low-pass filters and Laplacian smoothing) → preoperative planning (multiple plane reconstruction, 3D geometrical measurements, plane cutting) → connect optical tracking device (tracker initialization, tracker configuration) → calibration of the surgical instruments (pivot calibration, axis calibration) → registration (point-to-point SVD, surface matching ICP) → calibration of the patient's position in relation to the HMD.
Fig. 3. A 3D pelvis model and a bladder imported into the AR-SNS (labeled elements: 3D titanium miniscrews, virtual drill trajectory, 3D bladder model; toolkits: VTK, ITK, IGSTK, QT).
On the basis of these CT images and the 3D model, the procedure of preoperative planning can be implemented, and it is described as follows: percutaneous implantation of a sacroiliac joint screw is a very common surgery in orthopedics, and in order to avoid injuring important anatomical structures such as soft tissues, blood vessels and nerves, a virtual path for the surgical drill should be created and rendered on all of the 2D/3D views, so that the precision, safety and reliability of the implant surgery can be enhanced. An example of such a preoperative surgical plan is shown in Fig. 4.
2.4. Registration
Fig. 4. A virtual path for the surgical drill rendered on all of the 2D/3D views.
Fig. 5. The establishment of coordinate systems in the AR-SNS (world coordinate system, optical see-through HMD, reference frames, and the pelvis model).
Since the Polaris Vicra optical tracking device only localizes reference frames with mounted sphere-shaped retro-reflective markers, the spatial relationship between the surgical instrument and its mounted reference frame must be determined first (a procedure also called "calibration"), so that the movements of the reference frame can represent those of the surgical instrument [21]. In the AR-SNS, the pivot calibration approach is adopted, and the Image-Guided Surgery Toolkit (IGSTK, an open-source C++ software library) provides classes for performing it [22]; for the detailed principle of this calibration, please refer to Ref. [21].
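The idea behind pivot calibration is that, while the tool pivots about a fixed point, every tracked pose (R_i, t_i) of its reference frame satisfies R_i p_tip + t_i = p_pivot, which stacks into an overdetermined linear system. The sketch below solves this system by least squares using Eigen; it is an illustrative reconstruction of the principle (the AR-SNS itself uses IGSTK's calibration classes), and all names are placeholders.

```cpp
// Hedged sketch of pivot calibration as a linear least-squares problem.
// Not the authors' implementation (AR-SNS uses IGSTK's calibration classes).
#include <vector>
#include <Eigen/Dense>

struct Pose {
    Eigen::Matrix3d R; // rotation of the reference frame in tracker coordinates
    Eigen::Vector3d t; // translation of the reference frame
};

// Solves [R_i  -I] * [p_tip; p_pivot] = -t_i over all poses i.
// Returns p_tip (tool-tip offset in reference-frame coordinates);
// 'pivot' receives the fixed pivot point in tracker coordinates.
Eigen::Vector3d pivotCalibration(const std::vector<Pose>& poses,
                                 Eigen::Vector3d& pivot)
{
    const int n = static_cast<int>(poses.size());
    Eigen::MatrixXd A(3 * n, 6);
    Eigen::VectorXd b(3 * n);

    for (int i = 0; i < n; ++i) {
        A.block<3, 3>(3 * i, 0) = poses[i].R;
        A.block<3, 3>(3 * i, 3) = -Eigen::Matrix3d::Identity();
        b.segment<3>(3 * i) = -poses[i].t;
    }

    // Least-squares solution of the overdetermined system.
    Eigen::VectorXd x = A.colPivHouseholderQr().solve(b);
    pivot = x.tail<3>();
    return x.head<3>();
}
```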
The procedure of registration is an essential component of all computer-aided navigation systems: it brings two coordinate systems into spatial alignment [23]. In this study, the coordinate systems of the AR-SNS are established as shown in Fig. 5, and the VCS must be registered (aligned) with reference frame coordinate system 2 (also referred to as the patient coordinate system). A variety of registration methods, including point-to-point, surface-based, and template-based, have been proposed. Among them, the AR-SNS integrates fiducial point registration and surface registration together, so that the registration accuracy is higher than with either traditional method alone. For a detailed description of the algorithms involved in this procedure, please refer to Ref. [2].
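The two-stage registration named in Fig. 2 (point-to-point via SVD, then surface matching via ICP) can be sketched with standard VTK transform classes. The following is an illustrative reconstruction under the assumption that VTK's built-in transforms are acceptable stand-ins for the algorithms of Refs. [15–19]; it is not the authors' code.

```cpp
// Hedged sketch: two-stage rigid registration with VTK classes, mirroring
// the point-to-point (SVD) + surface-matching (ICP) pipeline of Fig. 2.
#include <vtkSmartPointer.h>
#include <vtkPoints.h>
#include <vtkPolyData.h>
#include <vtkLandmarkTransform.h>
#include <vtkIterativeClosestPointTransform.h>

// Stage 1: fiducial (point-to-point) registration. The VCS landmarks are
// picked on the virtual model; the RCS landmarks are collected with the
// tracked probe on the patient (or verification block).
vtkSmartPointer<vtkLandmarkTransform> fiducialRegistration(
    vtkPoints* virtualLandmarks, vtkPoints* patientLandmarks)
{
    vtkSmartPointer<vtkLandmarkTransform> t =
        vtkSmartPointer<vtkLandmarkTransform>::New();
    t->SetSourceLandmarks(virtualLandmarks);
    t->SetTargetLandmarks(patientLandmarks);
    t->SetModeToRigidBody(); // closed-form SVD solution, rigid (no scaling)
    t->Update();
    return t;
}

// Stage 2: ICP surface matching to refine the initial alignment, using a
// point cloud sampled on the real surface and the reconstructed 3D model.
vtkSmartPointer<vtkIterativeClosestPointTransform> surfaceRegistration(
    vtkPolyData* sampledPatientCloud, vtkPolyData* virtualSurface)
{
    vtkSmartPointer<vtkIterativeClosestPointTransform> icp =
        vtkSmartPointer<vtkIterativeClosestPointTransform>::New();
    icp->SetSource(sampledPatientCloud);
    icp->SetTarget(virtualSurface);
    icp->GetLandmarkTransform()->SetModeToRigidBody();
    icp->SetMaximumNumberOfIterations(100);
    icp->Modified();
    icp->Update();
    return icp;
}
```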
2.5. Calibration of the patient's position in relation to the HMD

After registration, the preoperative images in the HMD are aligned with the patient on the procedure table. Then, the calibration of the HMD needs to be done. In the AR-SNS, the basic requirement is the calibration of the patient's position in relation to the HMD, which means that movement of the patient or of the HMD will not affect the initial correspondence between the VCS and the RCS. The principle is described as follows: first of all, two reference frames are mounted on the HMD and on the patient, respectively. Then, in order to obtain
the calibration matrix after the initial registration of the patient's position in relation to the image data set, we built the transformation relationship shown in Fig. 5. Suppose:

R is the initial transformation matrix from the VCS to reference frame coordinate system 2;
A1 and A2 are the transformation matrices from reference frame coordinate system 1 to the World Coordinate System before and after movement of the user's head, respectively;
A3 and A4 are the transformation matrices from reference frame coordinate system 2 to the World Coordinate System before and after movement of the patient, respectively;
B1 is the matrix of the virtual model under the VCS before movement;
B2 is the calibration transformation matrix of the virtual model under the VCS.

Since the relative position of a point on the computer virtual image is fixed before and after movement, let X be the coordinate of this point in the computer virtual image coordinate system. Then X1 is the coordinate of X in the VCS (also referred to as the computer screen coordinate system) before movement, and X2 is the coordinate of X in the VCS after movement. Eqs. (1) and (2) give the transformation relations:

\[ R B_1 X = X_1 \tag{1} \]

\[ R B_2 X = X_2 \tag{2} \]

The purpose is to keep the relative position of this point in the patient coordinate system unchanged before and after movement, so let X3 be the coordinate of X in reference frame coordinate system 2. Eqs. (3) and (4) give the transformation relations before and after movement:

\[ A_3^{-1} A_1 X_1 = X_3 \tag{3} \]

\[ A_4^{-1} A_2 X_2 = X_3 \tag{4} \]

Thus, based on Eqs. (1)–(4):

\[ B_2 = \left( A_4^{-1} A_2 R \right)^{-1} A_3^{-1} A_1 R B_1 \tag{5} \]
In the Polaris optical tracking system, seven elements, representing the unit quaternion and the translation vector, are used to describe the transformation of a rigid body under the world coordinate system. Therefore, A1, A2, A3, and A4 can be calculated on the basis of each rigid body's seven elements provided by the Polaris, and R can be calculated through the point-based registration and surface registration under the patient coordinate system and the virtual coordinate system, respectively.
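Concretely, each set of seven elements (a unit quaternion q0, qx, qy, qz plus a translation tx, ty, tz) can be expanded into a homogeneous 4 × 4 matrix such as A1–A4 with the standard quaternion-to-rotation formula. A minimal sketch (illustrative, not the authors' code) follows.

```cpp
// Hedged sketch: converting the Polaris "seven elements" (unit quaternion
// plus translation) into a homogeneous 4x4 transformation matrix.
#include <vtkSmartPointer.h>
#include <vtkMatrix4x4.h>

vtkSmartPointer<vtkMatrix4x4> poseToMatrix(double q0, double qx, double qy,
                                           double qz, double tx, double ty,
                                           double tz)
{
    vtkSmartPointer<vtkMatrix4x4> m = vtkSmartPointer<vtkMatrix4x4>::New();
    m->Identity();

    // Standard unit-quaternion-to-rotation-matrix expansion.
    m->SetElement(0, 0, 1 - 2 * (qy * qy + qz * qz));
    m->SetElement(0, 1, 2 * (qx * qy - q0 * qz));
    m->SetElement(0, 2, 2 * (qx * qz + q0 * qy));
    m->SetElement(1, 0, 2 * (qx * qy + q0 * qz));
    m->SetElement(1, 1, 1 - 2 * (qx * qx + qz * qz));
    m->SetElement(1, 2, 2 * (qy * qz - q0 * qx));
    m->SetElement(2, 0, 2 * (qx * qz - q0 * qy));
    m->SetElement(2, 1, 2 * (qy * qz + q0 * qx));
    m->SetElement(2, 2, 1 - 2 * (qx * qx + qy * qy));

    // Translation part of the homogeneous matrix.
    m->SetElement(0, 3, tx);
    m->SetElement(1, 3, ty);
    m->SetElement(2, 3, tz);
    return m;
}
```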
Since A1, A2, A3, A4, R, and B1 (which can be taken as an identity matrix) are all known, the calibration transformation matrix B2 can be calculated according to Eq. (5). As a result, the position and orientation of the virtual model change in correspondence with the movements of the HMD and the patient, and match the real anatomical structures during the intra-operative navigation procedure.
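Eq. (5) is a direct chain of 4 × 4 matrix products and inversions, so it maps naturally onto VTK's matrix utilities. The following is a minimal sketch, assuming all transforms have already been converted to vtkMatrix4x4 objects; it is illustrative rather than the authors' implementation.

```cpp
// Hedged sketch of Eq. (5): B2 = (A4^-1 * A2 * R)^-1 * A3^-1 * A1 * R * B1,
// computed with VTK's static 4x4 matrix utilities.
#include <vtkSmartPointer.h>
#include <vtkMatrix4x4.h>

vtkSmartPointer<vtkMatrix4x4> computeB2(
    vtkMatrix4x4* A1, vtkMatrix4x4* A2,
    vtkMatrix4x4* A3, vtkMatrix4x4* A4,
    vtkMatrix4x4* R,  vtkMatrix4x4* B1)
{
    vtkSmartPointer<vtkMatrix4x4> tmp   = vtkSmartPointer<vtkMatrix4x4>::New();
    vtkSmartPointer<vtkMatrix4x4> left  = vtkSmartPointer<vtkMatrix4x4>::New();
    vtkSmartPointer<vtkMatrix4x4> right = vtkSmartPointer<vtkMatrix4x4>::New();
    vtkSmartPointer<vtkMatrix4x4> B2    = vtkSmartPointer<vtkMatrix4x4>::New();

    // left = (A4^-1 * A2 * R)^-1
    vtkMatrix4x4::Invert(A4, tmp);              // tmp  = A4^-1
    vtkMatrix4x4::Multiply4x4(tmp, A2, left);   // left = A4^-1 * A2
    vtkMatrix4x4::Multiply4x4(left, R, tmp);    // tmp  = A4^-1 * A2 * R
    vtkMatrix4x4::Invert(tmp, left);            // left = (A4^-1 * A2 * R)^-1

    // right = A3^-1 * A1 * R * B1
    vtkMatrix4x4::Invert(A3, tmp);              // tmp   = A3^-1
    vtkMatrix4x4::Multiply4x4(tmp, A1, right);  // right = A3^-1 * A1
    vtkMatrix4x4::Multiply4x4(right, R, tmp);   // tmp   = A3^-1 * A1 * R
    vtkMatrix4x4::Multiply4x4(tmp, B1, right);  // right = A3^-1 * A1 * R * B1

    vtkMatrix4x4::Multiply4x4(left, right, B2); // B2 = left * right
    return B2;
}
```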
The setup of the AR-SNS and the actual view seen from the HMD are also illustrated in Fig. 6.
3. The accuracy verification of the AR-based surgical navigation system

The basic aim of using a surgical navigation system is to improve surgical precision and prevent possible intra-operative human errors. Therefore, an accuracy verification experiment for the AR-based surgical navigation system has been conducted. Fig. 7 shows the verification block designed in this study, which includes three components: a metal base, an organic glass substrate and a 3D-printed cranio-maxillofacial model. The design and manufacturing of the verification block are described as follows: based on the original CT scanning data of a volunteer, the virtual cranio-maxillofacial model was 3D-reconstructed and assembled with the virtual metal base model using the UG software. Then, according to these data, the real nylon cranio-maxillofacial model was 3D-printed using a high-precision (0.1 mm) thermoplastics laser-sintering system (EOSINT P 395, Germany). The metal base was manufactured using an advanced 5-axis machine tool (DMU60, Germany) with an overall precision of 0.01 mm, and there were 175 taper holes (diameter: 5 mm) and 40 through holes (diameter: 4.2 mm) on its surface, for measuring the distance error and the angular error, respectively.
3.1. The scheme of the precision verification experiment

The precision verification process includes all of the procedures of an AR-based navigation surgery, for instance, CT scanning, 3D reconstruction, calibration of the surgical instruments, registration, calibration of the head-mounted display, real-time motion tracking, etc. The details are described as follows:

1. Mounting the reference frame on the metal verification block so that it can be tracked by the NDI Polaris Vicra tracking device.
2. Calibrating the positioning probe ("Pivot Calibration" and "Axis Calibration") through the calibration tool. Then, 5 or 6 fiducial landmarks can be collected on the surface of the metal base and registered with the matched points on the virtual 3D model.
3. On the basis of the point-based registration, a point cloud can be collected on the surface of the 3D-printed model to implement the surface-based registration, so that the registration errors can be corrected.
4. Calibrating the optical see-through head-mounted display while the user is wearing it. As a result, the 3D models are aligned with the real objects in the HMD during the intra-operative motion-tracking process (see Fig. 8).
5. Using the probe to pick 100 target points in different regions of the metal base surface successively, and recording the actual coordinate of each point Pi (Pix, Piy, Piz). Then, the actual coordinate is compared with the theoretical coordinate of each point Pi* (Pix*, Piy*, Piz*) to obtain the distance error Pierr:

\[ P_i^{\mathrm{err}} = \sqrt{(P_{ix} - P_{ix}^{*})^2 + (P_{iy} - P_{iy}^{*})^2 + (P_{iz} - P_{iz}^{*})^2}, \quad i = 0, 1, 2, \ldots, 99 \tag{6} \]

6. Inserting the probe into 30 of the axial holes, and recording the actual axial direction Ai (Aix, Aiy, Aiz). Then, the actual axial direction is compared with the theoretical axial direction Ai* (Aix*, Aiy*, Aiz*) to obtain the angular error Aierr:

\[ A_i^{\mathrm{err}} = \cos^{-1}\!\left( \frac{A_{ix} A_{ix}^{*} + A_{iy} A_{iy}^{*} + A_{iz} A_{iz}^{*}}{\sqrt{A_{ix}^2 + A_{iy}^2 + A_{iz}^2}\,\sqrt{(A_{ix}^{*})^2 + (A_{iy}^{*})^2 + (A_{iz}^{*})^2}} \right), \quad i = 0, 1, 2, \ldots, 29 \tag{7} \]
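For concreteness, the two error metrics of Eqs. (6) and (7) amount to a Euclidean distance between points and an angle between direction vectors. A minimal, self-contained sketch (illustrative only) is:

```cpp
// Hedged sketch of the error metrics in Eqs. (6) and (7).
#include <cmath>

struct Vec3 { double x, y, z; };

// Eq. (6): Euclidean distance between measured point p and theoretical point q.
double distanceError(const Vec3& p, const Vec3& q)
{
    const double dx = p.x - q.x, dy = p.y - q.y, dz = p.z - q.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Eq. (7): angle (in degrees) between measured axis a and theoretical axis b.
double angularErrorDeg(const Vec3& a, const Vec3& b)
{
    const double kPi = 3.14159265358979323846;
    const double dot = a.x * b.x + a.y * b.y + a.z * b.z;
    const double na = std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
    const double nb = std::sqrt(b.x * b.x + b.y * b.y + b.z * b.z);
    return std::acos(dot / (na * nb)) * 180.0 / kPi;
}
```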
3.2. The results of the accuracy verification experiment

Before the experiment, all 175 fiducial landmarks and 40 axial holes on the surface of the metal verification block were divided into different regions (see Fig. 7). Then, after the optical see-through head-mounted display was calibrated, we measured 100 target points and 30 axial holes in the different regions and calculated the distance and angular errors; the results are shown in Table 1.
Fig. 6. Wearing the HMD to conduct a phantom experiment (the pelvis model, and the virtual 3D model of the bladder as seen from the HMD).
Fig. 7. The accuracy verification block (regions, e.g., Region 7 and Region 9, marked on the surface of the metal base).

Fig. 8. The accuracy verification of the AR-based surgical navigation system.
During the measuring experiment, the maximum distance error of the 100 target points was 1.125 mm, and the mean distance error was 0.809 ± 0.05 mm. Meanwhile, the maximum angular error of the 30 axes was 1.308°, and the mean angular error was 1.038° ± 0.05°. These results demonstrate that the maximum distance and angular errors can be kept below 1.2 mm and 1.5°, respectively, and that the error distribution was relatively uniform across the different regions. Therefore, the accuracy of this AR-based surgical navigation system is sufficient to meet clinical requirements.

Table 1
The accuracy verification experiment.

Region   Distance error (mm)              Angular error (°)
         Maximum   Minimum   Mean         Maximum   Minimum   Mean
1        1.056     0.542     0.855        1.126     0.958     1.026
2        0.916     0.608     0.768        1.218     1.012     1.092
3        1.118     0.660     0.868        1.135     0.972     1.018
4        1.125     0.562     0.823        1.072     0.967     0.989
5        1.066     0.596     0.718        0.998     0.916     0.954
6        0.970     0.515     0.776        1.116     0.962     1.008
7        1.108     0.624     0.812        1.308     0.856     1.113
8        1.079     0.658     0.874        1.158     0.922     1.032
9        1.048     0.592     0.791        1.250     0.908     1.106
Total    1.125     0.515     0.809        1.308     0.856     1.038
4. Phantom experiment and cadaver experiment

A phantom experiment was conducted first. From the original CT data (0.625 mm slice thickness, 512 × 512 matrix), a 3D pelvis model was reconstructed through threshold segmentation and fabricated using 3D printing technology. Then, the critical anatomical structures (especially the soft tissues, blood vessels and nerves) were segmented using the manual modification method; in this experiment, a 3D bladder and some blood vessels and nerves were reconstructed. After calibration and registration, the AR-based surgical navigation was realized. Seen through the HMD, the 3D virtual anatomical structures were aligned with the real plastic pelvis model at all times, regardless of the movements of the HMD and the model (see Fig. 6).

A cadaver experiment for percutaneous implantation of a sacroiliac joint screw was also conducted in an operating room. First of all, five titanium miniscrews were inserted into the pelvis before CT scanning, for fiducial point registration. Then, based on the CT data, 3D-reconstruction and preoperative surgical planning were conducted, and the data were imported into the AR-SNS. Fig. 3 shows the 3D-reconstruction, including a pelvis model, a bladder model and the titanium miniscrew models. In addition, the optimized design of a virtual drill trajectory is the preoperative
planning result (see Fig. 4). Then, in order to show the position and orientation of the surgical drill model correctly, point calibration and axis calibration were carried out. After the point-based and surface-matching registration and the calibration of the HMD, the 3D virtual bladder was aligned with the real cadaveric pelvis during the intra-operative process, and the 3D instrument model was displayed in the HMD in real time, so that the operator could implant the sacroiliac joint screw into the pelvis along the preoperatively planned trajectory (see Fig. 9).
5. Conclusion and discussion

Augmented reality technology has great potential for application in computer-aided surgical navigation systems. Some examples of AR-based surgical applications have been presented in the literature, for example, a hybrid tracking method for surgical augmented reality (University of Tübingen, Germany), an integral videography system (University of Tokyo, Japan), an alternative biopsy guidance system (University of Pittsburgh, USA), and an X-ray C-arm system (Technical University of Munich, Germany). In addition, the calibration of AR devices has also been addressed in several reports. For instance, Kellner et al. (University of Kiel, Germany) proposed a calibration approach for optical see-through head-mounted displays in 2012 to improve users' average distance judgment; however, a significant underestimation of distances in the virtual environment remains [24]. Genc et al. (Siemens Corporate Research Imaging and Visualization Department, USA) developed a method to calibrate stereoscopic optical see-through HMDs based on the 3D alignment of a target in the physical world with a virtual object in the user's view [25]. However, that calibration algorithm was only validated on a video see-through system, and the researchers did not validate it for an optical see-through system. Gilson et al. (Department of Physiology, United Kingdom) placed a camera inside an optical see-through HMD to take pictures of a tracked object and of features in the HMD display simultaneously, in order to perform camera calibration [26]. In this study, a method for calibrating an optical see-through HMD is introduced and validated on an optical see-through system, and a surgical navigation system based on AR has been developed. Furthermore, unlike a video see-through HMD, in which the real-world view is captured with two miniature video cameras mounted on the headgear, with an optical see-through HMD the real world is seen through semi-transparent mirrors placed in front of the user's eyes. Therefore, the real- and virtual-world views are fused directly in the optical see-through HMD, and the distortion of the field of view is slight.
Fig. 9. The panoramic view of the operating room for the cadaver experiment.
With the use of this system, including 3D-reconstruction, preoperative planning and registration, the 3D virtual critical anatomical structures in the optical see-through head-mounted display are aligned with the actual structures of the patient in the real-world scenario. During the surgery, reference frames with mounted sphere-shaped retro-reflective markers are fixed on the patient and on the head-mounted display, respectively, so that their spatial positions can be localized in real time by the Polaris Vicra optical tracking device. Then, a method for calibrating the patient's position in relation to the HMD is proposed, based on a series of spatial transformations. Therefore, the movements of the HMD and the patient have little effect on the overlaid graphics after the calibration, and the position and orientation of the virtual model match the real anatomical structures throughout the intra-operative navigation procedure. In addition, the HMD calibration can be checked against the real-world scenario during the registration procedure, since the registration accuracy is closely related to the HMD calibration effect; if the registration error value shown on the computer screen cannot meet the clinical requirements, the registration procedure is repeated until it is satisfactory.

Therefore, some disadvantages of traditional surgical navigation are overcome (for example, the surgeon is no longer obliged to switch between the real operation scenario and the computer screen), and the safety, accuracy, and reliability of the surgery may be improved. Meanwhile, the results of the accuracy verification experiments show that the system can be effective for minimally invasive surgery applications, and the cadaver experiment validates the feasibility of the AR-SNS. However, this is just a pilot study, and the system is still under development. Furthermore, other experiments will be conducted to test the accuracy and efficiency of this system; the typical method is to measure the average distance deviations between the preoperative surgical planning trajectory and the postoperative CT data. Researchers from Germany have recently compared the accuracy of a navigation system for oral implantology using either a head-mounted display or a monitor as the visualization device; the results show that using an HMD has no major disadvantages compared to the monitor setting [27].
Currently, there are also some commercially available computer-aided surgical systems. For example, the da Vinci Xi® surgical robot (Intuitive Surgical, Inc., USA) enables surgeons to perform complex procedures through several interactive robotic arms [28]. In addition, a crystal-clear, natural-color 3D image of the inside of the patient's body can be seen with its 3D HD vision system. However, compared with our AR-SNS, it does not provide a mixed reality environment, which means that the critical anatomical structures (blood vessels, nerves, etc.) cannot be seen until they are exposed. Furthermore, although the da Vinci Xi® surgical system can translate the surgeon's hand, wrist and finger movements into precise, real-time movements of surgical instruments, intra-operative navigation is not included, and the surgeon still has to depend on his or her experience to operate. In our AR-based surgical navigation system, the surgical instruments can be tracked in real time and the virtual anatomical models can be simultaneously rendered in the HMD, so that the preoperative planning can be transferred to the actual surgical site correctly.
Meanwhile, there are still some technical challenges for further research and exploration. First, the measurement volume of the "Polaris Vicra" optical tracking system used in this experiment is restricted, since its dimensions are only 273 mm × 69 mm × 69 mm. As a result, the reference frame sometimes moves out of the tracking volume, and it is suggested to use another system named "Polaris Spectra", which has larger dimensions of 613 mm × 104 mm × 86 mm. In addition, the optical tracking of the reflective passive marker spheres attached to the reference frame is impacted when the clinical environment is slightly wet. There is now a new kind of marker called "Radix Lens",
`Lens’’, which has a smooth plastic surface that naturally sheds liq-
`uid and is easy to wipe clean to recover tracking. Moreover, since
`the immersion effect of virtual objects in the HMD is not so strong,
`a suitable focal length of HMD is required to be set for the practical
`surgical applications. Meanwhile, the real-time performance needs
`to be improved, since a little bit time latency occurs when the vir-
`tual critical anatomical structures are moving in the head-mounted
`display. Hence, we will not only improve the hardware but also the
`software algorithm to develop a delay free AR-based surgical nav-
`igation system, and some clinical trials will be conducted to vali-
`date the accuracy and reliability of AR-SNS.
`In addition,
`according to the feedback from the surgeons and human–machine
`interaction experts, we found that the bright colors of virtual
`anatomical models with the black background in the HMD had a
`more vivid mixed reality effect. Currently, our research group is
`also developing and integrating the hand gesture recognition func-
`tion into AR-SNS with the Kinect (Microsoft Corporation, USA)
`device. Since the 3D position, orientation and full articulation of
`a human hand from markerless visual observations can be
`obtained by the Kinect sensor, the AR-SNS will provide more con-
`venient and friendly user interactions for various surgeons [29].
`Furthermore, due to the heavy weight of this optical see-
`through head-mounted display, surgeons will feel uncomfortable
`when wearing it to conduct the surgery for several hours.
`Recently, the Google Glass, a mini-computer with an 8-lb optical
`head-mounted display, has been widely used all over the world.
`So it has great potential to develop the AR-based surgical naviga-
`tion system using Google Glass.
Currently, there is a well-known, free and open-source software package named 3D Slicer (http://www.slicer.org/) for visualization and medical image computing. It has been developed as a common research multi-platform with plenty of modules to support a wide variety of clinical applications such as visualization, segmentation, volume measurements, etc. Moreover, during the past several years, many research groups have developed loadable extension modules based on 3D Slicer; for example, the DicomRtExport module enables the export of basic DICOM RT studies to local storage [30], and the iGyne module supports MR-guided interstitial gynecologic brachytherapy [31]. Therefore, we also plan to integrate the AR-SNS into
3D Slicer in the future so that it can be shared as a loadable extension module.
