US 2016/0191887 A1

(19) United States
(12) Patent Application Publication    (10) Pub. No.: US 2016/0191887 A1
     Casas                             (43) Pub. Date: Jun. 30, 2016
(54) IMAGE-GUIDED SURGERY WITH SURFACE RECONSTRUCTION AND AUGMENTED REALITY VISUALIZATION
(71) Applicant: Carlos Quiles Casas, Badajoz (ES)

(72) Inventor: Carlos Quiles Casas, Badajoz (ES)

(21) Appl. No.: 14/753,705

(22) Filed: Jun. 29, 2015

Related U.S. Application Data

(60) Provisional application No. 62/097,771, filed on Dec. 30, 2014.
Publication Classification

(51) Int. Cl.
     H04N 13/00   (2006.01)
     G06T 19/00   (2006.01)
     G06T 7/00    (2006.01)
     A61B 34/10   (2006.01)
     G06T 15/00   (2006.01)
     G06T 15/08   (2006.01)
     H04N 13/02   (2006.01)
     A61B 34/20   (2006.01)
     G02B 27/01   (2006.01)
     G06T 11/00   (2006.01)
(52) U.S. Cl.
     CPC .... H04N 13/0011 (2013.01); G02B 27/0172 (2013.01); G06T 19/006 (2013.01); G06T 7/0012 (2013.01); G06T 7/0038 (2013.01); G06T 11/005 (2013.01); G06T 15/005 (2013.01); G06T 15/08 (2013.01); H04N 13/0239 (2013.01); H04N 13/004 (2013.01); H04N 13/0296 (2013.01); A61B 34/20 (2016.02); A61B 34/10 (2016.02); G02B 2027/0134 (2013.01); G02B 2027/0138 (2013.01); G02B 2027/014 (2013.01); G06T 2200/04 (2013.01); G06T 2207/10072 (2013.01); G06T 2207/20221 (2013.01); G06T 2207/30004 (2013.01); A61B 2090/371 (2016.02)
(57) ABSTRACT

Embodiments disclose a real-time surgery method and apparatus for displaying a stereoscopic augmented view of a patient from a static or dynamic viewpoint of the surgeon, which employs real-time three-dimensional surface reconstruction for preoperative and intraoperative image registration. Stereoscopic cameras provide real-time images of the scene including the patient. A stereoscopic video display is used by the surgeon, who sees a graphical representation of the preoperative or intraoperative images blended with the video images in a stereoscopic manner through a see-through display.
Medivis Exhibit 1006
Patent Application Publication    Jun. 30, 2016    Sheet 1 of 6    US 2016/0191887 A1
FIG. 1
Patent Application Publication    Jun. 30, 2016    Sheet 2 of 6    US 2016/0191887 A1
FIG. 2
Patent Application Publication    Jun. 30, 2016    Sheet 3 of 6    US 2016/0191887 A1
FIG. 3
Patent Application Publication    Jun. 30, 2016    Sheet 4 of 6    US 2016/0191887 A1
FIG. 4
Patent Application Publication    Jun. 30, 2016    Sheet 5 of 6    US 2016/0191887 A1
FIG. 5
Patent Application Publication    Jun. 30, 2016    Sheet 6 of 6    US 2016/0191887 A1
COMPUTING SYSTEM
  LOGIC SUBSYSTEM
  DATA-HOLDING SUBSYSTEM
  DISPLAY SUBSYSTEM
  COMMUNICATION SUBSYSTEM

FIG. 6
IMAGE-GUIDED SURGERY WITH SURFACE RECONSTRUCTION AND AUGMENTED REALITY VISUALIZATION
CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims a benefit of priority under 35 U.S.C. §119 to Provisional Application No. 62/097,771 filed on Dec. 30, 2014, which is fully incorporated herein by reference in its entirety.
BACKGROUND INFORMATION

[0002] 1. Field of the Invention

[0003] Embodiments are directed towards image-guided surgery, and more particularly CT-guided, MR-guided, fluoroscopy-based or surface-based image-guided surgery, wherein images of a portion of a patient are taken in the preoperative or intraoperative setting and used during surgery for guidance.
[0004] 2. Background

[0005] In the practice of surgery, an operating surgeon is generally required to look back and forth between the patient and a monitor displaying patient anatomical information for guidance in operation. In this manner, a type of mental mapping is made by the surgeon to understand the location of the target structures. However, this type of mental mapping is difficult, has a steep learning curve, and compromises the accuracy of the information used.
[0006] Equipment has been developed by many companies to provide intraoperative interactive surgery planning and display systems, mixing live video of the external surface of the patient with interactive computer-generated models of internal anatomy obtained from medical diagnostic imaging data of the patient. The computer images and the live video are coordinated and displayed to a surgeon in real time during surgery, allowing the surgeon to view internal and external structures and the relationship between them simultaneously, and adjust the surgery accordingly.
[0007] Preoperative or intraoperative image registration with surface reconstruction has been done in conventional surgery navigation systems either with a single 3D scanner device that functions at the same time as a video camera (e.g. a time-of-flight camera) displaying the surgeon's main viewpoint, or with a video camera or stereoscopic video cameras that serve as the surgeon's viewpoint and are used for processing a surface reconstruction. These conventional systems may enhance the surface reconstruction or image registration with other techniques, such as optical or infrared techniques, markers, etc. However, these systems are limited in the availability of precise 3D surfaces, in their precision and speed of image registration of preoperative or intraoperative images with the 3D surfaces, and in blending such registered images with the viewpoint of the surgeon.
[0008] Accordingly, needs exist for more effective systems and methods that combine real-time preoperative images with virtual graphics associated with the preoperative images, wherein the combination of the preoperative images and virtual graphics is displayed on a stereoscopic, see-through, head-mounted display.
SUMMARY OF THE INVENTION

[0009] Embodiments disclosed here describe a real-time surgery navigation method and apparatus for displaying an augmented view of the patient from the preferred static or dynamic viewpoint of the surgeon. Embodiments utilize a surface image, a graphical representation of the internal anatomic structure of the patient processed from preoperative or intraoperative images, and a computer registering both images. Responsive to registering the images, a head-mounted display may present to a surgeon an augmented view of the patient, wherein the augmented reality is presented via a head-mounted display.
[0010] Embodiments disclosed herein include a stereoscopic camera system. The stereoscopic camera system may be configured to provide real-time stereoscopic images of a target portion of the patient. In embodiments, the stereoscopic camera system may include a 3D scanner system that is configured to determine location data and orientation data, wherein the location data and orientation data are determined in reference to a common coordinate system.
[0011] Responsive to the stereoscopic camera system recording media, and determining the location data and orientation data, a stereoscopic view of the 3D volume image may be output to a stereoscopic display to the surgeon in real time. The stereoscopic view of the 3D volume image may be blended in the same position as the patient appears in the stereoscopic video images during surgery. The stereoscopic view of the 3D volume image is displayed in the preferred manner, e.g. using background subtraction techniques, the 3D volume image appearing over the patient as background model, the hands and instruments appearing as foreground objects.
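The background-subtraction compositing described above, with the 3D volume appearing over the patient as background and hands and instruments kept as foreground, can be sketched as follows. This is an illustrative sketch only, not the implementation disclosed here; the frame-differencing model, threshold, and alpha blend are assumptions:

```python
import numpy as np

def composite_with_background_subtraction(frame, background, rendered_volume,
                                          alpha=0.5, threshold=30):
    """Blend a rendered 3D volume over the patient background while keeping
    foreground objects (hands, instruments) on top.

    frame, background, rendered_volume: HxWx3 uint8 images in the same
    camera view. Pixels where the live frame differs from the background
    model by more than `threshold` are treated as foreground and left
    untouched; elsewhere the volume rendering is alpha-blended in.
    """
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    foreground = diff.max(axis=2) > threshold          # HxW boolean mask
    blended = (alpha * rendered_volume.astype(np.float32)
               + (1.0 - alpha) * frame.astype(np.float32)).astype(np.uint8)
    out = np.where(foreground[..., None], frame, blended)
    return out, foreground
```

A production system would maintain an adaptive background model rather than a single reference image, but the masking-then-blending structure is the same.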
[0012] Embodiments may be configured to assist in real time during surgery, wherein the stereoscopic view of the 3D volume image is presented in a surgeon's field of view in a stereoscopic manner, e.g. graphical representations of instruments tracked, surgical guides or techniques, anatomical models, etc. as needed. Accordingly, utilizing the stereoscopic view of the 3D volume image, the surgeon may be able to make adjustments to the stereoscopic view of the 3D volume image. For example, the surgeon may modify the stereoscopic view of the 3D volume image by selecting a transparency, color and contrast of each image layer displayed, using an available real-time user interface means, which may include gesture recognition methods.
[0013] Embodiments may be independent devices and processes for each main task provided during surgery: surface reconstruction and image registration, stereoscopic video and stereoscopic image registration. Embodiments may also be configured to provide an enhanced depth perception through background subtraction methods, and real-time user interaction, which may change the separation of the stereoscopic video cameras, adjusting the position of the registered 3D volume, displaying the 3D volume in a precise manner, adapting for pose change detected in the surface, adjusting the degree of transparency, color and contrast, etc.
[0014] Embodiments disclosed herein disclose systems that are configured to record stereoscopic video with at least two mounted cameras. The media recorded by the cameras may be in the field of view of a surgeon. Utilizing a head-mounted display, the surgeon may freely move in the operating room, keeping the desired field of vision defined by the position and orientation of the mounted cameras. With the mounted cameras and the head-mounted display, the surgeon would be able to view the media recorded by the mounted cameras.
[0015] Virtual graphics may be added to the media recorded by the two cameras. Responsive to the virtual graphics being added to the recorded media, the surgeon may be presented on the head-mounted display a preoperative image, such as a 3D volume image of a previous CT. The preoperative image may be presented, recorded, or registered (referred to hereinafter collectively and individually as "registered") over the patient, in real time. Thus, the internal anatomical structures of the patient may be blended with the media recorded by the mounted cameras.
[0016] In embodiments, tracking may be configured to be added to instruments or implants within the preoperative image, wherein virtual graphics are associated with views inside the patient presented to the surgeon. Accordingly, embodiments may be configured to register preoperative images blended with virtual graphics over a target portion of a patient, wherein the blended images are presented over a visual field of a surgeon.
[0017] In embodiments, an intermediate 3D surface may be obtained by surface reconstruction via 3D scanners. The intermediate 3D surface may be used for registration with a 3D volume obtained by volume rendering via image data from a CT or MR scan. The 3D volume image of the patient may be automatically located in real time to the position of the patient based on a common coordinate system between the stereoscopic cameras, head-mounted display, the virtual graphics, and/or the 3D surface. The 3D volume image may be any surface rendering of a preoperative image.
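Registering the intermediate 3D surface to the rendered 3D volume is, at its core, a rigid-alignment problem. As a minimal sketch (assumed for illustration; no particular algorithm is specified here), the least-squares rotation and translation between corresponding point sets can be computed with the Kabsch method, which is also the inner step of ICP-style surface registration:

```python
import numpy as np

def kabsch(P, Q):
    """Least-squares rigid alignment (rotation R, translation t) mapping
    point set P (Nx3) onto Q (Nx3), assuming known correspondences.
    Full surface-to-volume registration (e.g. ICP) alternates this step
    with nearest-neighbour matching."""
    Pc, Qc = P.mean(axis=0), Q.mean(axis=0)
    H = (P - Pc).T @ (Q - Qc)               # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = Qc - R @ Pc
    return R, t
```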
[0018] Tracking a 3D scanner's virtual camera and the mounted camera to the coordinate system may define where an augmented view may be positioned on the head-mounted display. Accordingly, the preoperative images may be utilized without markers, which may allow for more flexible and quicker registration.
[0019] These, and other, aspects of the invention will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. The following description, while indicating various embodiments of the invention and numerous specific details thereof, is given by way of illustration and not of limitation. Many substitutions, modifications, additions or rearrangements may be made within the scope of the invention, and the invention includes all such substitutions, modifications, additions or rearrangements.
BRIEF DESCRIPTION OF THE DRAWINGS

[0020] Non-limiting and non-exhaustive embodiments of the present invention are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified.
[0021] FIG. 1 shows a block diagram of the present invention;

[0022] FIG. 2 shows a perspective view of a surgery navigation system of the present invention;

[0023] FIG. 3 shows a flow diagram of a method of the present invention;

[0024] FIG. 4 shows a perspective view of the surgery navigation system of the present invention;

[0025] FIG. 5 shows a perspective view of the surgery navigation system of the present invention;

[0026] FIG. 6 shows a block diagram depicting a computing device of the present invention.
[0027] Corresponding reference characters indicate corresponding components throughout the several views of the drawings. Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve understanding of various embodiments of the present disclosure. Also, common but well-understood elements that are useful or necessary in a commercially feasible embodiment are often not depicted in order to facilitate a less obstructed view of these various embodiments of the present disclosure.
DETAILED DESCRIPTION OF THE INVENTION

[0028] In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present embodiments. It will be apparent, however, to one having ordinary skill in the art that the specific details need not be employed to practice the present embodiments. In other instances, well-known materials or methods have not been described in detail in order to avoid obscuring the present embodiments.
[0029] FIG. 1 shows an exemplary embodiment of the surgical navigation system. Surgical navigation system 100 may include devices configured to create a 3D rendering of a region of interest.
[0030] Using computer means 100, volume data of a patient scanned with a preoperative imaging 102 or an intraoperative imaging 106 device (e.g. CT scanner) is rendered as a 3D volume image using a volume rendering technique 104 and stored for processing, wherein the volume data is associated with a volume of the patient. Preoperative 102 and intraoperative images 106 are also stored as digital images 108 for processing.
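The volume rendering step 104 turns scan data into a displayable 3D volume image. A maximum-intensity projection is one of the simplest such techniques and illustrates the volume-to-image step; the windowing values and the MIP approach below are illustrative assumptions, not the specific technique used here:

```python
import numpy as np

def render_mip(volume, window=(-1000.0, 400.0), axis=0):
    """Simple volume rendering of a CT scan by maximum-intensity projection.

    `volume` is a D x H x W array of Hounsfield-like values; `window`
    clips and rescales intensities to [0, 255] for display. Real systems
    use ray-casting with transfer functions, but MIP illustrates the
    volume -> 2D image step."""
    lo, hi = window
    clipped = np.clip(volume, lo, hi)
    img = (clipped - lo) / (hi - lo) * 255.0    # normalize to display range
    return img.max(axis=axis).astype(np.uint8)  # project along viewing axis
```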
[0031] While computer means 100 is scanning the volume data, 3D scanner system 110 may be configured to capture a 3D surface 112 of the target portion of the patient 118, and a stereoscopic camera system (e.g. pair of cameras) 114 may be configured to obtain a stereoscopic video 116 of the scene, including the target portion of the patient 118.
[0032] Registration of the 3D volume and the 3D surface 120 is performed by computer means 100, as is the registration of the stereoscopic video with the 3D surface 122. In embodiments, registration of 3D volume 104 and stereoscopic video 116 is completed through an intermediate registration of both images with the 3D surface image 112 into a common coordinate system.
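The intermediate registration into a common coordinate system can be sketched with homogeneous transforms: if both the 3D volume and the stereoscopic video are registered to the 3D surface, composing the two transforms expresses the volume directly in the camera frame. The function names and the 4x4 convention below are illustrative assumptions:

```python
import numpy as np

def rigid_transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def volume_to_camera(T_vol_to_surf, T_cam_to_surf):
    """Compose the two surface registrations into one volume-to-camera transform.

    T_vol_to_surf registers the rendered 3D volume to the scanned 3D surface;
    T_cam_to_surf registers the stereoscopic camera view to the same surface.
    Inverting the camera leg and composing routes the volume through the
    common (surface) coordinate system into the camera's frame."""
    return np.linalg.inv(T_cam_to_surf) @ T_vol_to_surf
```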
[0033] The images are processed 124 and sent to the stereoscopic display 126 used by the surgeon 128.
