US 20060020204 A1

(19) United States
(12) Patent Application Publication          (10) Pub. No.: US 2006/0020204 A1
     Serra et al.                             (43) Pub. Date:        Jan. 26, 2006

(54) SYSTEM AND METHOD FOR THREE-DIMENSIONAL SPACE MANAGEMENT AND VISUALIZATION OF ULTRASOUND DATA ("SONODEX")

(75) Inventors: Luis Serra, Singapore (SG); Chua Beng Choon, Singapore (SG)

     Correspondence Address:
     KRAMER LEVIN NAFTALIS & FRANKEL LLP
     INTELLECTUAL PROPERTY DEPARTMENT
     1177 AVENUE OF THE AMERICAS
     NEW YORK, NY 10036 (US)

(73) Assignee: Bracco Imaging, S.p.A., Milano (IT)

(21) Appl. No.: 11/172,729

(22) Filed: Jul. 1, 2005

Related U.S. Application Data

(60) Provisional application No. 60/585,214, filed on Jul. 1, 2004. Provisional application No. 60/660,858, filed on Mar. 11, 2005. Provisional application No. 60/585,462, filed on Jul. 1, 2004.

Publication Classification

(51) Int. Cl.
     A61B 8/00 (2006.01)
(52) U.S. Cl. .............................................................. 600/437

(57) ABSTRACT

A system and method for the imaging management of a 3D space where various substantially real-time scan images have been acquired is presented. In exemplary embodiments according to the present invention, a user can visualize images of a portion of a body or object obtained from a substantially real-time scanner not just as 2D images, but as positionally and orientationally located slices within a particular 3D space. In such exemplary embodiments a user can convert such slices into volumes whenever needed, and can process the images or volumes using known image processing and/or volume rendering techniques. Alternatively, a user can acquire ultrasound images in 3D using the techniques of UltraSonar or 4D Ultrasound. In exemplary embodiments according to the present invention, a user can manage various substantially real-time images obtained, either as slices or volumes, and can control their visualization, processing and display, as well as their registration and fusion with other images, volumes and virtual objects obtained or derived from prior scans of the body or object of interest using various modalities.

[Drawing Sheet 1 of 26: FIG. 1]

[Drawing Sheet 2 of 26: FIG. 2]

[Drawing Sheet 3 of 26: FIG. 3]

[Drawing Sheet 4 of 26: FIG. 4]

[Drawing Sheet 5 of 26: FIG. 5 (reference numerals 111-1, 111-2, 111-3)]

[Drawing Sheet 6 of 26: FIG. 6A. Labels: Lesion 601; Virtual Patient skin 603; Virtual Tool 605; Virtual Trajectory 607; Ideal spherical optimal cover of RF of lesion 609]

[Drawing Sheet 7 of 26: FIG. 6B. Labels: Lesion 601; Virtual Patient Skin 603; Virtual Tool 605; Virtual Trajectory 607; Ideal spherical optimal cover of RF of lesion 609]

[Drawing Sheet 8 of 26: FIG. 6C. Labels: Lesion 601; Virtual Patient skin 603; Virtual Trajectory 607; Ideal spherical optimal cover of RF of lesion 609; Ultrasound Image 620]

[Drawing Sheet 9 of 26: FIG. 7, "Integrated Approach." Labels: Stereoscopic display (optional); Normal or Stereoscopic Monitor; Computer and graphics card; Ultrasound Image Acquisition System; 3D tracker; reference numerals 701, 702, 703; Ultrasound Probe 715; 3D Sensor attached to Ultrasound Probe or any other way of determining the position of the plane of scan 720; Patient Area to be scanned 730]

[Drawing Sheet 10 of 26: FIG. 8, "External Box Approach." Labels: ULTRASOUND SCANNER; EXTERNAL BOX; Stereoscopic display (optional); Monitor; Computer with graphics card; Ultrasound image; US Image Acquisition System; Monitor; Computer with 3D graphics capabilities; Video grabber or data transfer port; 3D tracker; reference numerals 810, 850; Ultrasound Probe 815; 3D Sensor attached to Ultrasound Probe 820; Object to be scanned 830]

[Drawing Sheet 11 of 26]

[Drawing Sheet 12 of 26]

[Drawing Sheet 13 of 26]

[Drawing Sheet 14 of 26]

[Drawing Sheet 15 of 26]

[Drawing Sheet 16 of 26]

[Drawing Sheet 17 of 26]

[Drawing Sheet 18 of 26]

[Drawing Sheet 19 of 26]

[Drawing Sheet 20 of 26]

[Drawing Sheet 21 of 26]

[Drawing Sheet 22 of 26: FIG. 10. Labels: Stereoscopic glasses or auto-stereoscopic display; 3D stylus; 2D slice or UltraSonar; Patient pre-segmented data; Virtual keyboard; 3D tracked ultrasound probe]

[Drawing Sheet 23 of 26]

[Drawing Sheet 24 of 26]

[Drawing Sheet 25 of 26]

[Drawing Sheet 26 of 26]

SYSTEM AND METHOD FOR THREE-DIMENSIONAL SPACE MANAGEMENT AND VISUALIZATION OF ULTRASOUND DATA ("SONODEX")

CROSS REFERENCE TO RELATED APPLICATIONS

[0001] This application claims the benefit of the following U.S. Provisional Patent Applications: (i) Ser. No. 60/585,214, entitled "SYSTEM AND METHOD FOR SCANNING AND IMAGING MANAGEMENT WITHIN A 3D SPACE ("SonoDEX")", filed on Jul. 1, 2004; (ii) Ser. No. 60/585,462, entitled "SYSTEM AND METHOD FOR A VIRTUAL INTERFACE FOR ULTRASOUND SCANNERS ("Virtual Interface")", filed on Jul. 1, 2004; and (iii) Ser. No. 60/660,858, entitled "SONODEX: 3D SPACE MANAGEMENT AND VISUALIZATION OF ULTRASOUND DATA", filed on Mar. 11, 2005.

[0002] The following related United States patent applications, under common assignment herewith, are also fully incorporated herein by this reference: Ser. No. 10/469,294 (hereinafter "A Display Apparatus"), filed on Aug. 29, 2003; Ser. No. 10/725,773 (hereinafter "Zoom Slider"), Ser. No. 10/727,344 (hereinafter "Zoom Context"), and Ser. No. 10/725,772 (hereinafter "3D Matching"), each filed on Dec. 1, 2003; Ser. No. 10/744,869 (hereinafter "UltraSonar"), filed on Dec. 22, 2003; and Ser. No. 60/660,563, entitled "A METHOD FOR CREATING 4D IMAGES USING MULTIPLE 2D IMAGES ACQUIRED IN REAL-TIME ("4D Ultrasound")", filed on Mar. 9, 2005.

TECHNICAL FIELD

[0003] The present invention relates to substantially real-time imaging modalities, such as ultrasound or the equivalent, and more precisely relates to the interactive display and manipulation of a three-dimensional space for which a plurality of scans have been performed.

BACKGROUND OF THE INVENTION

[0004] A substantially real-time image produced by a probe, such as, for example, an ultrasound probe, represents a cut through an organ or other 3D anatomical structure of a given patient. Such an image has a 3D position and orientation relative to the patient's depicted organ or other anatomical structure, and knowing this 3D position and orientation is often key to a proper interpretation of the ultrasound image for both diagnostic as well as interventional purposes. An example of the latter is when a clinician plans an intervention and must decide precisely where to insert a needle or therapeutically direct an ultrasound beam.

[0005] Moreover, key in interpreting substantially real-time images is the time at which a particular image was acquired relative to the time when the scan started. This is especially true in cases where one or more contrast media have been injected into the arteries (or other vessels) of a patient, given the fact that a contrast fluid's signal varies with time as well as with organ intake. The body is not a stationary object, but a time-varying one. There is much evidence indicating that it is not enough to simply observe an organ (or a pathology) as a stationary object; it is necessary to perceive it as part of a time-varying process in order to truly understand its function. The most obvious example is the heart, since it moves. One 3D image gives one view, but to understand the ejection fraction, or to analyze the condition of a valve, it is key to visualize its movement. In the case of a tumor, when using contrast media and ultrasound, the contrast flows through the arteries, then reaches and fills the tumor, and then washes out. It is important to visualize the entire process (wash in and wash out) to understand how vessels are feeding the tumor, as well as how much blood the tumor is taking in, in order to understand its aggressiveness. There is no single picture that can show this process. At best one can capture the image (or volume) that shows the time point when the contrast is filling the tumor at its maximum, but that misses the time when the vessels are visible. Thus, the rate of contrast intake is important in order to diagnose and understand the pathology.

[0006] Moreover, having a volume (and not just a slice with position and orientation) is essential to any quantification process. If there is only a probe cutting through an organ that is moving (due, for example, to breathing or to its own movement, such as, for example, the heart), the resulting image can be hard to compare against another image taken a fraction of a second later, since the organ in question will have moved and thus the cut will be in another, slightly shifted, part of the organ. However, if a comparison is made from one volume to another volume, such error can be minimized, since the volume is made of several cuts and this averages out the positioning problem.

[0007] Notwithstanding the interpretational value of such additional information, historically conventional ultrasound scanners, for example, simply displayed a 'flat' image of the cutting plane into a given organ of interest, and provided no reference as to the position of the displayed cutting plane relative to anatomical context or to the displayed cut's acquisition time.

[0008] To remedy this problem, state of the art ultrasound scanners, such as, for example, models manufactured by Kretz (now a GE company) and Philips, added 3D volumetric acquisition capabilities to their ultrasound probes. As a result they can display a 4D volume (i.e., a volume that changes with time) by producing a series of acquired images that can then be reconstructed into a volume. The resulting volume can then be displayed (after appropriate resampling) using standard volume rendering techniques. Nonetheless, while the individual slices comprising such a volume are loosely registered to each other (loosely because the subject's body is moving throughout the acquisition, and thus the body does not have a fixed spatial relationship to the probe during the acquisition), they are not registered in any sense to the 3D patient space.

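Purely by way of illustration, the reconstruction and resampling step mentioned above can be sketched as compounding tracked 2D slices into a regular voxel grid. The nearest-voxel splatting approach and all names below are assumptions made for clarity, not the scanners' actual reconstruction algorithms.

```python
import numpy as np


def compound_slices(slices, poses, shape, spacing):
    """Splat tracked 2D slices into a regular voxel grid (nearest-voxel compounding).

    slices  : list of 2D numpy arrays (pixel intensities)
    poses   : list of 4x4 matrices mapping pixel (col, row) to millimetres in a
              common 3D reference frame
    shape   : (nx, ny, nz) size of the output volume in voxels
    spacing : voxel edge length in millimetres
    """
    volume = np.zeros(shape, dtype=np.float32)
    counts = np.zeros(shape, dtype=np.float32)

    for img, pose in zip(slices, poses):
        rows, cols = np.indices(img.shape)
        # Homogeneous pixel coordinates (x=col, y=row, z=0 in the image plane).
        pix = np.stack([cols.ravel(), rows.ravel(),
                        np.zeros(img.size), np.ones(img.size)])
        world = pose @ pix                         # 3D position of every pixel
        vox = np.round(world[:3] / spacing).astype(int)

        # Keep only pixels that fall inside the output volume.
        ok = np.all((vox >= 0) & (vox < np.array(shape)[:, None]), axis=0)
        vx, vy, vz = vox[:, ok]
        np.add.at(volume, (vx, vy, vz), img.ravel()[ok])
        np.add.at(counts, (vx, vy, vz), 1.0)

    # Average where several pixels landed in the same voxel.
    return np.divide(volume, counts, out=volume, where=counts > 0)
```

The resulting volume could then be rendered with any standard volume rendering technique; gaps left by sparse sweeps would in practice call for interpolation rather than the simple averaging shown here.
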
[0009] Moreover, even if such a volume is acquired and displayed, the physical interfaces provided to manipulate these volumes are not themselves three-dimensional, generally being nothing more than a standard computer keyboard and mouse (or the equivalent, such as a trackball). Accordingly, using such tools to effect 3D operations necessitates awkward mappings of 3D manipulations onto essentially 2D devices. The necessity of such awkward mappings may be one of the reasons why 3D visualization has not gained the acceptance in the medical community that it may be due.

[0010] Additionally, some systems, such as, for example, the Esaote™ virtual navigator, described at www.esaote.com, attempt to provide a user with co-registered pre-scan data. However, because in such systems the display of ultrasound is restricted to the plane of acquisition, the pre-scan data is provided as 2D slices that match the plane of the ultrasound slice, and the ultrasound and corresponding pre-operative scan cut are simply placed side-by-side for comparison. As a result, a user does not gain a 3D sense of where the ultrasound slice fits in vis-a-vis the patient space as a whole.

[0011] What is thus needed in the art is a means of correlating ultrasound scans with the 3D space and time in which they have been acquired. What is further needed is an efficient and ergonomic interface that can allow a user to easily interact with ultrasound scan data as well as pre-operative imaging and planning data in three dimensions.

SUMMARY OF THE INVENTION

[0012] A system and method for the imaging management of a 3D space where various substantially real-time scan images have been, or are being, acquired are presented. In exemplary embodiments of the present invention, a user can visualize images of a portion of a body or object obtained from a substantially real-time scanner not just as 2D images, but as positionally and orientationally identified slices within the relevant 3D space. In exemplary embodiments of the present invention, a user can convert such slices into volumes as desired, and can process the images or volumes using known image processing and/or volume rendering techniques. Alternatively, a user can acquire ultrasound images in 3D using the techniques of UltraSonar or 4D Ultrasound. In exemplary embodiments of the present invention, a user can manage various substantially real-time images that have been obtained, either as slices or volumes, and can control their visualization, processing and display, as well as their registration and fusion with other images, volumes or virtual objects obtained or derived from prior scans of the area or object of interest using various modalities.

BRIEF DESCRIPTION OF THE DRAWINGS

[0013] FIG. 1 depicts a user controlling an exemplary ultrasound session with an exemplary pen and tablet two-dimensional interface according to an exemplary embodiment of the present invention;

[0014] FIG. 2 depicts a user performing three-dimensional interactions in a virtual patient space displayed stereoscopically using an exemplary three-dimensional interface according to an exemplary embodiment of the present invention;

[0015] FIG. 3 depicts a user interacting with the three-dimensional virtual patient space of FIG. 2, using a monoscopic interface according to an exemplary embodiment of the present invention;

[0016] FIG. 4 depicts an exemplary illustrative scenario where three 3D ultrasound volumes are fused with three pre-operative segmentations in an exemplary composite view according to an exemplary embodiment of the present invention;

[0017] FIG. 5 depicts exemplary user manipulations of the pre-operative segmentations and volume scans of FIG. 4 according to an exemplary embodiment of the present invention;

[0018] FIGS. 6A-6C depict exemplary preparations for a tumor removal procedure according to an exemplary embodiment of the present invention;

[0019] FIG. 7 depicts an exemplary integrated system implementing an exemplary embodiment of the present invention;

[0020] FIG. 8 depicts an exemplary external add-on system implementing an exemplary embodiment of the present invention;

[0021] FIGS. 9(a)-9(d) depict various exemplary pre-operative scenarios according to an exemplary embodiment of the present invention;

[0022] FIG. 9(e) depicts an intra-operative scenario according to an exemplary embodiment of the present invention;

[0023] FIG. 9(f) depicts an alternative exemplary pre-operative scenario according to an exemplary embodiment of the present invention;

[0024] FIGS. 9(g)-9(i) respectively depict alternative exemplary intra-operative scenarios according to an exemplary embodiment of the present invention;

[0025] FIG. 10 depicts an exemplary system setup according to an exemplary embodiment of the present invention;

[0026] FIG. 11(a) depicts acquiring and storing a plurality of 2D ultrasound slices according to an exemplary embodiment of the present invention;

[0027] FIG. 11(b) depicts segmenting and blending the 2D ultrasound slices of FIG. 11(a) to produce a 3D effect according to an exemplary embodiment of the present invention;

[0028] FIG. 12 depicts scanned regions created in a virtual space according to an exemplary embodiment of the present invention;

[0029] FIG. 13 depicts an exemplary phantom used to illustrate an exemplary embodiment of the present invention;

[0030] FIG. 14 depicts, respectively, an UltraSonar image, a reconstructed volumetric image, and a smoothed, zoomed-in and cropped volumetric image of the exemplary phantom of FIG. 13 according to an exemplary embodiment of the present invention;

[0031] FIG. 15 depicts space tracking of two liver scans according to an exemplary embodiment of the present invention; and

[0032] FIG. 16 depicts an exemplary fusion of an ultrasound image in a single plane with pre-operative CT data according to an exemplary embodiment of the present invention.

[0033] It is noted that the patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawings will be provided by the U.S. Patent Office upon request and payment of the necessary fee.

DETAILED DESCRIPTION OF THE INVENTION

[0034] The present invention is directed to a system and method for the management of a 3D space where substantially real-time images have been, or are being, acquired. For purposes of illustration, exemplary embodiments of the invention will be described with reference to ultrasound images, it being understood that any equivalent substantially real-time imaging modality can be used.

[0035] In exemplary embodiments of the present invention a clinician can visualize images obtained from an ultrasound scanner not just as 2D images but as 2D slices within a particular 3D space (or alternatively as volumes within such 3D space), each acquired at a known time, and can convert such 2D slices into volumes whenever needed. In exemplary embodiments of the present invention, the method allows a user to manage the different images obtained (either as slices or volumes), and to manipulate them as well as control various display parameters, for example, their visualization (including stereoscopically), registration and segmentation.

[0036] Moreover, in exemplary embodiments of the present invention, a system can record for each acquired real-time image its 3D position and time of acquisition. Therefore, in such exemplary embodiments, not only can a current image slice be displayed in its correct 3D position, but because the time of acquisition is available for each image, such methods also allow for the display of any previously acquired information at the given position. This allows for the visualization of time-variant processes, such as, for example, an injection of a contrast agent. For example, a contrast agent may be needed in order to characterize a particular lesion in liver tissue that may not be visible without it. During the time that the contrast agent is available in the relevant tissues, a system can record both the 3D position and the time of acquisition for each image. Later, for example, when a procedure is to be performed on the relevant tissue, such as, for example, a thermoablation, the recording of the tissue with the contrast agent flowing through it can be replayed (being co-registered to the ablation needle, which can also be displayed in the 3D space, either within a current ultrasound slice, or by tracking the needle) to again visualize the lesion that is now no longer visible.

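By way of a non-limiting sketch, the per-image recording of 3D pose and acquisition time, and its later replay, could be organized as follows. The class, field and method names below are illustrative assumptions, not an actual implementation.

```python
from dataclasses import dataclass
from bisect import bisect_left
from typing import List

import numpy as np


@dataclass
class TrackedSlice:
    """One acquired 2D image together with its 3D pose and acquisition time."""
    pixels: np.ndarray     # 2D image data (rows x cols)
    pose: np.ndarray       # 4x4 homogeneous matrix: image plane -> patient space
    timestamp: float       # seconds since the start of the scanning session


class SliceRecorder:
    """Stores tracked slices in acquisition order and replays them by time."""

    def __init__(self) -> None:
        self._slices: List[TrackedSlice] = []

    def record(self, s: TrackedSlice) -> None:
        self._slices.append(s)

    def replay(self, t: float) -> TrackedSlice:
        """Return the slice whose timestamp is closest to, but not after, t."""
        times = [s.timestamp for s in self._slices]
        i = max(bisect_left(times, t) - 1, 0)
        return self._slices[i]
```

Replaying the contrast-enhanced recording then reduces to stepping `t` through the recorded interval and drawing each returned slice at its stored pose alongside the tracked needle.
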
[0037] Thus, in exemplary embodiments of the present invention, a user can manage the entire 3D space within which ultrasound scans from a particular scanning session are obtained in a way that leads to better diagnosis and/or intervention. It is noted that the disclosed method works without "co-location" of the ultrasound images with a real patient. The fusion in exemplary embodiments is between various images, as opposed to being between a virtual world and a real patient space, as is done in certain conventional augmented reality techniques.

[0038] In exemplary embodiments of the present invention a 3D interactive system is provided that can work with either ultrasound planes (shown in their respective 3D context), volumetric reconstructions of such ultrasound information, pre-operative imaging and planning data (e.g., CT, MRI, planning pathways and selected objects in a 3D data set, etc.), as well as other elements that can contribute to the procedure. This adds the ability to re-position ultrasound planes and other elements, such as an RF probe, more easily, since the user can see a 3D space with "floating" objects and can then, for example, simply move the needle or ultrasound probe to the 3D point where the floating object is perceived. This is in contrast to conventional systems, which neither provide an unrestricted display of an ultrasound (or other substantially real-time scan) plane in the context of co-registered pre-scan data, nor allow a user to freely move within the 3D space in which the real-time scan is acquired. Thus, in exemplary embodiments of the present invention, the facility is provided to make full use of data from prior scans, such as, for example, CT or other ultrasound imaging scans of the same patient area, in an integrated manner with the substantially real-time images.

[0039] In exemplary embodiments of the present invention the coordinate positions of prior scans and real-time scans can be co-registered, allowing a user to interactively visualize the co-registered information in a way that is intuitive and precise. In so doing, acquired data can, for example, then be used to navigate a procedure, or to later review a case. Such post-procedural review is easily available because the 3D positions of the ultrasound planes are stored and can be analyzed after the ultrasound exploration.

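As an illustrative sketch only, such co-registration amounts to expressing points measured in the real-time tracker frame in the coordinate frame of the prior scan through a single homogeneous transform. The 4x4 matrix convention and the function name below are assumptions made for clarity, not a prescribed implementation.

```python
import numpy as np


def to_prior_scan_frame(point_tracker_mm, T_prior_from_tracker):
    """Map a 3D point from the tracker coordinate frame into the prior-scan
    (e.g., CT) coordinate frame.

    point_tracker_mm     : (3,) point in millimetres, tracker frame
    T_prior_from_tracker : (4, 4) homogeneous rigid transform obtained from
                           the co-registration step
    """
    p = np.append(np.asarray(point_tracker_mm, dtype=float), 1.0)
    return (T_prior_from_tracker @ p)[:3]


# Example: a co-registration that is a pure translation of +10 mm along x.
T = np.eye(4)
T[0, 3] = 10.0
print(to_prior_scan_frame([1.0, 2.0, 3.0], T))   # -> [11.  2.  3.]
```

Applying the same transform to every corner of a tracked ultrasound plane places that plane, and hence the stored session, in the prior-scan space for navigation or later review.
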
[0040] The disclosed method operates via registration of ultrasound images with a virtual patient, i.e., by registering pre-operative images and/or segmentations therefrom with recently acquired ultrasound data of a given patient. Alternatively, the disclosed method can operate by registering one set of ultrasound data with one or more other sets of ultrasound data, either taken at different 3D positions, or at different times, or both. In either case, in exemplary embodiments of the present invention, once various images are co-registered, fused images incorporating all or parts of the various co-registered images, as may be decided dynamically by a user, can be interactively viewed and manipulated. Thus, for example, a user can perform, use or implement any of the techniques described in any of the pending patent applications incorporated by reference above while performing an ultrasound session or ultrasound guided procedure. For example, a user can resegment and adjust any display parameters for any pre-scan data relevant to the current focus of the ultrasound imaging. Vessels from an earlier CT scan can be cropped, segmented, assigned different color look-up table values, thresholded, etc. so as to focus on the current, or recent, area of interest in the ultrasound procedure. Alternatively, pre-procedural planning notes, highlights and/or pathways can be dynamically and interactively brought up, hidden, or made more or less transparent as may be desired throughout the ultrasound session.

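For instance, the re-thresholding, cropping and recoloring of pre-scan data described above could, assuming a simple voxel-array representation, be sketched as follows; all names and the particular LUT layout are illustrative assumptions.

```python
import numpy as np


def focus_prescan_volume(ct, crop_min, crop_max, threshold, lut):
    """Crop a pre-scan CT volume to a region of interest, hide voxels below an
    intensity threshold, and map the remainder through a color look-up table.

    ct        : 3D numpy array of CT intensities
    crop_min  : (3,) lower corner of the region of interest, in voxels
    crop_max  : (3,) upper corner of the region of interest, in voxels
    threshold : scalar intensity; voxels below it become fully transparent
    lut       : (256, 4) RGBA look-up table indexed by normalized intensity
    """
    x0, y0, z0 = crop_min
    x1, y1, z1 = crop_max
    roi = ct[x0:x1, y0:y1, z0:z1].astype(np.float32)

    # Normalize intensities in the region of interest to 0..255 for the LUT.
    lo, hi = roi.min(), roi.max()
    idx = np.clip((roi - lo) / max(hi - lo, 1e-6) * 255, 0, 255).astype(np.uint8)

    rgba = lut[idx]                    # per-voxel color and opacity
    rgba[roi < threshold, 3] = 0       # hide everything below the threshold
    return rgba
```

A user-facing control would simply re-run such a mapping with new crop bounds, threshold or LUT whenever the area of interest in the ultrasound procedure changes.
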
[0041] In exemplary embodiments of the present invention, the disclosed method can be integrated with the following technologies: (a) visualization of 2D ultrasound slices into a volume without the need for volume resampling (and the concomitant resampling errors), as described more fully in "UltraSonar"; and (b) a virtual interface to substantially real-time scanning machines, as described more fully in "Virtual Interface."

[0042] Thus, in exemplary embodiments of the present invention, a special virtual interface can be used to control an interactive ultrasound scanning session. Additionally, ultrasound probes and instruments can, for example, be tracked by a 3D tracking system so that each of the probes' and instruments' respective 3D positions and orientations can be known at all times during the ultrasound scan.

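A minimal sketch of such pose bookkeeping is given below; the `read_sensor` callable stands in for whatever interface the actual 3D tracking system provides and is an assumption, not a real tracker API.

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class Pose:
    """Position (mm) and orientation (3x3 rotation) of a tracked instrument."""
    position: np.ndarray
    rotation: np.ndarray


class TrackedInstruments:
    """Keeps the latest pose of every tracked probe or instrument."""

    def __init__(self, read_sensor):
        # read_sensor: callable taking an instrument name and returning
        # (position, rotation) from the tracking hardware (placeholder).
        self._read_sensor = read_sensor
        self._poses = {}

    def update(self, names):
        """Poll the tracker once for every named instrument."""
        for name in names:
            position, rotation = self._read_sensor(name)
            self._poses[name] = Pose(np.asarray(position), np.asarray(rotation))

    def pose_of(self, name):
        return self._poses[name]
```

Calling `update` at the tracker's refresh rate keeps the ultrasound probe, the ablation needle and any other instrument available in a common 3D frame throughout the scan.
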
[0043] Moreover, as noted, ultrasound scanning can, for example, be preceded by pre-operative CT or MR imaging in which, for example, a segmentation of various objects or a "signature" of various organs or organelles (such as, for example, the vascular system of a liver or kidney) can be extracted to identify geometrical and topological components that can define the anatomy and pathology of the specific patient under treatment. Such a characteristic can be subsequently utilized to maintain registration between pre-operative data and real-time ultrasound scanning images or volumes.

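One way such a geometric "signature" might be used to maintain registration, offered only as an assumed sketch rather than the method actually employed, is a least-squares rigid fit between corresponding landmark points (for example, vessel branch points) taken from the pre-operative data and from the ultrasound data.

```python
import numpy as np


def rigid_fit(source_pts, target_pts):
    """Least-squares rigid transform (rotation R, translation t) mapping
    source landmark points onto target landmark points (Kabsch algorithm).

    source_pts, target_pts : (N, 3) arrays of corresponding 3D points, e.g.
                             vessel branch points from a pre-operative
                             segmentation and from the ultrasound volume.
    """
    src = np.asarray(source_pts, dtype=float)
    dst = np.asarray(target_pts, dtype=float)
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)

    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

Re-running such a fit as new landmarks are observed is one plausible way to keep the pre-operative objects aligned with the live ultrasound data despite patient motion.
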
[0044] Also, during ultrasound scanning, acquired images can, for example, be visualized using the techniques described in UltraSonar. This technique, by allowing the display of a certain number of past ultrasound slices that only slowly fade away, can allow a user to visualize 2D ultrasound slices as "pseudo-volumes" without the need for time-consuming re-sampling into actual 3D volumes and subsequent volume rendering.

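The fading display of recent slices can be sketched as follows; this is one possible rendering strategy assumed for illustration, not the actual UltraSonar implementation, and the class and parameter names are hypothetical.

```python
from collections import deque


class FadingSliceDisplay:
    """Keeps the most recent tracked slices and assigns each an opacity that
    decays with age, so the set reads as a pseudo-volume without resampling."""

    def __init__(self, max_slices=30, decay=0.9):
        self._slices = deque(maxlen=max_slices)  # oldest slices drop off the end
        self._decay = decay                      # opacity multiplier per step of age

    def add(self, tracked_slice):
        self._slices.append(tracked_slice)

    def render_list(self):
        """Return (slice, opacity) pairs, newest fully opaque, older ones fainter."""
        newest_first = list(self._slices)[::-1]
        return [(s, self._decay ** age) for age, s in enumerate(newest_first)]
```

A renderer would then draw each returned slice at its recorded 3D pose with the listed opacity, newest on top.
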
Control and Display Interfaces

[0045] In exemplary embodiments according to the present invention a pen-and-tablet interface can be used for 2D control, as depicted in FIG. 1. With reference thereto, a user 100 can, for example, physically manipulate a pen 110 and tablet 120, and can thus interact with a virtual keyboard as shown at the bottom of display 130, in similar fashion as described in Virtual Interface or in A Display Apparatus. Thus, control commands such as, for example, pushing or selecting menu bars, typing in text, selecting between menu options, etc. can be mapped from the displayed virtual keyboard to 2D manipulations of the pen and tablet. The pen and tablet can utilize a 2D tracking device for this purpose.

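Such a mapping from 2D pen positions to virtual-keyboard commands might look roughly as follows; the button layout, labels and names are purely illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class VirtualButton:
    label: str
    x: float      # left edge, in normalized display coordinates (0..1)
    y: float      # bottom edge
    w: float      # width
    h: float      # height


def button_under_pen(pen_x: float, pen_y: float, buttons) -> Optional[str]:
    """Return the label of the virtual-keyboard button under the 2D pen
    position, or None if the pen is outside every button."""
    for b in buttons:
        if b.x <= pen_x <= b.x + b.w and b.y <= pen_y <= b.y + b.h:
            return b.label
    return None


# Example: two buttons along the bottom of the display.
keyboard = [VirtualButton("FREEZE", 0.05, 0.02, 0.15, 0.08),
            VirtualButton("SAVE",   0.25, 0.02, 0.15, 0.08)]
print(button_under_pen(0.10, 0.05, keyboard))   # -> "FREEZE"
```

The returned label would then be dispatched as the corresponding scanner control command, exactly as a physical key press would be.
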
[0046] For 3D control, a 3D interface can be used, as depicted in FIG. 2. With reference thereto, in exemplary embodiments of the present invention the entire interface can utilize a stereoscopic display 230 (note how the depicted scan jumps out of the screen, simulating the stereoscopic effect), inasmuch as this can afford superior depth perception, which is key to any 3D interface. However, in alternate exemplary embodiments of the present invention the method can also be operated using a standard monoscopic interface 330, as shown in FIG. 3, thus allowing more or less standard equipment to be used in, for example, more economical or retrofit implementations of exemplary embodiments of the present invention.

3D Manipulations in 3D Space

[0047] In exemplary embodiments according to the present invention, greater control and integrated imaging and display management of a 3D space where substantially real-time imaging is performed can be enabled. For purposes of illustration, in what follows an exemplary ultrasound scanning of a liver with a lesion (tumor) will be described. In the following description, it is assumed, for example, that a patient has had a pre-operative CT scan of his liver, and during a subsequent surgical planning session, three "objects" were identified by the clinician, as depicted in FIG. 4. These objects are (i) a vessel defined by three terminal points (A, B, C) and a central "hub" (point D), all connected together; (ii) a lesion L; and (iii) an adjacent organ O, for example a kidney, that serves as an anatomical landmark.

[0048] These three objects can, for example, be defined geometrically in a segmentation process and can thus be represented by polylines, polygonal meshes, and/or other graphical representations.

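Such representations could, for example, be held in simple structures like the following sketch; the field names and the polyline/mesh split are assumptions made for illustration.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

import numpy as np


@dataclass
class PolylineObject:
    """A vessel-like object: named 3D points joined by line segments,
    e.g. terminals A, B, C connected to a central hub D."""
    points: Dict[str, Tuple[float, float, float]]   # name -> (x, y, z) in patient space
    segments: List[Tuple[str, str]]                  # pairs of connected point names


@dataclass
class MeshObject:
    """A lesion or organ surface as a triangle mesh."""
    vertices: np.ndarray    # (N, 3) vertex positions
    faces: np.ndarray       # (M, 3) indices into `vertices`


# Example: the vessel ABCD of FIG. 4 with illustrative coordinates.
vessel = PolylineObject(
    points={"A": (10.0, 5.0, 2.0), "B": (12.0, 9.0, 4.0),
            "C": (8.0, 11.0, 3.0), "D": (10.0, 8.0, 3.0)},
    segments=[("A", "D"), ("B", "D"), ("C", "D")],
)
```

Objects stored this way can be drawn, hidden, recolored or re-registered independently of the ultrasound data they are fused with.
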
[0049] Given this exemplary pre-scan history, in an ultrasound scanning session a clinician can, for example, perform three corresponding volumetric ultrasound scans using, for example, an ultrasound probe with a 3D tracker. This process is illustrated in the upper right quadrant of FIG. 4. These scans can be, for example, with reference to FIG. 4, Scan 1 of blood vessel ABCD (obtained at time T1, when a contrast medium is flowing through it, for example, at t
