`
`By: Joseph Harmer
`Jed Hansen
`Thorpe North & Western, LLP
`175 S. Main St., Ste. 900
`Salt Lake City, UT 84111
`Tel: (801) 566-6633
`
`
`
`
`
`
`
`
`
`
`
`
`UNITED STATES PATENT AND TRADEMARK OFFICE
`_____________
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`_____________
`
`
`MEDIVIS, INC.,
`
`Petitioner,
`
`
`v.
`
`
`NOVARAD CORP.,
`
`Patent Owner.
`_____________
`
`Patent 11,004,271 B2
`_____________
`
Case IPR2023-00042
`
`
`
`PATENT OWNER RESPONSE
`
`
`
`
`
`
`
`
`TABLE OF CONTENTS
`
`
`
I.    INTRODUCTION ........................................................................................... 1
II.   BACKGROUND ............................................................................................. 2
      A. Technology Background ........................................................................... 2
      B. The Challenged Claims of the ’271 Patent ................................................ 8
III.  CLAIM CONSTRUCTION ...........................................................................11
      A. “three-dimensional (3D) data … including an outer layer of the patient
         and multiple inner layers of the patient” .................................................11
      B. “inner layer(s) of the patient” ..................................................................14
      C. “confined within a virtual 3D shape” ......................................................15
      D. “being having” .........................................................................................18
IV.   AMIRA, CHEN, 3D Visualization, and 3D Slicer ARE NOT PRIOR ART ...18
V.    THE CHALLENGED CLAIMS ARE PATENTABLE ..................................20
      A. Summary of Alleged Prior Art .................................................................21
         1. Doo (Ex. 1008) ....................................................................................21
         2. Amira (Ex. 1005) ................................................................................21
         3. Chen (Ex. 1009) ..................................................................................22
         4. 3D Visualization (Ex. 1007) ...............................................................22
         5. 3D Slicer (Ex. 1010) ...........................................................................23
      B. Ground 1: Claims 1, 5, and 6 Are Not Anticipated by Doo .....................24
         1. Doo Does Not Disclose 1[Pre] ............................................................25
         2. Doo Does Not Disclose 1[A] ..............................................................26
         3. Doo Does Not Disclose 1[B] ..............................................................27
         4. Doo Does Not Disclose 1[C] ..............................................................28
         5. Doo Does Not Disclose the Elements of Claim 5 ...............................32
         6. Doo Does Not Disclose the Elements of Claim 6 ...............................35
      C. Ground 2: Claims 1-6 and 11-20 Are Not Obvious Over Doo in View
         of Amira ..................................................................................................35
         1. Doo and Amira Do Not Disclose All of the Claim Elements Recited
            in Claims 1-6 .......................................................................................36
         2. Doo and Amira Do Not Disclose All of the Claim Elements Recited
            in Claim 11 ..........................................................................................43
         3. Doo and Amira Do Not Disclose All of the Claim Elements Recited
            in Claims 12-20 ...................................................................................46
         4. There Is No Motivation to Combine Doo with Amira ........................48
      D. Ground 3: Claims 1-6 and 11-20 Are Not Obvious Over Chen in View
         of 3D Visualization and 3D Slicer ...........................................................52
         1. Chen, 3D Visualization, and 3D Slicer Do Not Disclose All of the
            Elements Recited in Claims 1-6 and 11-20 .........................................52
         2. Chen, 3D Visualization, and 3D Slicer Do Not Disclose All of the
            Elements Recited in Claims 11-20 ......................................................60
         3. There Is No Motivation to Combine Chen with 3D Visualization
            and 3D Slicer ......................................................................................63
VI.   CONCLUSION .............................................................................................65
`
`
`
`
`
`
`
`
`
`
`TABLE OF AUTHORITIES
`
`Cases
`Apotex Inc. v. Wyeth LLC,
IPR2014-00115 (PTAB Apr. 20, 2015) ....................................................................64
`Argentum Pharms. LLC v. Res. Corp. Techs., Inc.,
`IPR2016-00204 (PTAB May 23, 2016) ...............................................................19
`Blue Calypso, LLC v. Groupon, Inc.,
815 F.3d 1331 (Fed. Cir. 2016) .............................................................................18
`CCS Fitness, Inc. v. Brunswick Corp.,
`288 F.3d 1359 (Fed. Cir. 2002) ............................................................................11
`Continental Can Co. USA, Inc. v. Monsanto Co.,
`948 F.2d 1264 (Fed. Cir. 1991) ............................................................................34
`Dynamic Drinkware v. Nat’l Graphics, Inc.,
`800 F.3d 1375 (Fed. Cir. 2015) ............................................................................20
Ex parte Lars Stoppe,
`2023 Pat. App. LEXIS 1765 (P.T.A.B. May 23, 2023) ......................................40
`Graham v. John Deere Co. of Kansas City,
`383 U.S. 1 (1966) .................................................................................................35
Hewlett-Packard Co. v. U.S. Philips Corp.,
`IPR2015-01505 (PTAB Jan. 19, 2016) ................................................................19
`In re NTP, Inc.,
`654 F.3d 1279 (Fed. Cir. 2011) ............................................................................49
`In re Ratti,
`270 F.2d 810 (CCPA 1959) ..................................................................................40
`In re Stepan Co.,
`868 F.3d 1342 (Fed. Cir. 2017) ............................................................................52
`In re Wyer,
`655 F.2d 221 (CCPA 1981) ..................................................................................19
`KSR Int’l Co. v. Teleflex Inc.,
`550 U.S. 398 (2007) ................................................................................ 36, 49, 63
`Kyocera Wireless Corp. v. ITC,
`545 F.3d 1340 (Fed. Cir. 2008) ............................................................................19
`Medtronic, Inc. v. Barry,
`891 F.3d 1368 (Fed. Cir. 2018) ............................................................................19
`
`
`
`
`
`Microsoft Corp. v. Global Touch Solutions, LLC,
IPR2015-01024 (PTAB Sept. 23, 2015) .................................................................37
`Net MoneyIN, Inc. v. VeriSign, Inc.,
`545 F.3d 1359 (Fed. Cir. 2008) ............................................................................24
`Par Pharm. Inc. v. TWi Pharms., Inc.,
`773 F.3d 1186 (Fed. Cir. 2014) ............................................................................51
`Phillips v. AWH Corp.,
`415 F.3d 1303 (Fed. Cir. 2005) ............................................................................11
`Procter & Gamble Co. v. Teva Pharms. USA, Inc.,
`566 F.3d 989 (Fed. Cir. 2009) ..............................................................................64
`Shopkick, Inc. v. Novitaz, Inc.,
IPR2015-00279 (PTAB May 29, 2015) ...................................................................50
`Stratoflex, Inc. v. Aeroquip Corp.,
`713 F.2d 1530, 218 USPQ 871 (Fed. Cir. 1983) ..................................................36
Teva Pharms. USA, Inc. v. Indivior UK Ltd.,
`IPR2016-00280 (PTAB Jun. 10, 2016) ................................................... 18, 51, 64
`Volkswagen Grp. of Am., Inc. v. Velocity Patent LLC,
`IPR2015-00276 (PTAB Jun. 1, 2015) ........................................................... 49, 63
`Statutes
`35 U.S.C. § 103 ........................................................................................................35
`35 U.S.C. §§ 102 and 311(b) ...................................................................................18
`
`
`
`
`
`
`
`
EXHIBIT LIST

Exhibit No.   Exhibit Description
Ex. 2002      Declaration of Mahesh S. Mulumudi, M.D. (“Mulumudi”)
Ex. 2003      Dr. Mulumudi’s C.V.
Ex. 2004      Declaration of Craig Rosenberg, Ph.D. (“Rosenberg”)
Ex. 2005      Dr. Rosenberg’s C.V.
Ex. 2006      Dr. Kazanzides Deposition (“Kazanzides Depo”)
`
`
`
I. INTRODUCTION
`The ’271 Patent uses augmented reality (AR) technology in an operating
`
`room. The Patent allows a physician to not only view, but also to navigate in real-
`
`time, a three-dimensional (3D) volume-rendered image of a patient’s anatomy
`
`while it is overlaid onto the actual patient in the operating room during pre-
`
`operative planning or surgery.
`
`More specifically, claims 1-6 provide a novel virtual tool (i.e., a “virtual 3D
`
`shape”) for navigating the 3D volumetric patient anatomy while it is projected onto
`
the actual patient in real-time. The Petition’s (Paper 3) attacks on the patentability of
`
`claims 1-6 lack merit foremost because Petitioner fails to appreciate what is
`
`claimed—a virtual tool (i.e., “a virtual 3D shape”) for real-time navigation of the
`
`projected 3D volumetric rendering. For example, Petitioner mistakenly equates the
`
`“virtual 3D shape” of claim 1 with a simple box that merely surrounds 3D models
`
`displayed on conventional computer monitors.
`
`Claims 11-20 provide a novel method for altering the color gradient of the
`
`3D volumetric patient anatomy to: (1) be better visible when projected onto an
`
`actual patient in an operating room; and (2) represent various tissue properties,
`
`including: hardness, relaxivity, echogenicity, enhancement amount, enhancement
`
`speed, density, radioactivity, and water content. The Petition attacks the
`
`patentability of claims 11-20 but does so without identifying a single reference that
`
`
`
`1
`
`
`
`Case IPR 2023-00042
`Patent 11,004,271 B2
`
`alters the color gradient of the 3D volumetric medical imaging to be better visible
`
`when projected onto an actual patient in an operating room. The Petition also fails
`
`to identify a single reference that alters the color gradient of 3D volumetric
`
`medical imaging when projected onto an actual patient in an operating room to
`
`specifically identify any of the tissue properties identified in claims 12-19.
`
`
`
Petitioner’s misconstruction of the claims and prior art¹ aside, the Petition
`
`makes many of its allegations in a conclusory fashion. Even in a tribunal with a
`
`lowered burden of proof, conclusory allegations are not evidence.
`
`II. BACKGROUND
`
`A. Technology Background
`
`Generally speaking, one source of two-dimensional (2D) medical imaging
`
`data is a computed tomography (CT) scan, which uses X-rays to create images of
`
`the inside of the body. Ex. 2002, Mulumudi, ¶ 48. X-ray sources and detectors are
`
`moved around the body, recording patient anatomy from different angles. Id.
`
`“Conventionally, a computer uses this 3D data to reconstruct a set of 2D images or
`
`slices.” Id. “This set of 2D slices are reconstructed along one of three planes:
`
`coronal (front to back), sagittal (left to right), or axial (top to bottom).” Id. “Each
`
`
`1 Discussed in greater detail below, much of the Petitioner’s asserted prior art has
`not been demonstrated to be actual prior art. Meaning, there is no evidence that the
`alleged publication was available and disseminated to the POSITA before the
`applicable priority date.
`
`
`
`2
`
`
`
`Case IPR 2023-00042
`Patent 11,004,271 B2
`
`of these 2D images represents distribution of X-ray attenuation in the patient’s
`
`body.” Id. “Each 2D CT scan image consists of a grid of 2D pixels.” Id.,
`
`¶ 49. “The value of the pixel corresponds to the radiodensity of the tissue within
`
`the pixel.” Id. “While each pixel represents a small volume element, or voxel, of
`
`the body, a 2D CT scan image provides no information about the patient anatomy
`
`between the slices.” Id.
`
`“Another common source of medical imaging data is a magnetic resonance
`
`imaging (MRI) scan, which uses a strong magnetic field and radio waves to create
`
`images of the inside of the body.” Id., ¶ 50. “The output of an MRI scan,
`
`conventionally, is one of three sets of 2D images or slices (coronal, sagittal, and
`
`axial), similar to a CT scan.” Id. “Again, each 2D image consists of a grid of 2D
`
`pixels.” Id. “The value of the pixel corresponds to the strength of the MRI signal
`
`received at that pixel, which depends on the type of tissue in the pixel.” Id. “The
`
`pixel value also depends on the MRI sequence used, among other factors.” Id.
`
`“While each pixel represents a small volume of element of the patient’s body, a 2D
`
`MRI scan image provides no information about layers of the patient anatomy
`
`between the 2D slices.” Id.
`
`“There are various methods for visualizing CT and MRI scan data, including
`
`‘slice-by-slice viewing,’ ‘surface rendering,’ and ‘direct volume rendering
`
`(DVR).’” Id., ¶¶ 51-58. “One of the earliest, and still most common, methods for
`
`
`
`3
`
`
`
`Case IPR 2023-00042
`Patent 11,004,271 B2
`
`visualizing medical imaging data is ‘slice-by-slice viewing.’” Id., ¶ 51. “This is
`
`where individual 2D slices are viewed sequentially, one at a time, in the direction
`
`in which they were compiled.” Id. “Traditionally, CT and MRI slices are acquired
`
`and viewed along only one axis, typically the axial or transverse plane (horizontal
`
`slices as if looking from the feet towards the head).” Id. “This ‘slice-by-slice
`
`viewing’ method does not provide a complete 3D view of anatomical structures
`
`within the volume of a patient.” Id.
`
`“Another method for visualizing a specific anatomical feature from CT
`
`and/or MRI data is ‘surface rendering.’” Id., ¶ 52. “This method involves creating
`
`a 3D surface model of an anatomical feature of interest from the medical imaging
`
`data.” Id. “This is done by first performing a segmentation step to identify the
`
`voxels within the original 3D dataset that belong to an anatomical structure of
`
`interest.” Id. “After segmentation, a polygonal model of the surface of the
`
`identified anatomical structure, i.e., a shell, is generated.” Id. “Since surface
`
`rendering only provides a view of the modeled surface of the previously identified
`
`anatomical structure, it is used when the focus is on a specific structure of
`
`interest.” Id. “But it does not provide information about the anatomical areas of
`
`interest within the structure to which a modeled surface was rendered.” Id.
`
`Surface rendering “relies on identifying and rendering the surfaces of
`
specific structures.” Id., ¶ 53. “As a result, minor errors or noise in the data can
`
`
`
`4
`
`
`
`Case IPR 2023-00042
`Patent 11,004,271 B2
`
`cause significant errors in the final surface rendering.” Id. “In surface rendering, it
`
`is common to apply thresholding to distinguish between different types of tissues
`
`or materials in the data.” Id. “However, if thresholds are not set correctly, this can
`
`lead to errors or omissions in the final surface rendering.” Id. “Also, surface
`
`rendering is unable to represent semi-transparent or overlapping structures
`
`accurately which can be a disadvantage with medical imaging where these types of
`
`structures are common.” Id.
`
As discussed below with respect to claim construction, the concept of “direct
`
`volume rendering (DVR) is very different from the other visualization methods
`
`described above.” Id., ¶ 54. “DVR is a method for visualizing medical imaging
`
`data, including CT and MRI scan or ultrasound data, in three-dimensions.”
`
`Id. DVR “can offer perspectives that are not limited to the traditional planes (e.g.,
`
`axial, sagittal, and coronal) used to navigate data, enabling a view of the
`
`anatomical structures within the volume of a patient from any direction.” Id.
`
`“Unlike surface rendering, DVR is not feature specific.” Id., ¶ 55. Rather, it
`
`“takes into account the complete volume of the patient data.” Id. “While surface
`
`models depict the external surfaces or boundaries of an object, DVR creates a 3D
`
`image by considering every individual voxel in the dataset allowing for the
`
`visualization of internal structures, and not just pre-selected outer surfaces of a
`
`particular anatomical feature.” Id.
`
`
`
`5
`
`
`
`Case IPR 2023-00042
`Patent 11,004,271 B2
`
`
`“Also, unlike surface rendering, DVR does not require the creation of an
`
`intermediate surface representation and does not require segmentation of the data
`
`into different structures or tissues.” Id., ¶ 56. “Instead, DVR operates directly on
`
`the original dataset (i.e., grid of voxels that make up the raw volume data).”
`
`Id. “Each voxel within the dataset is assigned a color and opacity based on transfer
`
`functions.” Id. “All the voxels are then composited, or blended together, to create
`
`a 3D image of the entire volume of the patient anatomy.” Id.
`
Navigation of that direct-volume-rendered data in an AR environment is one

subject of the ’271 Patent. A “common problem faced by AR systems is proper
`
`placement of virtual controls for managing virtual elements.” Ex. 1001, 1:29-30.
`
`“Virtual controls, while intended to aide a user in interacting with virtual elements,
`
`are often placed in positions in the live view that render them more of a hinderance
`
`than a help to the user.” Ex. 1001, 1:31-34.
`
`Generally speaking, the ’271 Patent provides methods for “augment[ing]
`
`real-time views of a patient with 3D data.” Ex. 1001, 3:23. The 3D data “may be
`
`automatically aligned, or registered, with [and projected onto] a real-time view of
`
`the actual patient.” Ex. 1001, 3:24-27. The “augmenting of real-time views of a
`
`patient with 3D data may include the display of a virtual user interface and other
`
`virtual controls for altering the [3D data] projected onto the real-time view of the
`
`patient.” Ex. 1001, 3:50-54. The “virtual user interface and … virtual controls
`
`
`
`6
`
`
`
`Case IPR 2023-00042
`Patent 11,004,271 B2
`
`may be projected to avoid obstructing the medical professional’s field of view
`
`when viewing the patient….” Ex. 1001, 3:54-56. The virtual user interface and
`
`virtual controls “allow the medical professional to quickly and easily alter the [3D
`
`data] projected onto the real-time view of the patient.” Ex. 1001, 3:60-62.
`
`An “augmented reality (AR) environment 100 … may include a 3D space
`
`102, a user 104, a patient 106, and an AR headset 108,” (Ex. 1001, 3:63-66), as
`
`illustrated in Figure 1, shown below.
`
`The AR environment “may also include a virtual user interface 114” and “a
`
`
`
`virtual spatial difference box 116.” Ex. 1001, 4:1-3. The “virtual spatial
`
`difference box 116 may be generated by the AR headset 108 to confine within a
`
`
`
`7
`
`
`
`Case IPR 2023-00042
`Patent 11,004,271 B2
`
`volume of the virtual spatial difference box 116 the projected inner layer of the
`
patient 106 from the 3D data.” Ex. 1001, 6:5-8. The virtual spatial difference
`
`box 116 may “assist the user when navigating the projected 3D data by providing a
`
`frame of reference for the user 104.” Ex. 1001, 6:11-13. The virtual spatial
`
`difference box “may assist the user when moving axial slices, coronal slices,
`
`sagittal slices, or oblique slices of the 3D data within the virtual spatial difference
`
`box.” Ex. 1001, 6:14-17. The slices may be three-dimensional (3D) and may
`
`include “curved slices, such as curved slices that follow the natural curve of an
`
`anatomical feature, or slices that have depth as well as height and width.” Ex.
`
`1001, 6:17-21. The “user 104 may move [the] slices using hand gestures that
`
`requires the user 104 to generally move his hand in the directions of the lines of the
`
`virtual spatial difference box 116…” Ex. 1001, 6:21-25.
`
B. The Challenged Claims of the ’271 Patent
`
`1.
`
`[A]
`
`[Pre] A method for augmenting real-time, non-image actual views of a patient
`with three-dimensional (3D) data, the method comprising:
`
`identifying 3D data for the patient, the 3D data including an outer layer
`of the patient and multiple inner layers of the patient; and
`
`displaying, in an augmented reality (AR) headset, one of the inner layers
`of the patient from the 3D data projected onto real-time, non-image
`actual views of the outer layer of the patient,
`
`
`[B]
`
`8
`
`
`
`
`
`
`
`Case IPR 2023-00042
`Patent 11,004,271 B2
`
`
`[C]
`
`the projected inner layer of the patient from the 3D data being confined
`within a volume of a virtual 3D shape.
`
`
`2. The method as recited in claim 1, wherein: the virtual 3D shape is a virtual box;
`and the virtual box includes a top side, a bottom side, a left side, a right side, a
`front side, and a back side.
`
`3. The method of claim 1, wherein: the virtual 3D shape is configured to be
`controlled to toggle between displaying and hiding lines of the virtual 3D shape;
`and the virtual 3D shape is configured to be controlled to reposition two
`dimensional (2D) slices and/or 3D slices of the projected inner layer of the patient
`from the 3D data.
`
`4. The method of claim 1, wherein lines of the virtual 3D shape are displayed.
`
`5. The method of claim 1, wherein lines of the virtual 3D shape are hidden.
`
`6. One or more non-transitory computer-readable media storing one or more
`programs that are configured, when executed, to cause one or more processors to
`perform the method as recited in claim 1.
`
`11.
`
`[A]
`
`[B]
`
`[Pre] A method for augmenting real-time, non-image actual views of a patient
`with three-dimensional (3D) data, the method comprising:
`
`identifying 3D data for the patient, the 3D data including an outer layer
`of the patient and multiple inner layers of the patient, the multiple inner
`layers of the patient having an original color gradient;
`
`altering the original color gradient of the multiple inner layers to be
`lighter than the original color gradient in order to be better visible when
`projected onto real-time, non-image actual views of the outer layer of
`the patient; and
`
`displaying, in an augmented reality (AR) headset, one of the inner layers
`of the patient from the 3D data projected onto real-time, non-image
`
`[C]
`
`
`
`9
`
`
`
`Case IPR 2023-00042
`Patent 11,004,271 B2
`
`
`actual views of the outer layer of the patient, the projected inner layer of
`the patient from the 3D data being having the altered color gradient.
`
`
`12. The method as recited in claim 11, wherein the altered color gradient
`represents a tissue hardness tissue property of the multiple inner layers of the
`patient.
`
`13. The method as recited in claim 11, wherein the altered color gradient
`represents a tissue relaxivity tissue property of the multiple inner layers of the
`patient.
`
14. The method as recited in claim 11, wherein the altered color gradient
represents a tissue echogenicity tissue property of the multiple inner layers of the
patient.
`
`15. The method as recited in claim 11, wherein the altered color gradient represents
`a tissue enhancement amount tissue property of the multiple inner layers of the
`patient.
`
`16. The method as recited in claim 11, wherein the altered color gradient
`represents a tissue enhancement speed tissue property of the multiple inner layers
`of the patient.
`
`17. The method as recited in claim 11, wherein the altered color gradient represents
`a tissue density tissue property of the multiple inner layers of the patient.
`
`18. The method as recited in claim 11, wherein the altered color gradient represents
`a tissue radioactivity tissue property of the multiple inner layers of the patient.
`
`19. The method as recited in claim 11, wherein the altered color gradient represents
`a tissue water content tissue property of the multiple inner layers of the patient.
`
`20. One or more non-transitory computer-readable media storing one or more
`programs that are configured, when executed, to cause one or more processors to
`perform the method as recited in claim 11.
`
`
`
`
`
`
`10
`
`
`
`Case IPR 2023-00042
`Patent 11,004,271 B2
`
`III. CLAIM CONSTRUCTION
`Patent claims are construed by evaluating claim language, the specification,
`
`and the prosecution history. See Phillips v. AWH Corp., 415 F.3d 1303, 1314–18
`
`(Fed. Cir. 2005). There is a heavy presumption that a term carries its ordinary
`
`meaning. CCS Fitness, Inc. v. Brunswick Corp., 288 F.3d 1359, 1366 (Fed. Cir.
`
`2002).
`
A.  “three-dimensional (3D) data … including an outer layer of the
    patient and multiple inner layers of the patient”

Claim Term: “three-dimensional (3D) data … including an outer layer of the
patient and multiple inner layers of the patient” (claims 1 and 11)

Petitioner’s Construction: “one or more of MRI images, Computerized
Tomography (CT) scan images, X-ray images, Positron Emission Tomography
(PET) images, ultrasound images, fluorescence, Infrared Thermography (IRT)
images, and Single-Photon Emission Computer Tomography (SPECT) scan
image”

Patent Owner’s Construction: “Direct-volume-rendered CT, MRI, PET, and
SPECT imaging (and also ultrasound and fluorescence imaging, depending on
the methods used).”
Claims 1 and 11 of the ’271 Patent recite “real-time” augmentation of “non-image

actual views of a patient with three-dimensional (3D) data.” In its barest
`
`
`
`11
`
`
`
`Case IPR 2023-00042
`Patent 11,004,271 B2
`
form, a “POSITA would understand, and [Petitioner’s expert] Kazanzides²
`
`admitted, that ‘something is three-dimensional if it has the dimension of depth as
`
`well as width and height.’” Ex. 2002, Mulumudi, ¶ 68 (citing Ex. 2006,
`
`Kazanzides Depo, 58:11-15). In other words, “a 3D volume has x-, y-, and z-data
`
`in a Cartesian coordinate system.” Id.
`
Within the context of the ’271 Patent, “3D data of the patient … may
`
`include, but is not limited to, MRI images, Computerized Tomography (CT) scan
`
`images, X-ray images, Positron Emission Tomography (PET) images, ultrasound
`
`images, fluorescence images, Infrared Thermography (IRT) images, and Single-
`
`Photon Emission Computer Tomography (SPECT) scan image, or some
`
`combination thereof.” Ex. 1001, 11:45-51. With those fundamentals as a starting
`
`
² Professor Kazanzides’ relationship with Petitioner and Johns Hopkins University,
a competitor of Patent Owner, calls into question his objectivity and the reliability
`of his testimony. Kazanzides is a professor at Johns Hopkins University.
`Ex. 2006, Kazanzides Depo, 18:3-5. Johns Hopkins competes with Patent Owner.
`Kazanzides does research for Johns Hopkins related to the use of head mounted
`displays in augmented reality (AR) surgical applications. Id., 20:7-24:16.
`Kazanzides works with physicians at Johns Hopkins to conduct his research. Id.,
`24:17-25:9. Kazanzides has co-authored papers with Johns Hopkins physicians
about the use of AR in surgery. Id., 24:21-25:9. Kazanzides is also a named
`inventor on several related patents, owned by Johns Hopkins, including U.S. Patent
`No. 11,244,508, entitled, “Augmented Reality Display for Surgical Procedures.”
`Ex. 1012, ¶ 12. Finally, at least one of Kazanzides’ former students currently
`works for Petitioner and that relationship resulted in Kazanzides’ request for a
`grant involving augmented reality, which was later declined. Ex. 2006, Kazanzides
`Depo, 12:2-14.
`
`
`
`12
`
`
`
`Case IPR 2023-00042
`Patent 11,004,271 B2
`
`point, “conventional medical imaging systems generally generate 2D images or a
`
`series of 2D images taken along a particular axis of the patient anatomy (e.g., axial,
`
`sagittal, or coronal).” Ex. 2002, Mulumudi, ¶ 68. Petitioner’s expert, Professor
`
`Kazanzides, explained, “It’s like a stack of pancakes through the patient, a stack of
`
`2D slices.” Ex. 2006, Kazanzides Depo, 62:7-8. However, unlike an actual
`
`pancake, which has height, width, and depth, Kazanzides admitted that each of
`
`these 2D images has no “z information” other than that “common to the slice.” Ex.
`
`2006, Kazanzides Depo, 64:1-7.
`
`
`
With specific reference to the ’271 Patent, claim 1 requires “real-time”
`
`augmentation of “non-image actual views of a patient with three-dimensional (3D)
`
`data.” Ex. 1001, Claim 1. A POSITA “would understand that this 3D data must
`
`have height, width, and depth.” Ex. 2002, Mulumudi, ¶ 69. Claim 1 also requires
`
that the “3D data includ[e] an outer layer of the patient and multiple inner layers
`
`of the patient.” And according to the specification, the inner layers of the patient
`
`also “have depth as well as height and width.” Ex. 1001, 6:17-21 (“3D slices may
`
`include curved slices, such curved slices that follow the natural curve of an
`
`anatomical feature, or slices that have a depth as well as a height and width.”).
`
`Claim 1 further requires that the projected 3D data be “confined within a volume
`
of [a] virtual 3D shape.” Ex. 1001, Claim 1.
`
`
`
`13
`
`
`
`Case IPR 2023-00042
`Patent 11,004,271 B2
`
`
`
Based on the foregoing, a POSITA would understand that direct volume
`
`rendering “is the only visualization method that is: three dimensional (3D),
`
`includes an outer layer of the patient and multiple inner layers of the patient (where
`
`the layers also have depth as well as height and width), and is capable of being
`
confined³ within a virtual 3D shape.” Ex. 2002, Mulumudi, ¶ 69; see also id. at
`
`¶ 60 (“only DVR provides the comprehensiveness, detail, and flexibility to
`
`visualize and navigate the internal layers of a projected 3D volume of a patient in
`
`real-time: “all the way through the patient,” at a “certain partial depth into the
`
patient,” and along multiple axes, as described and claimed in the ’271 Patent.”).
`
`
`
`As such, the term “3D data … including an outer layer … and multiple inner
`
`layers of the patient” means “direct-volume-rendered three-dimensional (3D)
`
`imaging of the anatomy of a patient, including an outer layer of the patient and
`
`multiple inner layers of the patient.” Ex. 2002, Mulumudi, ¶ 70.
`
B.  “inner layer(s) of the patient”

Claim Term: “inner layer(s) of the patient” (claims 1 and 11)

Petitioner’s Construction:

Patent Owner’s Construction: “A 3D direct volume rendering of the anatomy of
the patient at a certain depth below the outer layer, or skin, of the patient.”
`
`
³ Patent Owner’s construction of “confined within a virtual 3D shape” is explained
below. See infra at 15.
`
`
`
`14
`
`
`
`Case IPR 2023-00042
`Patent 11,004,271 B2
`
`
`Claims 1 and 11 of the ’271 Patent require “real-time” augmentation of
`
“non-image actual views of a patient with three-dimensional (3D) data.” Based on
`
`the plain and ordinary meaning of the term “three-dimensional,” a POSITA would
`
`understand that the projection onto the real, non-image, actual views of the patient
`
`must have height, width, and depth. Ex. 2002, Mulumudi, ¶ 69. This is true
`
`whether the entire volume of the patient, or some lesser volume (i.e., “one of the
`
`inner layers of the patient”), is projected onto the patient. See id., ¶¶ 67-72.
`
`The ’271 Patent explains that the “multiple inner layers may be layers that
`
`go all the way through the patient 106, or may be layers that only go to a certain
`
partial depth into the patient.” Ex. 1001, 12:13-16. In other words, the inner layers will
`
`always have depth (i.e., a z-component) associated with their three-dimensional
`
geometry. Together with the definition of 3D data above, an “inner layer of the
`
`patient” means “a 3D volume rendering of some anatomy of the patient at a certain
`
`depth below the outer layer, or skin, of the patient.” Ex. 2002, Mulumudi, ¶¶ 71-
`
`72.
`
C.  “confined within a virtual 3D shape”
`
`Claim Term
`
`Petitioner’s Construction
`
`“confined with