(12) INTER PARTES REVIEW CERTIFICATE (1505th)
United States Patent                (10) Number: US 9,329,675 K1
Ojelund et al.                      (45) Certificate Issued: Oct. 25, 2019

(54) SYSTEM WITH 3D USER INTERFACE INTEGRATION

(71) Applicants: Henrik Ojelund; David Fischer; Karl-Josef Hollenbeck

(72) Inventors: Henrik Ojelund; David Fischer; Karl-Josef Hollenbeck

(73) Assignee: 3SHAPE A/S

Trial Number:
IPR2018-00197 filed Nov. 22, 2017

Inter Partes Review Certificate for:
Patent No.: 9,329,675
Issued: May 3, 2016
Appl. No.: 13/991,513
Filed: Jun. 4, 2013

The results of IPR2018-00197 are reflected in this inter partes review certificate under 35 U.S.C. 318(b).
Align EX1046 (Part 1 of 2), Align v. 3Shape, IPR2022-00145
INTER PARTES REVIEW CERTIFICATE
U.S. Patent 9,329,675 K1
Trial No. IPR2018-00197
Certificate Issued Oct. 25, 2019

AS A RESULT OF THE INTER PARTES REVIEW PROCEEDING, IT HAS BEEN DETERMINED THAT:

Claims 1-19 are cancelled.

* * * * *
Trials@uspto.gov                                              Paper 22
571-272-7822                                    Entered: May 29, 2019

UNITED STATES PATENT AND TRADEMARK OFFICE

BEFORE THE PATENT TRIAL AND APPEAL BOARD

ALIGN TECHNOLOGY, INC.,
Petitioner,

v.

3SHAPE A/S,
Patent Owner.

Case IPR2018-00197
Patent 9,329,675 B2

Before ELENI MANTIS MERCADER, MICHELLE N. WORMMEESTER, and JESSICA C. KAISER, Administrative Patent Judges.

MANTIS MERCADER, Administrative Patent Judge.

FINAL WRITTEN DECISION
35 U.S.C. § 318(a)
I. INTRODUCTION

A. Background

Align Technology, Inc. ("Petitioner") filed a Petition requesting an inter partes review of claims 1-19 of U.S. Patent No. 9,329,675 B2 (Ex. 1001, "the '675 patent"). Paper 2 ("Pet."). 3Shape A/S ("Patent Owner") filed a Preliminary Response. Paper 5 ("Prelim. Resp.").

Upon consideration of the Petition, the Preliminary Response, and the associated evidence, we instituted trial to determine whether claims 1, 2, 9-11, and 18 are anticipated under 35 U.S.C. § 102 by Kriveshko,1 whether claims 1-5, 8-11, and 14-19 would have been obvious under 35 U.S.C. § 103 over Kriveshko in combination with Serra,2 and whether claims 6, 7, 12, and 13 would have been obvious under 35 U.S.C. § 103 over Kriveshko in combination with Serra and Brennan.3 See Paper 7, 6, 31 ("Institution Decision" or "Inst. Dec."). After institution of trial, Patent Owner filed a Patent Owner Response. Paper 11 ("PO Resp."). Petitioner replied. Paper 14 ("Pet. Reply").

An oral hearing was conducted on February 4, 2019. A transcript of that hearing is entered in the record. See Paper 21 ("Tr.").

We have jurisdiction under 35 U.S.C. § 6. This decision is a Final Written Decision under 35 U.S.C. § 318(a) as to the patentability of claims 1-19 of the '675 patent. For the reasons discussed below, we hold that Petitioner has demonstrated by a preponderance of the evidence that claims 1-19 of the '675 patent are unpatentable.

1 US 2007/0171220 A1 (July 26, 2007) ("Kriveshko"; Ex. 1005).
2 US 2006/0020204 A1 (Jan. 26, 2006) ("Serra"; Ex. 1006).
3 US 8,903,476 B2 (Dec. 2, 2014) ("Brennan"; Ex. 1007).
B. Related Matters

The parties identify inter partes review proceeding IPR2018-00198 that also challenges the '675 patent. Pet. 55; Paper 4, 1. Patent Owner further submits that the following is a list of judicial and administrative matters that would affect, or be affected by, a decision in this proceeding: Align Technology, Inc. v. 3Shape A/S, Petition for Inter Partes Review of U.S. Patent No. 9,329,675 B2, filed on November 22, 2017; U.S. Provisional Application No. 61/420,138, filed on December 6, 2010; and PCT International Application No. PCT/DK2011/050461, filed on December 5, 2011. Paper 4, 1.

Petitioner states that the '675 patent has not been involved in any litigation proceedings. Pet. 55.

C. The '675 Patent

The '675 patent relates to handheld intraoral scanner device 100 and computer screen 101. Ex. 1001, Fig. 1, 11:29-31. Operator 102 uses the intraoral scanner 100 to record some intraoral 3D geometry and the user interface functionality to rotate, pan, and zoom displayed 3D model 105 of the scanned data on computer screen 101. Id. at 11:31-37. The integration of the user interface functionality in device 100 is provided by motion sensors (not visible), which can be accelerometers inside scanner 100, whose readings determine the orientation of 3D model 105 of the teeth acquired by scanner 100 on computer screen 101. Id. at 11:37-42. Figure 1 of the '675 patent is reproduced below.
[Figure 1 of the '675 patent: operator 102 using intraoral scanner 100, with 3D model 105 displayed on computer screen 101]

Figure 1 above shows operator 102 using intraoral scanner 100 to record some intraoral 3D geometry and displayed 3D model 105 of the scanned data on computer screen 101. Id. at Fig. 1, 11:31-37.

The 3D user interface functionality is provided by at least one motion sensor built into or on the device. Id. at 6:46-56. Two different types of motion sensors are described. Id. at 6:48-59. One type of motion sensor includes accelerometers, gyros, and magnetometers, which can sense rotations, lateral motion, and/or combinations thereof. Id. at 6:48-51. Another type of motion sensor uses infrared sensing. Id. at 6:51. At least one infrared sensor is mounted on the device, and at least one infrared emitter can be mounted in the surroundings of the device. Id. at 6:51-54. Conversely, the at least one emitter can be mounted on the device, and the at least one sensor in the surroundings. Id. at 6:54-56. Another possibility is to use infrared reflector(s) on the device, and both sensor(s) and emitter(s) on the surroundings. Id. at 6:56-58.
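The three infrared arrangements reduce to a choice of which component sits on the device and which in its surroundings. A minimal sketch of that taxonomy (our illustration only; the enum and its names are not from the patent or the record):

```python
from enum import Enum, auto

class IRArrangement(Enum):
    """Three infrared tracking arrangements (cf. Ex. 1001, 6:51-58)."""
    SENSOR_ON_DEVICE = auto()     # IR sensor on device; emitter(s) in surroundings
    EMITTER_ON_DEVICE = auto()    # IR emitter on device; sensor(s) in surroundings
    REFLECTOR_ON_DEVICE = auto()  # IR reflector(s) on device; both sensor(s)
                                  # and emitter(s) in the surroundings
```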
An example of user interface functionality in the form of remote controlling using the handheld device to determine the view to be displayed is provided by Figures 2a and 2b and respective descriptive disclosure. Id. at 11:9-42. The motion sensors (not shown) in handheld device 100, i.e., scanner, allow user 102 to determine the view shown on the display 101, i.e., screen, by moving handheld device 100. Id. at 11:10-14.

The operation functionality of device 100 is to record some intraoral 3D geometry, and the user interface functionality is to rotate, pan, and zoom 3D model 105 of the scanned data on computer screen 101. Id. at 11:32-37. The integration of the user interface functionality in device 100 is provided by motion sensors (not visible), which can be accelerometers inside scanner 100, whose readings determine the orientation of 3D model 105 of the teeth acquired by scanner 100 on computer screen 101. Id. at 11:37-42.
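To illustrate how accelerometer readings can determine a displayed orientation (a minimal sketch under our own assumptions; the '675 patent describes the behavior but discloses no algorithm): when the scanner is held still, the accelerometer measures the gravity vector, and the direction of that vector in device coordinates yields tilt angles that can drive the view.

```python
import math

def view_angles_from_accelerometer(ax: float, ay: float, az: float):
    """Derive pitch and roll (in degrees) from a static accelerometer reading.

    At rest the accelerometer measures only gravity, so the direction of
    the (ax, ay, az) vector in device coordinates gives the device's tilt.
    """
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Pointing the scanner down produces a large pitch, so the 3D model would
# be rendered from a downward viewing angle (cf. Fig. 2a); holding it
# horizontally gives a pitch near zero, a frontal view (cf. Fig. 2b).
```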
The user interface functionality is illustrated in Figure 2a, reproduced below. Figure 2a shows that pointing device 100 down can provide 3D model 105 of the scanned teeth shown from a downward viewing angle. Id. at 11:15-17.

[Figure 2a of the '675 patent: operator 102 pointing scanner 100 downwards, with 3D model 105 shown from a downward viewing angle]

Figure 2a above shows operator 102 using intraoral scanner 100 pointed downwards to provide 3D model 105 of the scanned teeth shown from a downward viewing angle. Id. at Fig. 2a, 11:15-17.

Figure 2b reproduced below shows that holding the scanner in a horizontal position can provide that the viewing angle is likewise horizontal from the front, such that 3D model 105 of the scanned teeth is shown from the front. Id. at 11:18-21.
[Figure 2b of the '675 patent: operator 102 holding scanner 100 horizontally, with 3D model 105 shown from the front]

Figure 2b above shows operator 102 using intraoral scanner 100 pointed horizontally to provide 3D model 105 of the scanned teeth shown from a horizontal frontal viewing angle. Id. at Fig. 2b, 11:18-21.

Additional functionality to start/stop scanning is provided by button 103 as seen in Figure 3. Id. at Fig. 3, 11:42-45. Figure 3 of the '675 patent is reproduced below.

[Figure 3 of the '675 patent: handheld scanner with button 103 (reference numerals 106 and 107 shown)]

Figure 3 above shows button 103 being located where the user's index finger can reach it conveniently. Id. at 11:44-46. When the button is pressed quickly, the handheld device is prepared for scanning, e.g., it is set for performing at least one action, the scanning procedure, in the physical 3D environment. Id. at 3:58-61. The scanning is stopped when the button is pressed quickly a second time. Id. at 3:61-63. While the scanning is performed, a virtual 3D representation is visually built on the display, and the user can press and hold the button. Id. at 3:61-66. This action puts the handheld device in a controller mode, where the handheld device is adapted for remotely controlling the view with which the 3D environment, such as scanned teeth, is represented on the display. Id. at 3:66-4:3. While the button is pressed, the system will use signals from a motion sensor in the handheld device to determine how to present the view of the virtual 3D environment on computer screen 101. Id. at 4:3-5.
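The press semantics just described amount to a small state machine: a quick press toggles scanning on and off, while press-and-hold temporarily enters the controller mode in which motion-sensor signals drive the view. A minimal sketch of that logic (ours, not the patent's; the hold threshold and the restore-on-release behavior are assumptions):

```python
SCAN_IDLE, SCANNING, VIEW_CONTROL = "idle", "scanning", "view-control"
HOLD_THRESHOLD_S = 0.5  # assumption: a press longer than this is a "hold"

class HandheldModes:
    """Quick click toggles scanning; press-and-hold enters view control."""

    def __init__(self):
        self.mode = SCAN_IDLE
        self._pressed_at = None
        self._mode_before_hold = None

    def button_down(self, now: float):
        self._pressed_at = now

    def poll(self, now: float):
        """Call periodically; a sustained press switches to view control."""
        if (self._pressed_at is not None and self.mode != VIEW_CONTROL
                and now - self._pressed_at >= HOLD_THRESHOLD_S):
            self._mode_before_hold = self.mode
            self.mode = VIEW_CONTROL  # motion-sensor signals now steer the view

    def button_up(self, now: float):
        quick = now - self._pressed_at < HOLD_THRESHOLD_S
        self._pressed_at = None
        if self.mode == VIEW_CONTROL:
            self.mode = self._mode_before_hold  # release exits controller mode
        elif quick:
            # A quick press starts scanning; a second quick press stops it.
            self.mode = SCANNING if self.mode == SCAN_IDLE else SCAN_IDLE
```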
D. The Challenged Claims

Petitioner challenges claims 1-19 of the '675 patent. Claims 1 and 19 are independent and are reproduced below:

    1. A scanning system for scanning a 3D environment, the scanning system comprising:
        a handheld device including an optical scanner, wherein the 3D environment to be scanned is selected by pointing the optical scanner at the 3D environment; and
        at least one display remotely connected to the handheld device,
        wherein the handheld device is adapted for performing at least one scanning action in a physical 3D environment, and the at least one display is adapted for visually representing the physical 3D environment; and
        the handheld device includes a user interface for remotely controlling the display to adjust the view with which the 3D environment is represented on the display.

Ex. 1001, 15:29-42.

    19. A system comprising:
        a handheld device and at least one display;
        wherein the handheld device is adapted for switching between performing at least one action in a physical 3D environment, wherein the at least one display is adapted for visually representing the physical 3D environment; and remotely controlling the display to adjust the view with which the 3D environment is represented on the display;
        wherein the handheld device is an intra-oral 3D scanner and the at least one action performed in the physical 3D environment is scanning and that the view is remotely controlled by at least one motion sensor arranged in the handheld device, and
        wherein an actuator provided on the handheld device switches between performing the at least one action and remotely controlling the view.

Id. at 16:35-50.
E. Instituted Grounds of Unpatentability

On April 24, 2018, the Supreme Court issued its decision in SAS Institute Inc. v. Iancu, 138 S. Ct. 1348 (2018). Consistent with the Supreme Court's decision in SAS Institute Inc., as well as PGS Geophysical AS v. Iancu, 891 F.3d 1354, 1360 (Fed. Cir. 2018), we instituted a trial on all the asserted grounds of unpatentability, which are set forth in the table below.

Reference(s)                       Basis    Challenged Claims
Kriveshko                          § 102    1, 2, 9-11, and 18
Kriveshko and Serra                § 103    1-5, 8-11, and 14-19
Kriveshko, Serra, and Brennan      § 103    6, 7, 12, and 13

Inst. Dec. 6. Petitioner relies on the declaration of Chandrajit L. Bajaj, Ph.D. for support (Ex. 1003). With its Response, Patent Owner submits the declaration of Ravin Balakrishnan, Ph.D. (Ex. 2011). The transcripts of the depositions of Dr. Bajaj and Dr. Balakrishnan are entered in the record as Exhibits 2008 and 1037, respectively.
F. Level of Ordinary Skill in the Art

In determining whether an invention would have been obvious to an ordinarily skilled artisan at the time it was made, we consider the level of ordinary skill in the pertinent art at the time of the invention. Graham v. John Deere Co. of Kansas City, 383 U.S. 1, 17 (1966). "The importance of resolving the level of ordinary skill in the art lies in the necessity of maintaining objectivity in the obviousness inquiry." Ryko Mfg. Co. v. Nu-Star, Inc., 950 F.2d 714, 718 (Fed. Cir. 1991). The person of ordinary skill in the art is a hypothetical person who is presumed to have known the relevant art at the time of the invention. In re GPAC, Inc., 57 F.3d 1573, 1579 (Fed. Cir. 1995). The level of ordinary skill in the art may be reflected by the prior art of record. Okajima v. Bourdeau, 261 F.3d 1350, 1355 (Fed. Cir. 2001). Factors that may be considered in determining the level of ordinary skill in the art include, but are not limited to, the types of problems encountered in the art, the sophistication of the technology, and educational level of active workers in the field. GPAC, 57 F.3d at 1579. In a given case, one or more factors may predominate. Id. Generally, it is easier to establish obviousness under a higher level of ordinary skill in the art. Innovention Toys, LLC v. MGA Entm't, Inc., 637 F.3d 1314, 1323 (Fed. Cir. 2011) ("A less sophisticated level of skill generally favors a determination of nonobviousness ... while a higher level of skill favors the reverse.").

Relying on the declaration testimony of Dr. Bajaj, Petitioner contends that a person of ordinary skill ("POSITA") at the relevant time would have had a bachelor's degree in computer engineering, computer science, computer vision or an equivalent field, as well as at least one or two years of industry experience in three-dimensional imaging systems, or at least five years of comparable industry experience in three-dimensional imaging systems. Pet. 13 (citing Ex. 1003 ¶¶ 19-22). In particular, according to Petitioner, a POSITA would have had experience with, and knowledge of, three-dimensional imaging systems. Id.

Patent Owner responds that Petitioner's definition of an ordinarily skilled artisan is inadequate at least because Petitioner's definition does not take into account that the '675 Patent relates to user interfaces. PO Resp. 7-8 (citing Ex. 1001's title ("System with 3D User Interface Integration"); Ex. 2011 ¶¶ 40-43).

The parties' dispute regarding the level of ordinary skill is based on the type of relevant experience. Patent Owner's declarant agrees a POSITA would have had "a bachelor's degree in computer engineering, computer science, computer vision or an equivalent field, as well as at least one or two years of industry or research experience," but Patent Owner's declarant testifies that experience would have been "with user interfaces used in three-dimensional imaging systems" rather than three-dimensional imaging systems generally. Ex. 2011 ¶ 41. Patent Owner's declarant also notes that his opinions would not change under either formulation. Id. ¶ 43.

Based on the evidence of record, including the testimony of the parties' declarants as cited above, the subject matter at issue, and the prior art of record, we determine that Patent Owner's proposed skill level is appropriate, and we adopt Patent Owner's articulation of the level of ordinary skill in the art. Our analysis, however, would not differ under either party's definition, and this is consistent with Patent Owner's declarant's statement noting that his opinions would not change under either formulation of the level of the ordinary skilled artisan. See Ex. 2011 ¶ 43.
II. DISCUSSION

A. Claim Construction

Claims in an unexpired patent subject to inter partes review are given their broadest reasonable interpretation in light of the specification of the patent in which they appear. 37 C.F.R. § 42.100(b) (2017)4; Cuozzo Speed Techs., LLC v. Lee, 136 S. Ct. 2131 (2016). Consistent with the broadest reasonable construction, claim terms are presumed to have their ordinary and customary meaning as understood by a person of ordinary skill in the art in the context of the entire patent disclosure. In re Translogic Tech., Inc., 504 F.3d 1249, 1257 (Fed. Cir. 2007).

The breadth of a claim term can be limited in two instances: (1) where the Specification reveals a special definition given to a claim term by the patentee acting as a lexicographer that differs from the meaning it would otherwise possess (see CCS Fitness, Inc. v. Brunswick Corp., 288 F.3d 1359, 1366 (Fed. Cir. 2002)); or (2) where the Specification reveals an intentional disclaimer, or disavowal, of claim scope by the inventor (see SciMed Life Sys., Inc. v. Advanced Cardiovascular Sys., Inc., 242 F.3d 1337, 1343-44 (Fed. Cir. 2001)). An inventor may provide a meaning for a term that is different from its ordinary meaning by defining the term in the specification with reasonable clarity, deliberateness, and precision. In re Paulsen, 30 F.3d 1475, 1480 (Fed. Cir. 1994).

4 A recent amendment to this rule does not apply here because the Petition was filed before November 13, 2018. See "Changes to the Claim Construction Standard for Interpreting Claims in Trial Proceedings Before the Patent Trial and Appeal Board," 83 Fed. Reg. 51,340, 51,340 (Oct. 11, 2018) (to be codified at 37 C.F.R. pt. 42).
Intrinsic evidence "is the most significant source of the legally operative meaning of disputed claim language." Vitronics Corp. v. Conceptronic, Inc., 90 F.3d 1576, 1582 (Fed. Cir. 1996). When the specification is clear about the scope and content of a claim term, there is no need to turn to extrinsic evidence for claim interpretation. 3M Innovative Props. Co. v. Tredegar Corp., 725 F.3d 1315, 1326-28 (Fed. Cir. 2013).

The parties separately argue proposed constructions for various limitations of the claims. See Pet. 9-12; PO Resp. 5-7; Pet. Reply 2-4, 17-25. In light of the parties' arguments and evidence developed during trial, we address two claim terms: (1) "motion sensor"; and (2) "user interface." See Vivid Techs., Inc. v. Am. Sci. & Eng'g, Inc., 200 F.3d 795, 803 (Fed. Cir. 1999) ("only those terms need be construed that are in controversy, and only to the extent necessary to resolve the controversy").
`1. "motion sensor"
`
`Claims 4, 5, 7, 8, and 19 of the '675 patent recite a "motion sensor."
`
`Ex. 1001, 15:50-16:9, 16:35-50. Petitioner and Patent Owner agree that the
`
`term "motion sensor" requires "[a] sensor detecting motion." Pet. 11; PO
`
`Resp. 6; Pet. Reply 16 (all citing Ex. 1001, 10:35). Petitioner contends that
`
`"[ w ]here the parties diverge is in regard as to what sensors qualify as motion
`
`sensors." Pet. Reply 16. Petitioner alleges that Patent Owner "seeks to limit
`
`the scope of the claimed 'motion sensor' to exclude sensors that collect
`
`position and orientation data." Id. at 17.
`
`Petitioner contends that the '675 patent specification defines a motion
`
`sensor as
`
`[a] sensor detecting motion. Motion can be detected by: sound
`( acoustic sensors), opacity ( optical and infrared sensors and
`video image processors), geomagnetism (magnetic sensors,
`
`14
`
`0016
`
`

`

`IPR2018-00197
`Patent 9,329,675 B2
`
`magnetometers), reflection of transmitted energy (infrared laser
`radar, ultrasonic sensors, and microwave radar sensors),
`electromagnetic induction (inductive-loop detectors), and
`vibration ( triboelectric, seismic, and inertia-switch sensors).
`MEMS accelerometers, gyros, and magnetometers are examples
`of motions sensors.
`Pet. 11 (citing Ex. 1001, 10:35-39).
`
`Petitioner further references the '675 patent specification for the
`
`teaching of infrared sensors mounted on the device to track the probe in the
`surroundings. Pet. Reply 18 (citing Ex. 1001, 6:46-59). Petitioner contends
`that Patent Owner's own extrinsic evidence (Ex. 2013) explains that infrared
`sensors, such as the '675 patent's infrared sensors, function as motion
`sensors by tracking position and orientation. Id. at 19. Petitioner further
`points us to the deposition testimony of Patent Owner's declarant
`
`Dr. Balakrishan that infrared ("IR") trackers can track position and
`
`orientation. Id. ( citing Ex. 103 7, 44: 15-17). Petitioner contends that
`
`Dr. Balakrishan acknowledged that the '675 patent does not provide any
`"particular way to sense motion." Id. Petitioner contends that the '675
`patent recitation of "motion sensors" is sufficiently broad to include sensors
`
`that sense motion by tracking position and orientation. Id. at 20.
`
Patent Owner asserts that a 3D tracking system that detects 3D position and orientation and records time does not constitute a motion sensor because it detects 3D position data and not 3D motion data. PO Resp. 19 (citing Ex. 1006 ¶ 74, Fig. 7; Ex. 2011 ¶ 59). Patent Owner asserts a distinction exists between a 3D sensor that detects "absolute values" of position as opposed to a motion sensor that detects "relative values" equated with motion of "how far they move rather than where they are." Id. at 26 (quoting Ex. 2013, 92). During the hearing, Patent Owner's counsel stated "you can use many different technologies to detect motion, but the fact of the matter is a motion sensor detects motion. It's different than detecting position. It's different than tracking position. It's different than sensing orientation and it's different from tracking orientation." Tr. 28:5-8. Patent Owner's counsel acknowledged, however, that "[m]otion can be a change in orientation and position." Id. at 30:1-2.
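The parties' dispute can be made concrete (a sketch of our own, not evidence of record): a tracker that reports absolute position and orientation yields relative motion by differencing consecutive samples, which is the sense in which Petitioner says such a sensor senses motion, and precisely the derivation Patent Owner says a motion sensor should not need.

```python
from dataclasses import dataclass

@dataclass
class PoseSample:
    t: float            # timestamp in seconds
    position: tuple     # absolute (x, y, z)
    orientation: tuple  # absolute (yaw, pitch, roll) in degrees

def motion_between(a: PoseSample, b: PoseSample):
    """Relative motion between two absolute pose samples.

    Differencing consecutive absolute readings gives "how far they move
    rather than where they are": a displacement and rotation over dt.
    """
    dt = b.t - a.t
    displacement = tuple(q - p for p, q in zip(a.position, b.position))
    rotation = tuple(q - p for p, q in zip(a.orientation, b.orientation))
    return displacement, rotation, dt
```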
We agree with Petitioner that the '675 patent specification teaches a non-exclusive, extensive list of motion sensors including acoustic, optical, and infrared sensors. Pet. 11 (citing Ex. 1001, 10:35-43). The non-exhaustive list of motion sensors includes accelerometers. Id.

According to the '675 patent specification, the system uses signals from a motion sensor in or on the handheld device to determine how to present the view of the virtual 3D environment on computer screen 101. Ex. 1001, 4:3-5. The function of the motion sensor used in the '675 patent is described in pertinent part as follows:

    The integration of the user interface functionality in the device 100 is provided by motion sensors (not visible), which can be accelerometers inside the scanner 100, whose readings determine the orientation, as seen in FIGS. 2a and 2b, of the display on the screen of the 3D model 105 of the teeth acquired by the scanner 100.

Ex. 1001, 11:37-42 (emphasis added).

The particular descriptions of Figures 2a and 2b show that the viewing angle changes based on holding or pointing the scanner downwards or horizontally. Ex. 1001, 11:15-21. Thus, the '675 patent specification contradicts Patent Owner's assertion that the motion sensor excludes sensors that detect position and orientation. See PO Resp. 19 (citing Ex. 1006 ¶ 74, Fig. 7; Ex. 2011 ¶ 59). In fact, the '675 patent specification discloses that the reading of the orientation of the motion sensor at a downward angle of the scanner shows a downward viewing angle displayed in Figure 2a (see Ex. 1001, 11:15-17, Fig. 2a), compared to the reading of the orientation of the motion sensor at a horizontal direction, which shows a horizontal viewing angle displayed in Figure 2b (see id. at 11:18-21, Fig. 2b).

Thus, we agree with Petitioner that the '675 patent's "motion sensors" do not exclude sensors that track position and orientation. See Pet. Reply 17-20.
Patent Owner further contends the '675 patent specification discloses that position and orientation data is from the 3D image data recorded by the handheld device, not from the motion sensor. PO Resp. 22 (citing Ex. 1001, 9:1-3 ("The 3D data recorded by the handheld device can be registered in real time with the a-priori data, such that the position and orientation of the device can be detected."), 1:24-25, 1:32-35 (disclosing that "3D data" refers to displayed image data)). Patent Owner concludes that the '675 patent distinguishes position and orientation data from motion data detected by a motion sensor. Id. (citing Ex. 2011 ¶ 63).

We do not agree with Patent Owner's contention. The context for the cited embodiment is provided below:

    In some embodiments the handheld device is a mechanical tool. In some embodiments, the tool has at least one motion sensor built in. In other embodiments, other user-interface elements are built in as well, for example buttons, scroll wheels, touch-sensitive fields, or proximity sensors.
    In some embodiment[s] the 3D geometry of the 3D environment is known a-priori or a 3D representation of the environment is known a priori, i.e. before the action(s) are performed. For example in surgery, a CT scan may have been taken before the surgical procedure. The handheld device in this example could be a surgical instrument that a physician needs to apply in the proper 3D position. To make sure this proper position is reached, it could be beneficial to view the 3D environment from multiple perspectives interactively, i.e. without having to release the surgical instrument.
    An advantage of the system, also in the above surgery example, is the ability of the handheld device to record the 3D environment at least partially, typically in a 3D field-of-view that is smaller than the volume represented in the a-priori data. The 3D data recorded by the handheld device can be registered in real time with the a-priori data, such that the position and orientation of the device can be detected.

Ex. 1001, 8:49-9:3 (emphases added). Furthermore, the '675 patent specification discloses that the system uses signals from a motion sensor in the handheld device to determine how to present the view of the virtual 3D environment on computer screen 101. Id. at 4:3-5. These disclosures in the '675 patent demonstrate that the handheld device (i.e., surgical instrument) includes a built-in motion sensor that allows the physician to place the surgical instrument in the proper 3D position by allowing viewing of the previously acquired 3D environment (i.e., CT scan used to create an a priori 3D representation) from multiple perspectives. See Ex. 1001, 8:49-63. The system uses signals from a motion sensor in the handheld device (i.e., in this instance a surgical instrument) to determine how to present the view of the virtual 3D environment on the computer screen. See Ex. 1001, 4:3-5. Accordingly, the reading of the position and orientation signals of the motion sensor located in the handheld surgical instrument allows viewing the respective view of the 3D environment in order to guide the surgical instrument at the right position.
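To make the registration step concrete (a sketch under our own assumptions; the '675 patent describes the registration but no implementation): aligning the small recorded patch to the a-priori model yields a rigid transform whose translation and rotation components are the device's position and orientation.

```python
import numpy as np

def pose_from_registration(T: np.ndarray):
    """Extract device position and orientation from a 4x4 rigid transform.

    T maps points from the scanner's local frame into the frame of the
    a-priori model (e.g., a pre-operative CT scan); it would be produced
    by registering the recorded patch to that model, for example with an
    ICP-style algorithm.
    """
    position = T[:3, 3]  # translation column: where the device is
    R = T[:3, :3]        # rotation block: how the device is oriented
    # Recover yaw/pitch/roll (ZYX convention) from the rotation matrix.
    yaw = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    pitch = np.degrees(np.arcsin(-R[2, 0]))
    roll = np.degrees(np.arctan2(R[2, 1], R[2, 2]))
    return position, (yaw, pitch, roll)
```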
The citation by Patent Owner (Ex. 1001, 9:1-3) refers to the additional embodiment where a handheld device is performing a scanning procedure in real time, creating a smaller 3D field-of-view than the 3D image produced a-priori, and the two sets of images can be co-registered so that the position and orientation of the device can be detected. See id. at 8:64-9:3. In other words, the co-registration of the images allows for determination of the position and orientation of the surgical instrument in real time by reading the signals from the motion sensor. Although the excerpt cited by Patent Owner does not reference the motion sensor, the earlier paragraph, when read in context, reveals the use of a motion sensor so that the position and orientation of the surgical instrument can be determined. See Ex. 1001, 8:49-9:3. Thus, in the embodiment cited by Patent Owner, the signals of position and orientation read from the motion sensor of the handheld device (whether the handheld device is a surgical instrument and/or a scanner) are still what determine the view to present on the display.

Nowhere in the '675 patent specification do we see a special definition or disavowal of a particular type of motion sensor to exclude those sensors that can only detect position and orientation information to detect motion. On the contrary, the specification describes examples of motion sensors that detect position and orientation as described above. Furthermore, we see no disavowal of a particular type of motion sensor to exclude those sensors that only detect position and orientation information. In particular, we see no distinction between using a motion sensor that is delineated as a "position" versus a "motion" sensor.

Thus, the breadth of a claimed term is not limited because (1) the specification does not reveal a special definition given to "motion sensor" by the patentee acting as a lexicographer that differs from the meaning it would otherwise possess (see CCS Fitness, Inc. v. Brunswick Corp., 288 F.3d at 1366); and (2) the specification does not reveal an intentional disclaimer, or disavowal, of claim scope by the inventor (see SciMed Life Sys., 242 F.3d at 1343-44).

Accordingly, we determine that, under a broadest reasonable interpretation in light of the specification of the '675 patent, the term "motion sensor" does not exclude motion sensors that track position and orientation.
When the specification is clear about the scope and content of a claim term, there is no need to turn to extrinsic evidence for claim interpretation. 3M Innovative Props. Co., 725 F.3d at 1326-28. However, for completeness we also address the extrinsic evidence of record.

Patent Owner's expert testified that "tracking 3D position over time provides 3D position over time ... simply sensing position over time does not give me motion." Ex. 1037, 79:19-80:12. Furthermore, Patent Owner provides the extrinsic evidence of the "Buxton table" to indicate that motion and position are fundamentally different properties and that one ordinarily skilled in the art would have understood that motion sensed at particular times is different than position sensed at particular times. PO Resp. 21-22 (citing Ex. 2009, 149). The Buxton Table is reproduced below.
[The Buxton Table (Ex. 2009, 149): a taxonomy of input devices organized by property sensed (position, motion, pressure) and number of dimensions (1, 2, 3), with example devices such as sliding pots, tablets, light pens, joysticks, touch tablets, touch screens, trackballs, thumbwheels, and treadmills]

The Buxton Table reproduced above shows "Motion" and "Position" as different properties sensed. Ex. 2009, 149.
Petitioner asserts that Patent Owner tries to exclude sensors that derive motion from position and orientation from the scope of the '675 patent's claimed "motion sensor," by turning to studies on the taxonomy of input devices that show certain devices categorized based on whether they sense motion (e.g., trackball, treadmill) or position (e.g., joystick, light pen, etc.). Pet. Reply 23 (citing PO Resp. 21 (citing Ex. 2013, 130)).

Petitioner cites to a different section of the reference provided by Patent Owner showing a table as reproduced below to emphasize that trackers that sense "position" do so by tracking position and orientation. Pet. Reply 23-24 (citing Ex. 2013, 130).
[Figure 4.30 of Ex. 2013: a table of input devices organized by property sensed and number of dimensions, in which "trackers (position & orientation)" appear in the position-sensing row]

Figure 4.30 of Exhibit 2013 shows the "position" property being sensed by trackers (depicted under numeral 6 of the table) sensing position