Trials@uspto.gov
571-272-7822

Paper 24
Entered: September 12, 2019

UNITED STATES PATENT AND TRADEMARK OFFICE
____________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
____________

HTC CORPORATION, HTC AMERICA, INC., and
VALVE CORPORATION,
Petitioner,

v.

ELECTRONIC SCRIPTING PRODUCTS, INC.,
Patent Owner.
____________

Case IPR2018-01032¹
Patent 8,553,935 B2
____________

Before ANDREI IANCU, Director of the United States Patent and
Trademark Office, WILLIAM M. FINK, Vice Chief Administrative Patent
Judge, and ROBERT J. WEINSCHENK, Administrative Patent Judge.

FINK, Vice Chief Administrative Patent Judge.

FINAL WRITTEN DECISION
35 U.S.C. § 318(a) and 37 C.F.R. § 42.73

¹ Valve Corporation filed a Petition in IPR2019-00074, and has been joined
as a party to this proceeding.
I. INTRODUCTION

HTC Corporation and HTC America, Inc. filed a Petition pursuant to
35 U.S.C. §§ 311–319 requesting an inter partes review of claims 1–21 of
U.S. Patent No. 8,553,935 B2 (“the ’935 patent”). Paper 2 (“Pet.”).
Electronic Scripting Products, Inc. (“Patent Owner”) filed a Preliminary
Response. Paper 5 (“Prelim. Resp.”).

On September 13, 2018, we instituted trial. Paper 6 (“Inst. Dec.”).
After institution, Patent Owner filed a Response. Paper 8 (“PO Resp.”).

On January 18, 2019, we granted Valve Corporation’s Motion for
Joinder in Valve Corp. v. Electronic Scripting Products, Case
IPR2019-00074 (PTAB filed Oct. 11, 2018). Paper 12, 10. In doing so, we
joined Valve Corporation to this proceeding, resulting in HTC Corporation,
HTC America, Inc., and Valve Corporation (collectively “Petitioner”)
challenging the claims of the ’935 patent. Id.

Petitioner filed a Reply to Patent Owner’s Response. Paper 14
(“Reply”). Patent Owner filed a Sur-Reply. Paper 16 (“Sur-Reply”).

An oral hearing was held on June 4, 2019. A transcript of the hearing
has been entered into the record. Paper 23 (“Tr.”).

This Final Written Decision (“Decision”) is issued pursuant to
35 U.S.C. § 318(a). For the reasons that follow, we conclude Petitioner has
demonstrated, by a preponderance of the evidence, that claims 1–21 of the
’935 patent are unpatentable.

A. Related Proceedings

The parties inform us that the ’935 patent and a related patent, U.S.
Patent No. 9,235,934 B2 (Ex. 1002), are the subject of a patent infringement
lawsuit in the U.S. District Court for the Northern District of California:
Electronic Scripting Products, Inc. v. HTC America, Inc., No.
3:17-cv-05806-RS (N.D. Cal. filed Oct. 9, 2017). Pet. 1–2; Paper 4, 2; see
also Valve, Case IPR2019-00074 (Paper 1, 1–2; Paper 6, 2). With respect to
the ’934 patent, we denied petitions for institution of inter partes review.

B. The ’935 Patent (Ex. 1001)

The ’935 patent relates to determining an absolute pose of a
manipulated object in a real three-dimensional environment, particularly of a
manipulated object used by humans to interface with the digital world.
Ex. 1001, 1:24–28. An object’s pose combines the three linear displacement
coordinates (x, y, z) of any reference point on the object and the three
orientation angles, also called the Euler angles (ϕ, θ, ψ), that describe the
object’s pitch, yaw, and roll. Id. at 1:46–50.
According to the ’935 patent, one-to-one motion mapping between
space and cyberspace is not possible without the ability to digitize the
absolute pose of a manipulated object with respect to a well-defined
reference location in real space. Ex. 1001, 2:49–52. The disclosed
invention optically infers the absolute pose of the manipulated object by
relating Euler-rotated object coordinates describing the orientation of the
object to world coordinates (Xo, Yo, Zo). Id. at 11:43–47. More specifically,
knowledge of the absolute positions of invariant features or beacons in world
coordinates (Xo, Yo, Zo) allows an optical measuring arrangement to describe
the absolute pose of the object with absolute pose data expressed in
parameters (x, y, z, ϕ, θ, ψ) in Euler-rotated object coordinates within world
coordinates (Xo, Yo, Zo). Id. at 11:65–12:4.
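For illustration only, the pose parameterization described above can be sketched in Python. This is the editor's sketch, not code from the record; in particular, the Z-X-Z Euler rotation sequence below is an assumption, as the ’935 patent's exact convention is not reproduced here.

```python
import math

def rot_z(a):
    """3x3 rotation matrix about the z-axis by angle a (radians)."""
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def rot_x(a):
    """3x3 rotation matrix about the x-axis by angle a (radians)."""
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def matmul(A, B):
    """Multiply two 3x3 matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def pose_to_world(x, y, z, phi, theta, psi, p_obj):
    """Map a point p_obj given in object coordinates into world
    coordinates (Xo, Yo, Zo): rotate by the Euler angles (assumed
    Z-X-Z sequence), then translate by the reference point (x, y, z)."""
    R = matmul(matmul(rot_z(phi), rot_x(theta)), rot_z(psi))
    px = sum(R[0][j] * p_obj[j] for j in range(3)) + x
    py = sum(R[1][j] * p_obj[j] for j in range(3)) + y
    pz = sum(R[2][j] * p_obj[j] for j in range(3)) + z
    return (px, py, pz)
```

With zero Euler angles the mapping reduces to a pure translation by (x, y, z), matching the intuition that the six parameters (x, y, z, ϕ, θ, ψ) together fix the object's position and orientation in world coordinates.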
Figure 21 of the ’935 patent is reproduced below:

[Figure 21: a “cyber game” scene]

Figure 21 illustrates a “cyber game” in which user or player 882 interacts
with game application 880 by moving manipulated object 884, in this case a
tennis racket, in real three-dimensional environment 886. Ex. 1001,
37:9–13. Visual tennis match elements 898A–D and image 884′ of tennis
racket 884 held by user 882 are displayed on screen 890. Id. at 37:29–44.
The display of image 884′ changes in response to the detected absolute pose
of racket 884. Id. at 38:12–20.

The absolute pose of racket 884 is determined using on-board optical
measuring arrangement 888 and auxiliary motion detection component 904.
Ex. 1001, 37:14–16, 37:65–66, 38:12–14. Optical measurement
arrangement 888 infers absolute pose data (x, y, z, ϕ, θ, ψ) of racket 884 by
sensing light 893 emitted from beacons B1–B9 disposed on and around
screen 890. Id. at 37:14–21, 37:61–64. Auxiliary motion detection
component 904 is an inertial sensing device that includes three-axis
gyroscope 908 for providing information about changes in orientation (ϕ, θ,
ψ) and three-axis accelerometer 906 for providing information about linear
displacement (x, y, z). Id. at 37:65–38:11.

The combination of absolute pose data (x, y, z, ϕ, θ, ψ) and relative
motion data is used by tennis game 880 as input for interacting with
output 896. Id. at 38:12–14. Such absolute pose data and relative motion
data can be combined using any suitable combination or data fusion
techniques well-known in the art. Id. at 35:24–34; 44:51–55.
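One generic, well-known fusion technique of the kind referenced above is a complementary filter. The sketch below is the editor's illustration (not the patent's disclosed method): a high-rate relative update from a gyroscope is blended with a lower-rate absolute optical measurement, here for a single yaw angle. The blend weight `alpha` and the one-angle state are illustrative choices.

```python
def fuse_yaw(prev_yaw, gyro_rate, dt, optical_yaw, alpha=0.98):
    """Blend relative motion data (gyro angular rate in rad/s,
    integrated over dt seconds) with an absolute optical yaw
    measurement (rad). alpha near 1 trusts the smooth gyro
    prediction short-term while the optical term corrects drift."""
    predicted = prev_yaw + gyro_rate * dt            # relative update
    return alpha * predicted + (1 - alpha) * optical_yaw
```

Run repeatedly, the filter tracks the gyro between optical fixes while the small `(1 - alpha)` term continually pulls the estimate back toward the absolute measurement, so gyro drift cannot accumulate without bound.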
C. Illustrative Claim

Of the challenged claims, claims 1 and 12 are independent. Claim 1 is
illustrative of the claims at issue and is reproduced below:

1. A method for use with a system having a manipulated
object, the method comprising:
a) accepting light data indicative of light detected by a
photodetector mounted on-board said manipulated
object from a first plurality of predetermined light
sources having known locations in world coordinates;
b) accepting relative motion data from a relative motion
sensor mounted on-board said manipulated object
indicative of a change in an orientation of said
manipulated object; and
c) determining the pose of said manipulated object based
on said light data and said relative motion data, wherein
said pose is determined with respect to said world
coordinates.

Ex. 1001, 51:13–26.

D. Evidence of Record

Petitioner relies on the following references and declaration (see
Pet. 3):

Exhibit No.   Reference or Declaration
Ex. 1004      Greg Welch, et al., High-Performance Wide-Area Optical
              Tracking, PRESENCE: TELEOPERATORS AND VIRTUAL
              ENVIRONMENTS, Feb. 2001, at 1 (“Welch-HiBall”)
Ex. 1005      Greg Welch, et al., Tracking: Beyond 15 Minutes of
              Thought, SIGGRAPH 2001 Conference (Aug. 12, 2001)
              (“SIGGRAPH 2001”)
Ex. 1006      Romanik, US 5,884,239, issued March 16, 1999

Petitioner relies on the declaration of Dr. Gregory Welch (Ex. 1003,
“Welch Decl.”). Patent Owner relies on the declaration of Dr. Hector H.
Gonzalez-Banos (Ex. 2001, “Gonzalez-Banos Decl.”).

E. Asserted Grounds of Unpatentability

Petitioner asserts that the challenged claims are unpatentable on the
following grounds (see Pet. 23, 53):

Claims Challenged    Basis      References
1–6, 11–18, and 21   § 103(a)   Welch-HiBall and SIGGRAPH 2001
7–11 and 19–21       § 103(a)   Welch-HiBall, SIGGRAPH 2001,
                                and Romanik

II. ANALYSIS

A. Level of Ordinary Skill in the Art

Petitioner argues a person of ordinary skill in the art at the time of the
invention would have had “at least a Master of Science or equivalent degree
in electrical engineering or computer science, or alternatively a Bachelor of
Science or equivalent degree in electrical engineering or computer science,
with approximately two years of experience in a field relating to image
processing, robotics, virtual worlds, or sensor integration.” Pet. 13–14
(citing Welch Decl. ¶¶ 26–30). Petitioner further argues “[t]his definition is
approximate and more experience may substitute for education or vice
versa.” Id. at 14 (citing Welch Decl. ¶ 29).

Patent Owner does not provide its own explanation of the level of
ordinary skill in the art. Patent Owner also does not oppose Petitioner’s
proffered level of ordinary skill in the art. Upon consideration of the record,
including the level of skill reflected by the cited references, Welch-HiBall
and SIGGRAPH 2001, we agree with and therefore adopt Petitioner’s
proposed definition for the level of ordinary skill in the art.

B. Claim Construction

In an inter partes review filed prior to November 13, 2018, claim
terms in an unexpired patent are interpreted according to their broadest
reasonable construction in light of the specification of the patent in which
they appear. 37 C.F.R. § 42.100(b) (2017).² Under the broadest reasonable
construction standard, claim terms are presumed to have their ordinary and
customary meaning, as would be understood by one of ordinary skill in the
art in the context of the entire disclosure. In re Translogic Tech., Inc., 504

² The Petition in this proceeding was filed on May 10, 2018, prior to the
effective date of the rule change that replaces the broadest reasonable
interpretation standard with the federal court claim interpretation standard.
See Changes to the Claim Construction Standard for Interpreting Claims in
Trial Proceedings Before the Patent Trial and Appeal Board, 83 Fed. Reg.
51340 (Oct. 11, 2018) (codified at 37 C.F.R. § 42.100(b) (2019)) (amending
37 C.F.R. § 42.100(b) effective Nov. 13, 2018).
F.3d 1249, 1257 (Fed. Cir. 2007). However, only terms that are in
controversy need to be construed, and only to the extent necessary to resolve
the controversy. See Nidec Motor Corp. v. Zhongshan Broad Ocean Motor
Co., 868 F.3d 1013, 1017 (Fed. Cir. 2017) (citing Vivid Techs., Inc. v. Am.
Sci. & Eng’g, Inc., 200 F.3d 795, 803 (Fed. Cir. 1999)).

Petitioner proposes construction of the following claim terms:
“photodetector,” “relative motion sensor,” “world coordinates,” and “a
processor for determining the pose of said manipulated object based on said
light data and said relative motion data.” Pet. 14–17. Patent Owner does not
propose any separate claim construction or refute Petitioner’s constructions.

For purposes of institution, we determined it was unnecessary to
construe any of these claim terms expressly to address the parties’
arguments. Inst. Dec. 6. Similarly, in view of the complete record, we find
that it is unnecessary to provide a separate explicit claim construction to
resolve the issues before us. Instead, to the extent the parties’ arguments are
based on the scope of the claims, we resolve the disputed claim scope in the
context of the parties’ arguments as set forth below.

C. Prior Art Status of the References

Despite some disagreement regarding the priority date of the
challenged claims (see Pet. 12–13; PO Resp. 24–25), neither party disputes
that the references relied on by Petitioner in its patentability challenges
qualify as prior art under 35 U.S.C. § 102(b) against even the earliest
possible priority date of the claims of the ’935 patent.

Petitioner contends Welch-HiBall was published in the peer-reviewed
journal “PRESENCE: TELEOPERATORS AND VIRTUAL ENVIRONMENTS” in
February 2001, and, therefore, is prior art under 35 U.S.C. § 102(b).
Pet. 18–19. Petitioner provides evidence in support of Welch-HiBall’s
public availability. See id. (citing Exs. 1019–1024; Welch Decl. ¶ 120).
Patent Owner does not dispute these contentions. We agree that
Welch-HiBall is prior art under 35 U.S.C. § 102(b).

Petitioner contends SIGGRAPH 2001 is a set of printed, bound course
materials distributed to attendees of a course taught by Dr. Welch and
colleagues on August 12, 2001, at a widely attended conference (called
“SIGGRAPH”), and, therefore, is prior art under 35 U.S.C. § 102(b).
Pet. 19–21. Petitioner provides evidence in support of SIGGRAPH 2001’s
availability, including Dr. Welch’s testimony regarding the distribution of
the course materials at the conference. See id. (citing Exs. 1025–1029,
1036; Welch Decl. ¶¶ 104–108). Patent Owner does not dispute these
contentions. In view of the evidence demonstrating the public availability of
SIGGRAPH 2001, Petitioner has demonstrated persuasively that
SIGGRAPH 2001 is prior art under 35 U.S.C. § 102(b).

D. Obviousness of Claims 1–6, 11–18, and 21 over Welch-HiBall and
SIGGRAPH 2001

Petitioner contends the subject matter of claims 1–6, 11–18, and 21
would have been obvious over Welch-HiBall and SIGGRAPH 2001.
Pet. 23–52; Reply 1–17, 21–25. Petitioner relies on the declaration of Dr.
Welch for support. See Welch Decl. ¶¶ 120–196. Patent Owner disputes
Petitioner’s contentions, and relies on the declaration of Dr. Gonzalez-Banos
for support. PO Resp. 4–24, 27–30; Sur-Reply 10–27; Gonzalez-Banos
Decl. ¶¶ 4–9. We first summarize Welch-HiBall and SIGGRAPH 2001, and
then address the parties’ contentions.
1. Welch-HiBall (Ex. 1004)

Welch-HiBall is an article titled “High-Performance Wide-Area
Optical Tracking” that describes a system for head or hand tracking in
virtual and augmented environments, referred to as a “HiBall Tracking
System.” Ex. 1004 ¶¶ 1, 9. Figure 6 of Welch-HiBall is reproduced below:

[Figure 6: the HiBall Tracking System]

Figure 6 depicts the HiBall Tracking System, which consists mainly of
optical sensing units called HiBalls, infrared (“IR”) light-emitting diodes
(“LEDs”) fixed on the ceiling, and a Ceiling-HiBall Interface Board (“CIB”)
connected to a host computer. Id. ¶ 14. The CIB provides communication
and synchronization between the host computer, HiBalls, and LEDs. Id.
¶ 23. In order to track the pose of the user, the HiBalls are attached to, for
example, a head-worn display or user-held wand or drill. Id. ¶¶ 9, 50, 53,
64, Figs. 4, 13. The HiBalls observe sequentially-flashed LEDs on the
ceiling using lateral-effect photodiode (“LEPD”) silicon photodetectors,
which detect the centroid of the luminous flux incident on the detector. Id.
¶¶ 14–15, 17–18. The LEPD measurements are communicated to the CIB to
generate pose estimates of the HiBalls using a Kalman-filter-based
prediction-correction approach known as single-constraint-at-a-time
(“SCAAT”) tracking. Id. ¶¶ 8–9, 15, 19, 31.

Although the disclosed HiBall Tracking System is an optical system,
Welch-HiBall discloses that “[r]ecently, inertial hybrid systems . . . have
been gaining popularity . . . with the added benefit of reduced
high-frequency noise and direct measurements of derivatives.” Id. ¶¶ 2, 4.
The authors of Welch-HiBall indicate that they are pursuing an
inertial/optical hybrid approach. Id. ¶ 67.

2. SIGGRAPH 2001 (Ex. 1005)

SIGGRAPH 2001 is a set of course materials distributed to attendees
of a conference course titled “Tracking: Beyond 15 Minutes of Thought.”
Ex. 1005, 1; see Pet. 19. SIGGRAPH 2001 describes methods for tracking a
computer user’s real-world position and orientation, or pose, in virtual
environment systems for the purpose of continually providing the user with
two-dimensional computer generated images that match the user’s
three-dimensional, real-world position and orientation. Ex. 1005, 9. The
disclosed methods are applicable to head or hand tracking for interactive
computer graphics. Id. at 71.

Tracking systems for interactive computer graphics have included
mechanical, magnetic, acoustic, inertial, and optical technologies. Id. at 9.
According to SIGGRAPH 2001, “every type of sensor has fundamental
limitations related to the associated physical medium,” and “no single
medium or sensor type provides the necessary performance over the wide
spectrum of temporal and spatial characteristics desired for many
applications.” Id. at 56. Accordingly, SIGGRAPH 2001 discusses
combining tracking technologies exhibiting “complementary behavior” into
“hybrid systems” such that “the complementary behavior of each system is
leveraged to obtain more accurate and stable tracking information than either
system alone.” Id. at 56–57.

SIGGRAPH 2001 discloses that “[o]ne of the most popular
technologies (mediums) used to improve or augment the performance of
other mediums is to incorporate accelerometers and gyros in some form of
an inertial navigation system.” Id. at 56 (emphases omitted). One type of
inertial hybrid system disclosed by SIGGRAPH 2001 is the inertial-optical
system. Id. at 57.

Figure 4.1 of SIGGRAPH 2001 is reproduced below:

[Figure 4.1: an inertial-optical hybrid system]

Figure 4.1 illustrates an inertial-optical hybrid system that uses an
indirect-feedback Kalman filter to combine information from inertial and
optical sensors. Id. at 67. According to SIGGRAPH 2001, the Kalman filter
is a well-known and often-used mathematical pose estimation tool. Id. at 66.
“The basic idea is to use the Kalman filter to weight [sic] the different
mediums most heavily in the circumstances where they each perform best,
thus providing more accurate and stable estimates than a system based on
any one medium alone.” Id. at 67.
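The weighting idea in the quoted passage can be seen in the textbook scalar Kalman measurement update, sketched below as the editor's illustration (not code from the record): the gain shifts weight toward whichever source, prediction or measurement, currently has the smaller variance.

```python
def kalman_update(x_pred, var_pred, z, var_z):
    """Scalar Kalman measurement update. The gain k weights the
    measurement z against the prediction x_pred in proportion to
    their variances, so the more reliable source dominates the
    blended estimate, and the posterior variance shrinks."""
    k = var_pred / (var_pred + var_z)   # Kalman gain
    x_new = x_pred + k * (z - x_pred)   # blended estimate
    var_new = (1 - k) * var_pred        # reduced uncertainty
    return x_new, var_new
```

With equal variances the update splits the difference between prediction and measurement; as the measurement variance grows, the measurement is progressively ignored, which is how a hybrid tracker leans on each medium "in the circumstances where they each perform best."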
3. Independent Claim 1

a. Undisputed limitations

Independent claim 1’s preamble recites “[a] method for use with a
system having a manipulated object.” Ex. 1001, 51:13–14. Petitioner
contends Welch-HiBall’s “HiBall Tracking System” (or “HiBall System”) is
a system having a manipulated object such as “a handheld probe mounted on
a drill to determine the pose of the drill.” Pet. 35 (citing Ex. 1004 ¶¶ 50, 53,
Figs. 13–14; Welch Decl. ¶¶ 144–146). For example, Petitioner refers to
Figure 14 as depicting a “HiBall” attached to a drill. Id. Petitioner also
contends SIGGRAPH 2001 suggests a manipulated object because it teaches
tracking principles applicable to hand tracking. Id. (citing Welch Decl.
¶¶ 147–148).

Independent claim 1 also recites “a) accepting light data indicative of
light detected by a photodetector mounted on-board said manipulated object
from a first plurality of predetermined light sources having known locations
in world coordinates” (“limitation 1(a)”). Ex. 1001, 51:15–18. According to
Petitioner, the HiBall on-board a manipulated object (e.g., a drill) includes
LEPDs, which detect light from a plurality of IR LEDs mounted on strips in
the ceiling to generate and digitize data and transmit it via the CIB to a host
computer. Pet. 35–36 (citing Ex. 1004 ¶¶ 10–12, 14–15; Welch Decl.
¶ 149). Welch-HiBall refers to this arrangement in which the sensors are on
the moving object to be tracked and the landmarks are fixed as an
“inside-looking-out” system as opposed to an “outside-looking-in” system in
which the sensors are fixed and the landmarks are on the moving object.
Ex. 1004 ¶¶ 10–12, Fig. 5. Petitioner contends the LEPDs in the HiBall are
configured in different orientations to detect the x and y coordinates of the
centroid of the light beams hitting the sensor from the IR LEDs mounted in
the ceiling. Pet. 36 (citing Ex. 1004 ¶¶ 16–19, Figs. 7–9). Petitioner
contends the IR LEDs are light sources mounted in ceiling tiles at a
“‘fixed-location’ and therefore have known locations in world coordinates.”
Id. at 37 (citing Ex. 1004 ¶¶ 14, 20–22, Fig. 10). Petitioner also relies on
SIGGRAPH 2001 as teaching “optical tracking systems relying on sensors
mounted on-board a moving object and receiving light from external light
sources, like those taught by Welch-HiBall.” Id. at 38 (citing Ex. 1005,
49–56).

Patent Owner does not dispute Petitioner’s analysis of independent
claim 1’s preamble or limitation 1(a). We have considered Petitioner’s
contentions and supporting evidence, and we find Petitioner meets its burden
for these claim elements.

b. Limitation 1(b)

Independent claim 1 recites “b) accepting relative motion data from a
relative motion sensor mounted on-board said manipulated object indicative
of a change in an orientation of said manipulated object” (“limitation 1(b)”).
Ex. 1001, 51:19–22. Petitioner relies on Welch-HiBall’s description of
inertial tracking systems, which Petitioner contends are the “relative motion
sensors” in the ’935 patent, as showing that such sensors are known in the
art. Pet. 38 (citing Ex. 1004 ¶¶ 4, 35; Ex. 1001, 42:1–11). Petitioner also
relies on Welch-HiBall’s discussion of the authors pursuing a “hybrid
approach” referring to Dr. Welch’s 1995 paper entitled “Hybrid
Self-Tracker: An Inertial/Optical Hybrid Three-Dimensional Tracking
System” (Ex. 1012) as teaching combining the HiBall System’s optical
tracking with data from inertial sensors to provide more accurate pose
estimation. Pet. 38 (citing Ex. 1004 ¶ 67).

Petitioner contends SIGGRAPH 2001 teaches inertial trackers
including “accelerometers” and “gyroscopes” for pose determination (id. at
39 (citing Ex. 1005, 35–43)), and that such sensors “must be mounted
‘on-board’ the object being tracked” (id. (citing Ex. 1005, 25; Welch Decl.
¶ 185)). Petitioner further relies on SIGGRAPH 2001’s discussion of
combining inertial tracking system data with optical data in a hybrid tracking
system. Id. (citing Ex. 1005, 56–57). Patent Owner disputes these
contentions, arguing that a gyroscope cannot be equated to limitation 1(b).
PO Resp. 4–10.

According to Patent Owner, the relative motion sensor recited in
limitation 1(b) does not need to be a gyroscope to produce relative motion
data indicative of a change in an orientation of the manipulated object, as
accelerometers can produce the same. Id. at 8–9; see also id. at 7 (“[A]ny
sensor choice can be made to measure relative changes in orientation.”).
Patent Owner also contends “the definition of how a gyroscope operates is
not the ‘relative motion data’ of claim limitation 1(b).” Id. at 8. According
to Patent Owner,

    [i]t is thus NOT the selection of any particular inertial sensor, but
    a judicious selection of the subset of inertial data or, more
    broadly, a judicious selection of the subset of possible
    measurements of the motion of the manipulated object - namely
    changes in orientation - that are the gist of claim 1 limitation b).

Id. at 9. The issue raised by Patent Owner is that, although the disclosed
gyroscope could be a relative motion sensor (one of several possible
choices), that alone is not sufficient for limitation 1(b). Rather, the
gyroscope must be configured to measure “relative motion data” that is
“indicative of a change in orientation” to satisfy limitation 1(b), which,
according to Patent Owner, has not been shown. We disagree.

As an initial matter, the evidence of record—including the ’935 patent
and both declarants’ testimonies—conclusively establishes that a person of
ordinary skill in the art would have understood a gyroscope could be used as
a relative motion sensor that is indicative of a change in orientation. See,
e.g., Ex. 1001, 42:4–8 (stating that relative motion sensor 1014 can include
any suitable inertial sensing device, such as an accelerometer and
gyroscope); Welch Decl. ¶ 90 (“The relative motion sensor may be an
inertial sensing device such as a gyroscope or accelerometer.”); PO Resp. 8
(“[A] gyroscope is a type of sensor that inherently measures orientation, as
taught by SIGGRAPH 2001 along with thousands of other references.”);
Ex. 1048, 25:13–24 (admitting the inventors did not invent the “concept of a
relative motion sensor for measuring changes in orientation” and that one
way to measure such changes is using a gyroscope).

Furthermore, the evidence of record also establishes that a person of
ordinary skill in the art would have used a gyroscope as a relative motion
sensor that is indicative of a change in orientation. According to Dr. Welch,
Petitioner’s declarant, the idea of using a relative motion sensor (“e.g.
inertial or gyroscopic trackers”) and combining its output with optical data
was well known in the art prior to the ’935 patent. Welch Decl. ¶ 121. Dr.
Welch further testifies that “SIGGRAPH 2001 teaches the use of relative
motion sensors, such as inertial trackers including accelerometers and
gyroscopes.” Id. ¶ 157 (emphasis added).

We find this testimony to be supported by the evidence and therefore
entitled to substantial weight. For example, the portions of SIGGRAPH
2001 cited by Dr. Welch teach using a gyroscope to determine orientation
changes expressed as roll, pitch, and yaw. Ex. 1005, 35 (“Inertial trackers
use . . . gyros to measure the orientation . . . 3D orientation in roll, pitch, and
yaw.”), 37–38 (“If the [gyroscope’s] gimbals are constrained with springs,
the rate of change of direction can be measured.”); Figure 3.7 (depicting
“Rate gyroscopes for measuring rate of turn (left) and rate of roll (right)”);
Reply 3–6. This evidence is inconsistent with Patent Owner’s contention
that “nowhere in [the references] is there any terminology that even closely
resembles the terms ‘relative orientation data’, ‘relative orientation motion’,
‘relative orientation’, ‘change in orientation’, etc.” Sur-Reply 21. Patent
Owner’s declarant and the co-inventor of the ’935 patent, Dr.
Gonzalez-Banos, acknowledged that SIGGRAPH 2001 teaches a gyroscope
for measuring change in orientation. Ex. 1048, 124:5–125:7 (“Q. And the
data that is sensed from that angular rate by the micro-mechanical gyro
[described on page 39 of SIGGRAPH 2001] is data that is indicative of a
change in orientation of the micro-mechanical gyro? A. I agree.”). Based
on the evidence before us, we find Petitioner has established persuasively
that SIGGRAPH 2001 teaches use of a gyroscope for measuring relative
motion data indicative of a change in orientation of the manipulated object.
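The mechanics at issue, a rate gyroscope's output being indicative of a change in orientation, can be sketched generically (the editor's illustration, not evidence from the record): integrating angular-rate samples over time yields the net change in orientation about the sensed axis.

```python
def orientation_change(rates, dt):
    """Integrate single-axis rate-gyro samples (rad/s), taken every
    dt seconds, into a net change in orientation (rad). A 3-axis unit
    would do the same for roll, pitch, and yaw rates; this one-axis
    rectangular integration is illustrative only."""
    delta = 0.0
    for r in rates:
        delta += r * dt   # accumulate rate-of-turn over each interval
    return delta
```

The output is inherently relative: it reports how far the orientation has changed since integration began, not an absolute orientation in world coordinates.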
At the oral hearing, Patent Owner raised a variation on the above
argument—contending that SIGGRAPH 2001’s gyroscope may not be
providing relative motion data, because it is unclear whether the gyroscope
is disregarding the gravity vector. E.g., Tr. 21:22–22:20, 27:12–28:26,
47:15–21, 50:21–51:6, 53:6–19. This was the first time Patent Owner
specifically mentioned that the effect of gravity or the “gravity vector” must
be removed for a gyroscope to teach relative motion data.³ Patent Owner’s
declarant never mentions the gravity vector as an issue that should have been
addressed by Petitioner.

Before substantively addressing this argument, we point out that oral
argument is not itself evidence and, even if it were, it is not the proper forum
for introducing new evidence or arguments in the proceeding. See Office
Trial Practice Guide, 77 Fed. Reg. 48756, 48768 (Aug. 14, 2012) (“A party
may rely upon evidence that has been previously submitted in the
proceeding and may only present arguments relied upon in the papers
previously submitted. No new evidence or arguments may be presented at
the oral argument.”).

Nonetheless, Petitioner’s evidence addresses the gravity vector issue.
For example, in describing the operation of gyroscopes used in inertial
tracking (also called relative motion sensing in the ’935 patent (Ex. 1001,
42:1–6)), Dr. Welch states:

    Gyroscopes employ the principle of conservation of
    angular momentum. If torque is exerted on a spinning mass, its
    axis of rotation will precess at right angles to both itself and the
    axis of exerted torque. If the mass spins very fast, it will have a
    large angular momentum that will strongly resist changes in
    direction. For 3D orientation (roll, pitch and yaw), three rate
    gyroscopes are typically fitted to a platform with their axes
    mutually perpendicular. Two of the gyroscopes provide for
    horizontal stabilization of the platform--an essential requirement
    to eliminate the influence of accelerations due to gravity--while
    the third is responsible for the north-south alignment. Pitch, roll,
    and yaw are detected by the three gyroscope input axes. The

³ The only mention of gravity or gravity vector in Patent Owner’s papers
occurs in the Sur-Reply and, even then, only in discussing the human inner
ear as an analog to a gyroscope. See Sur-Reply 6–7.
    gimbal deflection of each of the gyroscopes is converted into a
    signal.

Welch Decl. ¶ 54 (emphasis added). This testimony is nearly verbatim from
SIGGRAPH 2001, which Dr. Welch authored. See Ex. 1005, 38. Finally, at
the oral hearing, when asked about SIGGRAPH 2001’s teaching of
removing the influence of gravity, Patent Owner admitted that “[t]his has
been known for a very long time.” Tr. 50:21–23.

In another line of argument, Patent Owner contends Welch-HiBall and
SIGGRAPH 2001 suggest relying primarily on the inertial measurement unit
of the hybrid system, and using all relative motion data, not just the subset
thereof indicative of a change in orientation. PO Resp. 7 (“The two
references . . . teach to use ALL possible relative data coming from the
inertials.”), 14–22.

Claim limitation 1(b), however, is not limited solely to relative motion
data indicative of a change in orientation. There is no language in
limitation 1(b) to expressly limit it to only relative motion data indicative of
a change in orientation. Indeed, we agree with Petitioner that interpreting
the claim as so limited would be contrary to the specification of the
’935 patent, which explains that the invention uses other types of relative
motion data to determine pose. Reply 11–14 (citing Ex. 1001, 26:12–24,
38:2–11 (“[G]yroscope 908 provides information about changes in the
orientation. . . . [A]ccelerometer 906 provides information about linear
displacements.”), 46:2–5, 47:42–49). Moreover, at the oral hearing, Patent
Owner conceded that claim 1 does not exclude other types of relative data.
Tr. 25:7–11 (“DIRECTOR IANCU: Okay. But if I have a device that
accepts relative motion data indicative of the change in orientation and also
in addition to that, it accepts relative data indicative of a change in position,
X, Y, Z, does that device meet the claim? MR. ALBOSZTA: Yes.”).

After considering the parties’ contentions and evidence, Petitioner
shows persuasively that SIGGRAPH 2001 teaches a gyroscope configured
as a relative motion sensor which provides relative motion data indicative of
a change in orientation of the manipulated object, as recited in
limitation 1(b). Petitioner has satisfied its burden to show that the proposed
combination of Welch-HiBall and SIGGRAPH 2001 teaches limitation 1(b).

c. Limitation 1(c)

Independent claim 1 recites “c) determining the pose of said
manipulated object based on said light data and said relative motion data,
wherein said pose is determined with respect to said world coordinates”
(“limitation 1(c)”). Petitioner contends Welch-HiBall teaches determining
the pose for a manipulated object, i.e., a drill, using data from light sources.
Pet. 39–40 (citing Ex. 1004 ¶¶ 8–9, 10–12, 15, 31–41). For example,
Petitioner contends paragraphs 31–41 teach “recursive pose estimation.” Id.
at 40. Petitioner also refers to its contentions regarding claim limitation 1(b)
for Welch-HiBall’s suggestion to use inertial hybrid systems for obtaining
relative motion data. Id.

The claim requires that the pose is “determined with respect to said
world coordinates.” Petitioner contends Welch-HiBall teaches pose
ca
