Trials@uspto.gov
571-272-7822

Paper No. 12
Date: December 17, 2019
UNITED STATES PATENT AND TRADEMARK OFFICE

BEFORE THE PATENT TRIAL AND APPEAL BOARD

LG ELECTRONICS INC.,
Petitioner,

v.

CYWEE GROUP LTD,
Patent Owner.

IPR2019-01203
Patent 8,441,438 B2

Before PATRICK M. BOUCHER, KAMRAN JIVANI, and
CHRISTOPHER L. OGDEN, Administrative Patent Judges.

OGDEN, Administrative Patent Judge.

DECISION
Granting Institution of Inter Partes Review
35 U.S.C. § 314
Granting Motion for Joinder
35 U.S.C. § 315(c); 37 C.F.R. § 42.122

I. INTRODUCTION
LG Electronics Inc. (“Petitioner”)1 filed (1) a Petition for inter partes
review (Paper 2, “Pet.”) of claims 1, 4, 5, 14–17, and 19 of U.S. Patent No.
8,441,438 B2 (Ex. 1001, “the ’438 patent”); and (2) a Motion for Joinder
(Paper 3, “Mot.”) with IPR2019-00143 (the “related IPR” or “ZTE IPR”),
for which we instituted an inter partes review on May 17, 2019. ZTE (USA),
Inc. v. Cywee Group Ltd., IPR2019-00143, Paper 7 (PTAB May 17, 2019)
(“IPR2019-00143 Dec.”). CyWee Group Ltd. (“Patent Owner”)2 filed a
Preliminary Response (Paper 8, “Prelim. Resp.”). Patent Owner’s arguments
regarding the Motion for Joinder appear solely in the Preliminary Response.
See Prelim. Resp. 59–63.
At our discretion, we may institute an inter partes review when “the
information presented in the petition . . . and any response . . . shows that
there is a reasonable likelihood that the petitioner would prevail with respect
to at least 1 of the claims challenged in the petition.” 35 U.S.C. § 314(a).
Applying that standard, we institute an inter partes review of claims 1, 4, 5,
14–17, and 19 of the ’438 patent, for the reasons explained below. We also
grant the Motion for Joinder, joining Petitioner as a party to the related IPR.

1 Petitioner identifies itself and LG Electronics U.S.A., Inc. as the real
parties in interest. Pet. 1. Petitioner also “further identifies” as real parties in
interest “the parties identified in IPR2019-00143 (to which this petition
seeks joinder): ZTE (USA), Inc. and ZTE Corporation.” Id.
2 Patent Owner identifies itself as the real party in interest. Paper 5, 2.
II. BACKGROUND

A. RELATED PROCEEDINGS
In addition to the related IPR, the parties identify the following as
related matters: CyWee Group Ltd. v. ZTE (USA) Inc., No. 3:17-cv-02130
(S.D. Cal.); CyWee Group Ltd. v. Google, Inc., No. 1:18-cv-00571 (D. Del.);
CyWee Group Ltd. v. HTC Corporation et al., No. 2:17-cv-00932 (W.D.
Wash.); CyWee Group Ltd. v. Motorola Mobility LLC, No. 1:17-cv-00780
(D. Del.); CyWee Group Ltd. v. Huawei Technologies Co., Inc. et al., No.
2:17-cv-00495 (E.D. Tex.); CyWee Group Ltd. v. LG Electronics, Inc. et al.,
No. 3:17-cv-01102 (S.D. Cal.); CyWee Group Ltd. v. Samsung
Electronics Co. Ltd. et al., No. 2:17-cv-00140 (E.D. Tex.); CyWee Group
Ltd. v. Apple Inc., No. 4:14-cv-01853 (N.D. Cal.); and Google LLC v. CyWee
Group Ltd., IPR2018-01258 (PTAB) (trial instituted Dec. 11, 2018). Pet. 1–
2; Paper 5, 2–3.
B. THE ’438 PATENT (EX. 1001)
The ’438 patent “relates to a three-dimensional (3D) pointing device.”
Ex. 1001, 1:17–18. The pointing device uses a “six-axis motion sensor
module” to measure movements and rotations of the device. Id. at 1:18–23.
The device then compensates for accumulated measurement errors, to obtain
actual deviation angles in the device’s spatial reference frame. Id. at 1:23–
26. The ’438 patent describes its invention relative to the prior art shown in
Figure 1, reproduced below:
Figure 1, above, depicts handheld 3D pointing device 110, which a user may
point at screen 122 of display device 120. Ex. 1001, 1:28–30. The figure also
depicts a reference frame, called the “spatial pointer reference frame,”
associated with pointing device 110, which is defined by coordinate axes XP,
YP, and ZP (113, 112, and 111, respectively). Id. at 1:38–41.
Figure 3 of the ’438 patent, reproduced below, shows the pointing
device’s hardware components:
Figure 3, above, is an exploded diagram showing 3D pointing device 300.
Ex. 1001, 7:26–28. Within housing 330, formed of top cover 310 and bottom
cover 320, are rotation sensor 342, accelerometer 344, data transmitting unit
346, and computing processor 348, each attached to printed circuit board
340. Id. at 7:36–55.
Some of the above hardware components are also depicted in Figure
4, reproduced below:
Figure 4, above, is a schematic block diagram showing the relationship
between rotation sensor 342, accelerometer 344, data transmitting unit 346,
and computing processor 348. Box 302 represents a “six-axis motion sensor
module,” which groups together rotation sensor 342 and accelerometer 344.
Ex. 1001, 7:59–61. Box 304 represents a “processing and transmitting
module,” which groups together data transmitting unit 346 and computing
processor 348. Id. at 7:61–63.
Figure 4 also includes arrows from rotation sensor 342 and
accelerometer 344 to data transmitting unit 346, depicting the flow of first
and second signal sets, respectively, and an arrow from data transmitting unit
346 to computing processor 348. See Ex. 1001, 7:64–8:26. The first signal set,
from rotation sensor 342, includes “angular velocities ωx, ωy, and ωz
associated with the movements and rotations of the 3D pointing device”
about the coordinate axes of the reference frame. Id. at 7:65–8:2. The second
signal set, from accelerometer 344, includes “axial accelerations Ax, Ay, Az
associated with the movements and rotations of the 3D pointing device . . .
along each of the three orthogonal coordinate axes XP YP ZP of the spatial
pointer reference frame.” Id. at 8:4–8.
Using the first and second signal sets, the 3D pointing device
compensates for accumulated errors, over time, in the device’s estimation of
its spatial orientation. See Ex. 1001, 1:17–26, 4:6–30. Figure 7, reproduced
below, depicts a flowchart embodying this process:

The process depicted above in Figure 7 starts with either initializing a new
state or “obtaining a previous state of the six-axis motion sensor module (. . .
steps 705, 710).” Ex. 1001, 10:66–11:1. This state is in the form of “a first
quaternion[3] associated with previous angular velocities ωx, ωy, ωz gained
from the motion sensor signals of the six-axis motion sensor module at a
previous time T−1.” Id. at 11:2–4.
The method proceeds by “obtaining measured angular velocities ωx,
ωy, ωz gained from the motion sensor signals of the six-axis motion sensor
module at a current time T (. . . steps 715, 720),” to form a second
quaternion representing the “current state.” Ex. 1001, 11:6–8, 12:32–60. The
method then obtains a “measured state” using sets of axial accelerations:
“measured axial accelerations Ax, Ay, Az” from the accelerometer (step
725), and “predicted axial accelerations Ax′, Ay′, Az′,” which are calculated
based on the measured angular velocities (step 730). Id. at 11:6–12, 12:61–
13:24. Using the “measured state,” the method next obtains a third
quaternion, representing an “updated state,” by comparing the current state
with the measured state (step 735). Id. at 11:15–18, 13:25–14:34.
“[T]o provide a continuous loop,” the method then outputs and
substitutes the updated state or third quaternion (step 740) into the first
quaternion or previous state (step 710). Ex. 1001, 11:22–29. Ultimately, the
method generates a resultant deviation, in terms of yaw, pitch, and roll
angles, with respect to the axes of the spatial pointer reference frame. Id. at
14:47–15:7. According to the ’438 patent, one may use these deviation
angles to map locations from 3D space to corresponding locations that
indicate where the device is pointing on a 2D display device. See id. at
15:39–17:40, Figs. 8, 9.

3 Petitioner’s declarant, Mr. Andrews, explains that a quaternion is an
extension of complex numbers which can represent rotations in three-
dimensional space in a way that is computationally efficient. Ex. 1003
¶¶ 41–42 (citing Eric Robert Bachmann, Inertial and Magnetic Tracking of
Limb Segment Orientation for Inserting Humans into Synthetic
Environments 56 (Dec. 2000) (unpublished Ph.D. dissertation, Naval
Postgraduate School), Ex. 1014). Patent Owner does not contest that
characterization.
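As context for the quaternion-based loop that Figure 7 describes, the following Python sketch illustrates the general shape of such an update cycle: propagate a previous-state quaternion by the measured angular velocities, predict axial accelerations from the propagated state, and correct the state by comparing the prediction with the measured accelerations. This is an illustrative reconstruction only, not the ’438 patent’s claimed method; the integration scheme, the `gain` value, and all function names are assumptions chosen for readability.

```python
import numpy as np

def quat_mul(a, b):
    # Hamilton product of two quaternions [w, x, y, z].
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def predict_accel(q):
    # Predicted axial accelerations Ax', Ay', Az': world-frame gravity
    # expressed in the device frame for unit quaternion q.
    w, x, y, z = q
    return np.array([
        2*(x*z - w*y),
        2*(y*z + w*x),
        w*w - x*x - y*y + z*z,
    ])

def update(q_prev, omega, accel_meas, dt, gain=0.05):
    # Propagate the previous state by the measured angular velocities
    # (wx, wy, wz) to get a current-state quaternion (cf. steps 715-720).
    q_cur = q_prev + 0.5 * dt * quat_mul(q_prev, np.array([0.0, *omega]))
    q_cur /= np.linalg.norm(q_cur)
    # Compare measured accelerations with the prediction, and nudge the
    # state by a small correction (cf. steps 725-735).
    a_meas = accel_meas / np.linalg.norm(accel_meas)
    error = np.cross(a_meas, predict_accel(q_cur))
    q_upd = q_cur + 0.5 * dt * gain * quat_mul(q_cur, np.array([0.0, *error]))
    return q_upd / np.linalg.norm(q_upd)
```

In a continuous loop, the returned quaternion would be fed back in as the previous state each sampling period (cf. step 740).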
C. CHALLENGED CLAIMS AND ASSERTED GROUNDS OF UNPATENTABILITY
Petitioner challenges the patentability of claims 1, 4, 5, 14–17, and 19
of the ’438 patent under 35 U.S.C. § 103(a) (2006),4 as summarized in the
following table:

Claims Challenged | 35 U.S.C. § | References
1, 4, 5, 14–17, 19 | 103(a) | Yamashita5 and Bachmann6
1, 4, 5, 14–17, 19 | 103(a) | Nasiri7 (including Sachs8) and Song9

Pet. 6–7. Independent claims 1 and 14, which exemplify the other claims,
are as follows:
4 Because the filing date of the ’438 patent was before March 16, 2013, the
applicable version of 35 U.S.C. § 103 is the one that was in force prior to the
Leahy–Smith America Invents Act. See Pub. L. No. 112-29, § 3(n)(1), 125
Stat. 284, 293 (2011).
5 Yamashita et al., US 8,267,785 B2, issued Sept. 18, 2012 (“Yamashita”).
Ex. 1006.
6 Bachmann et al., US 7,089,148 B1, issued Aug. 8, 2006 (“Bachmann”). Ex.
1007.
7 Nasiri et al., US 8,462,109 B2, issued June 11, 2013 (“Nasiri”) (Ex. 1008).
8 Sachs et al., US 2009/0265671 A1, published Oct. 22, 2009 (“Sachs”) (Ex.
1009). Nasiri incorporates the entirety of Sachs by reference. See Ex. 1008,
1:47–49, 1:57–58, 13:65–14:3.
9 Song et al., US 2007/0299626 A1, published Dec. 27, 2007 (“Song”) (Ex.
1010).
1. A three-dimensional (3D) pointing device subject to
movements and rotations in dynamic environments,
comprising:
a housing associated with said movements and rotations of the
3D pointing device in a spatial pointer reference frame;
a printed circuit board (PCB) enclosed by the housing;
a six-axis motion sensor module attached to the PCB,
comprising a rotation sensor for detecting and generating a
first signal set comprising angular velocities ωx, ωy, ωz
associated with said movements and rotations of the 3D
pointing device in the spatial pointer reference frame,
an accelerometer for detecting and generating a second signal
set comprising axial accelerations Ax, Ay, Az associated
with said movements and rotations of the 3D pointing device
in the spatial pointer reference frame; and
a processing and transmitting module, comprising a data
transmitting unit electrically connected to the six-axis
motion sensor module for transmitting said first and second
signal sets thereof and a computing processor for receiving
and calculating said first and second signal sets from the
data transmitting unit, communicating with the six-axis
motion sensor module to calculate a resulting deviation
comprising resultant angles in said spatial pointer reference
frame by utilizing a comparison to compare the first signal
set with the second signal set whereby said resultant angles
in the spatial pointer reference frame of the resulting
deviation of the six-axis motion sensor module of the 3D
pointing device are obtained under said dynamic
environments, wherein the comparison utilized by the
processing and transmitting module further comprises an
update program to obtain an updated state based on a
previous state associated with said first signal set and a
measured state associated with said second signal set;
wherein the measured state includes a measurement of said
second signal set and a predicted measurement obtained
based on the first signal set without using any derivatives of
the first signal set.
14. A method for obtaining a resulting deviation including
resultant angles in a spatial pointer reference frame of a three-
dimensional (3D) pointing device utilizing a six-axis motion
sensor module therein and subject to movements and rotations
in dynamic environments in said spatial pointer reference
frame, comprising the steps of:
obtaining a previous state of the six-axis motion sensor module;
wherein the previous state includes an initial-value set
associated with previous angular velocities gained from the
motion sensor signals of the six-axis motion sensor module
at a previous time T−1;
obtaining a current state of the six-axis motion sensor module
by obtaining measured angular velocities ωx, ωy, ωz gained
from the motion sensor signals of the six-axis motion sensor
module at a current time T;
obtaining a measured state of the six-axis motion sensor module
by obtaining measured axial accelerations Ax, Ay, Az gained
from the motion sensor signals of the six-axis motion sensor
module at the current time T and calculating predicted axial
accelerations Ax′, Ay′, Az′ based on the measured angular
velocities ωx, ωy, ωz of the current state of the six-axis
motion sensor module without using any derivatives of the
measured angular velocities ωx, ωy, ωz; said current state of
the six-axis motion sensor module is a second quaternion
with respect to said current time T; comparing the second
quaternion in relation to the measured angular velocities ωx,
ωy, ωz of the current state at current time T with the
measured axial accelerations Ax, Ay, Az and the predicted
axial accelerations Ax′, Ay′, Az′ also at current time T;
obtaining an updated state of the six-axis motion sensor module
by comparing the current state with the measured state of the
six-axis motion sensor module; and
calculating and converting the updated state of the six axis
motion sensor module to said resulting deviation comprising
said resultant angles in said spatial pointer reference frame
of the 3D pointing device.
Ex. 1001 at 18:54–19:26, 21:8–45. Claim 19 is also independent, and similar
to claim 14. See id. at 22:17–54. Claims 4 and 5 depend from claim 1, while
claims 15–17 depend from claim 14. See id. at 19:37–48, 21:46–22:8.
Petitioner relies on the Declaration of Scott Andrews, Oct. 31, 2018.
Ex. 1003. Patent Owner relies on the Declaration of Joseph LaViola, Ph.D.,
Feb. 20, 2018. Ex. 2001. Patent Owner also challenges Mr. Andrews’s
qualifications, on the ground that he is not a person of at least ordinary skill
in the art. Prelim. Resp. 58–59. However, for the purpose of deciding
whether to institute an inter partes review, any genuine issue of material fact
created by differing testimonial evidence is “viewed in the light most
favorable to the petitioner.” See 37 C.F.R. § 42.108(c).10 Therefore, solely
for the purpose of this decision, we accord weight to Mr. Andrews’s
testimony. We will consider any arguments and evidence that Patent Owner
presents on this issue during the trial.11
III. GROUNDS OF THE PETITION

In the related IPR, we instituted an inter partes review on the same
grounds as this Petition. IPR2019-00143 Dec. 8–10, 38. Petitioner
“challenges the same claims” and “relies on the same substantive arguments
and substantive evidentiary record” as the petition in the related IPR. See
Mot. 1. However, Patent Owner’s Preliminary Response contains additional
and more extensive arguments compared to its preliminary response in the
related IPR. Considering these additional arguments, and all the arguments
and evidence specific to this proceeding, we determine for the reasons below
that the Petition “warrants the institution of an inter partes review.” 35
U.S.C. § 315(c).

10 We note that a witness does not necessarily need to be a person of ordinary
skill in the art to testify as an expert. See Patent Trial and Appeal Board
Consolidated Trial Practice Guide 34 (Nov. 2019) (citing Sundance, Inc. v.
DeMonte Fabricating Ltd., 550 F.3d 1356, 1363–64 (Fed. Cir. 2008)),
https://go.usa.gov/xpvPF.
11 We note that in the related IPR, Patent Owner has raised this issue in its
Patent Owner Response. See IPR2019-00143, Paper 18 at 62 (PTAB Aug. 9,
2019).
A claim is unpatentable under 35 U.S.C. § 103 if the differences
between the claimed subject matter and the prior art are “such that the
subject matter as a whole would have been obvious at the time the invention
was made to a person having ordinary skill in the art to which said subject
matter pertains.” KSR Int’l Co. v. Teleflex Inc., 550 U.S. 398, 406 (2007).
Additionally, the obviousness inquiry typically requires that we analyze
“whether there was an apparent reason to combine the known elements in
the fashion claimed by the patent at issue.” Id. at 418 (citing In re Kahn, 441
F.3d 977, 988 (Fed. Cir. 2006)). A sufficient ground for obviousness in a
Petition must “articulate specific reasoning, based on evidence of record, to
support the legal conclusion of obviousness.” In re Magnum Oil Tools Int’l,
Ltd., 829 F.3d 1364, 1380 (Fed. Cir. 2016) (citing KSR, 550 U.S. at 418); see
also 35 U.S.C. § 322(a)(3); 37 C.F.R. §§ 42.22(a)(2), 42.204(b)(4).
The obviousness inquiry is based on underlying factual
determinations, including (1) the scope and content of the prior art, (2) any
differences between the claimed subject matter and the prior art, (3) the level
of skill in the art, and (4) any objective indicia of obviousness or non-
obviousness (i.e., secondary considerations) that may be in evidence. See
Graham v. John Deere Co., 383 U.S. 1, 17–18 (1966). We address these
factors in the sections below, and conclude that Petitioner has shown that
there is a reasonable likelihood of prevailing on claims 1, 4, and 5 (but not
claims 14–17 or 19) based on the combination of Yamashita and Bachmann.
We do not conclude that there is a reasonable likelihood of prevailing based
on the combination of Nasiri and Song.
A. LEVEL OF ORDINARY SKILL IN THE ART
One of the Graham factors is the level of ordinary skill in the
pertinent art at the time of the invention. See 383 U.S. at 17. The level of
ordinary skill is also relevant to how we construe the patent claims. See
Phillips v. AWH Corp., 415 F.3d 1303, 1312–13 (Fed. Cir. 2005) (en banc).
The “person of ordinary skill in the art” is a hypothetical construct, from
whose vantage point we assess obviousness. In re Rouffet, 149 F.3d 1350,
1357 (Fed. Cir. 1998). This legal construct “presumes that all prior art
references in the field of the invention are available to this hypothetical
skilled artisan.” Id. (citing In re Carlson, 983 F.2d 1032, 1038 (Fed. Cir.
1993)).
Petitioner’s declarant, Scott Andrews, opines that a person of ordinary
skill in the art

would have been familiar with motion sensors (such as
gyroscopes, accelerometers, and magnetometers) and mobile
device technology. Such [person of ordinary skill in the art]
would have, at minimum, a bachelor’s degree in computer
science, computer engineering, electrical engineering, or a
related field, with at least two years of experiences in research,
design, or development of pointing devices that utilizing motion
sensors. Extensive experience and technical training may
substitute for educational requirements, while advanced
education such as a relevant MS or PhD might substitute for
experience.
Ex. 1003 ¶ 22; see also Pet. 10. Patent Owner’s declarant, Dr. LaViola,
opines that a person of ordinary skill in the art would have had

at least a Bachelor’s Degree in Computer Science, Electrical
Engineering, Mechanical Engineering, or Physics, or equivalent
work experience, along with knowledge of sensors (such as
accelerometers, gyroscopes, and magnetometers), and mobile
computing technologies. In addition, a [person having ordinary
skill in the art] would be familiar with Kalman filters and
[extended Kalman filters], and with equations typically used
with such filters.

Ex. 2001 ¶ 27; see also Prelim. Resp. 58–59 (arguing that according to
Patent Owner’s proposed definition, Mr. Andrews has less than an ordinary
level of skill in the art).
These positions are the same as those proposed in the related IPR. See
IPR2019-00143 Dec. 10–12. There, we determined that the two competing
articulations present a genuine issue of material fact, and we thus accepted
the minimum of the education and experience level that Mr. Andrews
proposes. Id. (citing 37 C.F.R. § 42.108(c)).
For the same reason, solely for the purpose of this decision, we adopt
the lower end of Petitioner’s articulation as the level of ordinary skill in the
art. This level of ordinary skill is consistent with the ’438 patent, which does
not suggest the need for an advanced degree or many years of experience in
order to practice in the field. The patent also presumes a level of experience
with sensors and mobile devices. We will consider this and any further
arguments and evidence presented at trial, including through cross-
examination of the respective declarants.
B. CLAIM CONSTRUCTION
Petitioner proposes identical constructions to those that the petitioner
proposed in the related IPR. Particularly, Petitioner’s arguments address the
alleged plain meaning of the terms “three-dimensional (3D) pointing
device,” “six-axis motion sensor module,” “receiving and calculating said
first and second signal sets,” and phrases in claims 1, 14, and 19 that include
the terms “comparison” or “comparing.” Pet. 11–15. However, Petitioner
consents to the Board’s construction in the related IPR. See Pet. 11.12
Likewise, Patent Owner asserts that the Board’s constructions in the
related IPR “are applicable here.” Prelim. Resp. 16. Thus, for the purpose of
this decision, we apply the same construction of claim terms that we applied
in the institution decision of the related IPR. See IPR2019-00143 Dec. 12–
16. These two constructions are as follows: (1) Comparison and comparing
refer to “the calculating and obtaining of the actual deviation angles of the
3D pointing device . . . with respect to the first reference frame or spatial
pointing frame XP YP ZP utilizing signals generated by motion sensors while
reducing or eliminating noises associated with said motion sensors.” Id. at
15 (quoting Ex. 1001, 2:28–32). (2) Attached to the PCB means “attached to
one or more PCBs.” Id. at 16.
Although we did not construe the term “three-dimensional (3D)
pointing device” in the related IPR, Patent Owner argues that we should
construe it now. See Prelim. Resp. 16; but see Pet. 13 (arguing that the term
“should be given its plain and ordinary meaning” (citing Ex. 1003 ¶¶ 55–57)).
Patent Owner proposes that we construe the term as “a handheld device that
detects the motion and orientation of said device in three-dimensions and is
capable of translating the detected motions to control an output on a
display.” Prelim. Resp. 17. According to Patent Owner, this construction is
consistent with a prior construction in a parallel case in the Eastern District
of Texas. Id. (citing Ex. 2001 ¶¶ 47–50; Ex. 2003, 8; Ex. 2004, 2; Ex. 2006,
6–7).

12 Petitioner argues that “the Board should apply [the broadest reasonable
interpretation standard] because Petitioner is seeking joinder as a passive co-
petitioner to the [related] IPR.” Pet. 11. On the current record, we would
reach the same claim construction, whether under the broadest reasonable
interpretation standard or the standard used to construe claims in a civil
action under 35 U.S.C. § 282(b).
Patent Owner proposed the same construction in the related IPR,
based on essentially the same arguments. See ZTE v. Cywee, IPR2019-
00143, Paper 6 at 19–20 (PTAB Feb. 20, 2019). The Board declined to adopt
this construction, however, because the construction “would not affect our
decision to institute.” IPR2019-00143 Dec. 13. Likewise, adopting Patent
Owner’s proposed construction of 3D pointing device would not affect our
decision to institute with respect to this Petition, based on the evidence of
record. Therefore, we do not construe this term. See Nidec Motor Corp. v.
Zhongshan Broad Ocean Motor Co., 868 F.3d 1013, 1017 (Fed. Cir. 2017)
(“[W]e need only construe terms ‘that are in controversy, and only to the
extent necessary to resolve the controversy’” (quoting Vivid Techs., Inc. v.
Am. Sci. & Eng’g, Inc., 200 F.3d 795, 803 (Fed. Cir. 1999))).
C. COMBINATION OF YAMASHITA AND BACHMANN

Petitioner’s first ground in the Petition is that claims 1, 4, 5, 14–17,
and 19 would have been obvious over Yamashita in view of Bachmann. Pet.
7. We address this ground below.
1. Overview of Yamashita

Yamashita discloses technology associated with the Nintendo Wii
Remote (a game controller), and its associated Wii MotionPlus module (an
attachable gyrosensor unit). Prelim. Resp. 8. Yamashita’s Figure 3,
reproduced below, illustrates both parts:
Figure 3 depicts game controller 5, and associated gyrosensor unit 7 in a
non-attached configuration. Controller 5 includes housing 31 with various
operation buttons 32a–32h. Ex. 1006, 9:42–57. Gyrosensor unit 7 includes
gyrosensors “for sensing an angular velocity around the three axes,” and is
detachably attached on connector 33 of controller 5. Id. at 11:14–17.
Figure 7, reproduced below, is a block diagram showing the interior
parts of controller 5 and gyrosensor unit 7:
Figure 7 depicts game controller 5 and gyrosensor unit 7. On a substrate (not
shown) within game controller 5, acceleration sensor 37 detects acceleration
in three axes, and outputs the acceleration data to communication section 36,
which includes microcomputer 42. Ex. 1006, 11:52–65, 12:62–67, 13:11–13.
Connector 33, also on substrate 30, detachably connects to plug 53 on
gyrosensor unit 7. Id. at 11:8–9, 16–20. Gyrosensor unit 7 also includes
microcomputer 54, and gyrosensors 55 and 56, which sense angular velocity
around three axes (X, Y, and Z) and the unit transmits that information to
controller 5. Id. at 14:28–34.
Yamashita discloses that one may calculate the estimated posture or
orientation of controller 5 with reference to the measured acceleration and
angular velocity. Ex. 1006, 19:29–40. Yamashita’s Figure 23, a relevant
portion of which we reproduce below, is an overview of processing steps for
making this calculation:

This reproduced portion of Figure 23 depicts angular velocity data d1 from
gyrosensor unit 7, and acceleration data d2 from acceleration sensor 37,
which are combined in posture estimation step p1 to produce estimated
posture d3.13 Ex. 1006, 19:1–8. According to Yamashita, “any method is
usable” for “calculating the estimated posture based on the acceleration and
angular velocity.” Id. at 19:8–10.

13 Although Figure 23 labels d3 as “estimated velocity,” the associated text
consistently refers to d3 as an “estimated posture.” Ex. 1006, 19:8, 41, 44.
2. Overview of Bachmann

Bachmann describes “a method of determining an orientation of a
sensor.” Ex. 1007, code (57). “[B]y tracking changes in the orientation of the
sensor with respect to the local magnetic field vector and the local gravity
vector,” a sensor “can track the orientation of a body.” Id. at 5:21–25; see
also id. at 4:59–60 (describing the invention as “a method and apparatus for
tracking the posture of a body”). In addition, “a system having a plurality of
sensors, each mounted to a limb of an articulated rigid body can be used to
track the orientation of each limb.” Id. at 5:25–28.
Figure 4 of Bachmann is reproduced below:

Figure 4, above, shows an embodiment “of an overall system
implementation in accordance with the principles” described by Bachmann.
Ex. 1007, 13:33–35. This embodiment uses three sensors 401 to track the
posture of an articulated rigid body in the form of human body 402. Id. at
13:35–36, 13:64–67. Sensors 401 send sensor information to processing unit
403, which calculates the posture of body 402 and outputs a display signal to
display 404, “thereby enabling the movement of the articulated rigid body
402 to be incorporated into a synthetic or virtual environment and then
displayed.” Id. at 14:23–26.
In addition to tracking the posture of a human body as shown above in
Figure 4, the disclosed sensors “can be used to track motion and orientation
of simple rigid bodies as long as they are made of non-magnetic materials.
Examples include, but are not limited to hand-held devices, swords, pistols,
or simulated weapons.” Ex. 1007, 13:43–51; see also id. at 13:57–62
(suggesting use of the sensors to track “non-magnetic prosthetic devices,
robot arms, or other machinery”).
Bachmann uses a filter, in conjunction with data supplied by the
sensors, “to produce a sensor orientation estimate expressed in quaternion
form.” Ex. 1007, 7:32–34. In one embodiment, “the sensors include a three-
axis magnetometer and a three-axis accelerometer.” Id. at 7:34–35. In
another embodiment, “the magnetometers and accelerometers are
supplemented with angular rate detectors configured to detect the angular
velocity of the sensor.” Id. at 7:34–40.
Figure 3 of Bachmann is a block diagram of this filter:

Ex. 1007, 4:46–48. As depicted in Figure 3 above, the filter takes
measurement inputs from angular rate sensors 33, which measure sensor
orientation to produce angular rate information 37, and these measurements
contain noise. Id. at 10:17–20. According to Bachmann, “output 33 of
angular rate detectors tends to drift over time . . . unless this orientation is
continuously corrected using ‘complementary’ data from additional sensors
(here, accelerometer 31 and magnetometer 32).” Id. at 10:36–42. Thus, the
filter converts angular rate data 37 to a rate quaternion q̇ and corrects q̇ by
adding a correction factor q̇ε derived from accelerometers 31 and
magnetometers 32. See id. The corrected rate quaternion is then integrated
(42) and normalized (43) to produce output q̂ (39), which “describes [a] new
value for estimated orientation of the sensor.” Id. at 10:33–36.
To obtain correction factor q̇ε, the filter combines accelerometer 31
and magnetometer 32 measurements into a single vector y⃗0 (34). See Ex.
1007, 8:37–51. The filter then compares measurement vector y⃗0 with
calculated vector y⃗(q̂) (35a), which is a predicted value derived from the
local gravitational and magnetic fields and the updated orientation estimate
q̂. See id. at 8:52–9:8, 9:65–10:2. Measurement error ε⃗(q̂) (36) is the
difference between measurement vector y⃗0 and calculated vector y⃗(q̂). Id. at
9:13, 10:2–5. The filter uses error ε⃗(q̂) in equations to obtain the correction
factor q̇ε and update the next orientation estimate q̂. See id. at 10:46–11:26,
Fig. 3.
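To make the data flow just described concrete, the following Python sketch mimics the complementary structure of Bachmann's Figure 3: stack normalized accelerometer and magnetometer readings (the measurement vector), predict the same quantities from the current orientation estimate, derive an error term, and use it to correct the gyroscope rates before integrating and normalizing. This is a simplified illustration under assumed reference vectors, gain, and naming conventions, not Bachmann's actual filter equations.

```python
import numpy as np

# Assumed world-frame reference directions (unit vectors).
GRAVITY = np.array([0.0, 0.0, 1.0])   # local gravity direction
MAG = np.array([1.0, 0.0, 0.0])       # local magnetic field direction

def quat_mul_vec(q, v):
    # Hamilton product q * [0, v], turning body rates into a rate quaternion.
    qw, qx, qy, qz = q
    vx, vy, vz = v
    return np.array([
        -qx*vx - qy*vy - qz*vz,
        qw*vx + qy*vz - qz*vy,
        qw*vy - qx*vz + qz*vx,
        qw*vz + qx*vy - qy*vx,
    ])

def to_body(q, v):
    # Express world-frame vector v in the sensor (body) frame of
    # unit quaternion q = [w, x, y, z].
    w, x, y, z = q
    r = np.array([
        [w*w + x*x - y*y - z*z, 2*(x*y + w*z),         2*(x*z - w*y)],
        [2*(x*y - w*z),         w*w - x*x + y*y - z*z, 2*(y*z + w*x)],
        [2*(x*z + w*y),         2*(y*z - w*x),         w*w - x*x - y*y + z*z],
    ])
    return r @ v

def filter_step(q, omega, accel, mag, dt, gain=0.1):
    # Measurement (cf. 34): normalized accelerometer + magnetometer readings.
    a0 = accel / np.linalg.norm(accel)
    m0 = mag / np.linalg.norm(mag)
    # Prediction (cf. 35a): the same quantities predicted from estimate q.
    a_pred = to_body(q, GRAVITY)
    m_pred = to_body(q, MAG)
    # Error (cf. 36), expressed here as a rotation-axis term via cross
    # products between measured and predicted directions.
    err = np.cross(a0, a_pred) + np.cross(m0, m_pred)
    # Correct the gyro rates, then integrate (cf. 42) and normalize (cf. 43).
    q_new = q + 0.5 * dt * quat_mul_vec(q, omega + gain * err)
    return q_new / np.linalg.norm(q_new)
```

Run in a loop, the gyro integration tracks fast motion while the accelerometer/magnetometer error term slowly pulls the estimate back toward the reference directions, which is the complementary behavior the decision attributes to Bachmann's filter.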
3. Obviousness Rationale

Petitioner contends that a person of ordinary skill in the art at the time
of the challenged invention would have had reason “to use Yamashita’s
game console device with Bachmann’s sensors and filter calculations.” Pet.
23. In particular, according to Petitioner, “additional sensors, and additional
types of sensors, would have yielded at least better error and noise control.”
Id. (citing Ex. 1003 ¶ 84). Petitioner argues that “magnetic, angular rate and
gravitational (acceleration) sensors, known in the art as MARG sensors,
were already commercially available” at the time of invention, and could
have been “integrated in a known fas
