`U.S. Pat. No. 8,552,978
`
`
`
`
`
`UNITED STATES PATENT AND TRADEMARK OFFICE
`_______________
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`_____________
`
`Google LLC
`
`Petitioner
`
`v.
`
`Cywee Group Ltd.
`
`
`
`
`
`
`
`
`
`
`
Patent Owner
`
`IPR2018-01257
`
`Patent No. 8,552,978
`
`
`
`
`
`
`PETITIONER’S OPPOSITION TO PATENT OWNER’S
`MOTION TO AMEND
`
`
`
`
`
`
`
`
`
`
`
TABLE OF CONTENTS

TABLE OF EXHIBITS

I.	CYWEE IS NOT ENTITLED TO A PRIORITY DATE EARLIER THAN JULY 6, 2011

II.	AT A MINIMUM, THE PROPOSED AMENDED CLAIMS ARE NOT ENTITLED TO THE BENEFIT OF THE PROVISIONAL APPLICATION.

	A.	Claim 19

		1.	“handheld” 3D pointing device [19(a)]

		2.	a display device built-in to and integrated with the 3D pointing device [19(g)]

	B.	Dependent Claim 20

III.	THE PROPOSED AMENDED CLAIMS WOULD BE OBVIOUS

	A.	Overview of the Combination

	B.	Rationale and Motivation Supporting the Combination

	C.	Reasonable Expectation of Success

	D.	Analogous Art

	E.	Claim Mapping

CERTIFICATE OF SERVICE
`
`
`
`
`
`
`
TABLE OF EXHIBITS

Exhibit No.	Description

1001	U.S. Pat. No. 8,552,978 (“the ’978 patent”).

1002	Declaration of Professor Majid Sarrafzadeh.

1003	C.V. of Professor Majid Sarrafzadeh.

1004	U.S. Pat. No. 7,089,148 (“Bachmann”).

1005	U.S. Pat. App. Pub. 2004/0095317 (“Zhang”).

1006	U.S. Pat. 7,158,118 (“Liberty”).

1007	Return of Service for Cywee Group Ltd. v. Google, Inc., Case No. 1-18-cv-00571 (D. Del.).

1008	Return of Service for Cywee Group Ltd. v. Huawei Technologies Co., Inc. et al., Case No. 2-17-cv-00495 (E.D. Tex.).

1009	File History of U.S. Pat. App. 13/176,771.

1010	Joint Claim Construction and Prehearing Statement in Cywee Group Ltd. v. Samsung Electronics Co. Ltd. et al., Case No. 2-17-cv-00140 (E.D. Tex.).

1011	Ex. D to Complaint of April 16, 2018 in Cywee Group Ltd. v. Google, Inc., Case No. 1-18-cv-00571 (D. Del.).

1012	Email of August 3, 2018 from Michael Shore to Luann Simmons.

1013	CyWee’s First Requests for Production of Documents in Cywee Group Ltd. v. Google, Inc., Case No. 1-18-cv-00571 (D. Del.).

1014	CyWee’s Opposition to Petitioner’s Motion for Joinder to Inter Partes Review IPR2018-01258 of February 8, 2019.

1015	CyWee’s Opp. to Defendants’ Motion to Stay Pending Inter Partes Review Proceedings in CyWee Group, Ltd. v. Samsung Elec. Co., Ltd., Case 2:17-cv-00140-WCB-RSP (E.D. Tex. Jan. 25, 2019).

1016	Complaint of April 16, 2018 in Cywee Group Ltd. v. Google, Inc., Case No. 1-18-cv-00571 (D. Del.).

1017	U.S. Pat. Pub. No. US 2010/0312468 A1 (“Withanawasam”).

1018	Rebuttal Declaration of Professor Majid Sarrafzadeh.

1019	Deposition Transcript of Dr. Joseph LaViola in IPR2018-01257, -01258 (May 22, 2019) (“LaViola Tr.”).

1020	U.S. Pat. No. 7,356,361 (“Hawkins”).

1021	U.S. Pat. No. 7,630,741 (“Siddiqui”).

1022	U.S. Pat. No. 8,738,103 (“Puente Baliarda”).

1023	USPTO PATFT database search results (search string “ref/7089148”).

1024	U.S. Pat. Pub. 2018/0153587 A1 (“van der Walt”).

1025	Deposition Transcript of Joseph LaViola in CyWee Group Ltd. v. Huawei Device Co. Ltd., Case No. 2017-cv-00495-WCB-RSP (E.D. Tex. September 25, 2018).
`
`
`
`
`
`Google respectfully submits this opposition to CyWee’s motion to amend
`
(“Mot.”). The motion should be denied in full for the following reasons.
`
`I.
`
`CYWEE IS NOT ENTITLED TO A PRIORITY DATE EARLIER
`THAN JULY 6, 2011
`To the extent CyWee seeks a priority date for its proposed amended claims
`
`earlier than the actual filing date of the ’771 application, CyWee has not met its
`
`burden. A patent owner seeking the benefit of an earlier priority date in an IPR bears
`
`an initial burden of production to demonstrate entitlement to priority. See Dynamic
`
`Drinkware, LLC v. Nat’l Graphics, Inc., 800 F.3d 1375, 1379-80 (Fed. Cir. 2015)
`
`(the initial burden of production for showing an earlier priority date rests with the
`
`patent owner, not the petitioner). At a minimum, the patent owner’s initial burden
`
`of production under § 119(e)/120 requires the patent owner to identify support in
`
`each application—including each intermediate application in the chain—stretching
`
back to the first application whose priority date is sought. The need to identify how

each (non-provisional) intermediate application independently satisfies the

requirements of § 120 is borne out of the statutory language requiring each earlier

application to be “similarly entitled to the benefit of the filing date of the first

application.” 35 U.S.C. § 120; Encyclopaedia Britannica, Inc. v. Alpine Elecs. of
`
`Am., Inc., 609 F.3d 1345, 1350-52 (Fed. Cir. 2010)(interpreting “similarly entitled”
`
`to require each intermediate application to independently satisfy all § 120
`
`
`
`1
`
`
`
`IPR2018-01257
`U.S. Pat. No. 8,552,978
`
`requirements).
`
`Here, CyWee has not met its burden of production for priority purposes under
`
`§ 119(e)/120. CyWee cites and discusses only the very first application (the ’558
`
`provisional) and the very last application (the ’771 application) in the priority chain
`
`that resulted in the ’978 patent. (Mot. 5-8). CyWee fails to even acknowledge that
`
`two intermediate applications are in the priority chain, namely, U.S. Appl. No.
`
`12/943,934 (filed Nov. 11, 2010) and U.S. Appl. No. 13/072794 (filed Mar. 28,
`
`2011). These intermediate applications—one of which is a CIP—are necessary for
`
`CyWee to satisfy the co-pendency requirement of § 119(e)/120, because the ’558
`
`provisional had already been abandoned as of its 1-year anniversary (on Jan. 6, 2011)
`
`by the time the ’771 application was filed (on July 6, 2011).
`
`The two intermediate applications (the ’934 and ’794 applications) have not
`
`been entered into evidence in this IPR2018-01257 proceeding. This absence of
`
`evidence, coupled with the absence of argument in CyWee’s motion, constitutes a
`
`failure to satisfy the burden of production under § 119(e)/120, including the
`
`requirement that each intermediate application must have co-pendency, a common
`
`inventor, a specific reference to earlier applications, and adequate support under
`
`§ 112(a) for each claim. See Encyclopaedia Britannica, 609 F.3d at 1350-52. As
`
`but one example, the intermediate ’934 application lacks support for “a smartphone”
`
`added in claim 20. (Ex. 1019, ¶34).
`
`
`
`2
`
`
`
`IPR2018-01257
`U.S. Pat. No. 8,552,978
`
`
`As a result, CyWee’s failure to meet its burden of production under Dynamic
`
Drinkware means that no proposed new claim is entitled to a priority date earlier than
`
`the actual filing date of the ’771 application (i.e., July 6, 2011).
`
`II. AT A MINIMUM, THE PROPOSED AMENDED CLAIMS ARE NOT
`ENTITLED TO THE BENEFIT OF THE PROVISIONAL
`APPLICATION.
`Assuming arguendo that CyWee did meet its burden of production (contrary
`
`to supra §I), the proposed amended claims are not entitled to the benefit of the
`
`provisional application. As a result, the relevant date for purposes of obviousness
`
`(infra §III), is no earlier than the filing date of the ’934 application (i.e., November
`
`11, 2010).
`
`A. Claim 19
`Claim 19 is not entitled to the benefit of the provisional application because
`
`the provisional does not support either (a) the genus of “handheld” 3D pointing
`
devices (infra §II.A.1) or (b) “a display device built-in to and integrated with the 3D

pointing device” (infra §II.A.2).
`
`1. “handheld” 3D pointing device [19(a)]
`The word “handheld” never appears in the provisional application. See
`
`generally Ex. 2012 (provisional). The word was added in the non-provisional ’934
`
`application. Ignoring this omission, CyWee points to the limited disclosure in the
`
`provisional of “a remote controller, a joystick or a cellular phone,” and then reasons
`
`
`
`3
`
`
`
`IPR2018-01257
`U.S. Pat. No. 8,552,978
`
`that these three devices are handheld. (Mot. 6)(citing Ex. 2012, ¶[0023]). This
`
`reasoning is faulty. Each of the three devices mentioned in paragraph [0023] may or
`
`may not be handheld depending on the device or its context. (Ex. 1018, ¶28). For
`
`example, a remote controller or a joystick can be built into a keyboard, tabletop, or
`
`laptop. A cellular phone can be hardwired into a speaker system and controlled
`
`using a keypad integrated in a tabletop console. In each of these scenarios, the device
`
is not handheld. (Ex. 1019, LaViola Tr., 62:7-16) (testifying that a “laptop” is not
`
`“handheld” because it is “designed to be put on your lap or on a table”). The
`
`provisional application is silent as to whether or not the three devices are handheld.
`
`It is not enough, for purposes of written description, that it would have been obvious
`
`to make these devices handheld. See Ariad Pharms., Inc. v. Eli Lilly & Co., 598 F.3d
`
`1336, 1352 (Fed. Cir. 2010) (en banc) (“[A] description that merely renders the
`
`invention obvious does not satisfy the [written description] requirement.”).
`
`CyWee argues that Fig. 1 of the provisional “depicts such a handheld
`
`embodiment of the device.” (Mot. 6). This figure, however, does not show a human
`
`hand holding the device, or show how the device is moved in three dimensions—by
`
`hand or otherwise. No size scale is provided in Figure 1 to know whether the device
`
`is of a size that can be physically lifted or is ergonomically designed to be lifted
`
`comfortably by a user's hand. (Ex. 1018, ¶31). The size of the device may be too
`
`big—or too small—to be used comfortably and without encumbrance or undue
`4
`
`
`
`
`
`IPR2018-01257
`U.S. Pat. No. 8,552,978
`
`strain. (Ex. 1019, LaViola Tr., 60:21-61:23)(testifying that the “essential
`
`characteristic” of a “handheld” device is “something that was ergonomically
`
`designed so that one could hold it in their hand or something small enough that could
`
`be held in the hand”). Figure 1 also provides insufficient information to know
`
`whether the device is attached to the surface on which it is sitting or is otherwise
`
`intended to be used while it rests on a tabletop—scenarios that Dr. LaViola admits
`
would render a device not “handheld”. (Ex. 1019, LaViola Tr., 62:7-16).
`
`2. a display device built-in to and integrated with the 3D pointing
`device [19(g)]
`There is also no support in the provisional application for element 20(j), which
`
`adds, among other things, “a display device built-in to and integrated with the 3D
`
`pointing device and associated with a display reference frame.”
`
`The device in Fig. 1, discussed above, shows a “Screen”—separate from the
`
3D pointing device—on which a cursor or a game is displayed. (Ex. 2012, Fig. 1;

id. at ¶[0024]). Because the display screen is separate from the 3D pointing device,
`
`the display is clearly not “built-in to and integrated with” the device. (Ex. 1018,
`
`¶31). There is no other relevant disclosure. CyWee thus hangs its entire argument
`
`on the mention of “a cellular phone” in paragraph [0023], from which CyWee
`
`assumes that the cellular phone has a built-in display integrated therein. (Mot. 7).
`
`The provisional, however, says nothing about whether this cellular phone has a
`
`
`
`5
`
`
`
`IPR2018-01257
`U.S. Pat. No. 8,552,978
`
`display. Nor does it suggest that the display is large enough to display “a movement
`
`pattern in the display reference frame,” as required by element [20(j)], rather than
`
`displaying merely a single line of numbers or letters (as many cellular phones do).
`
`(Ex. 1018, ¶29).
`
`B. Dependent Claim 20
`Dependent claim 20 is additionally not entitled to the benefit of the provisional
`
`because a “smartphone” is not supported by the provisional. CyWee’s proffered
`
support for the “smartphone” of claim 20 is the description of “a cellular phone.”
`
`(Mot. 7)(citing Ex. 2012, ¶[0023]). But a smartphone is a specific type of cellular
`
`phone. CyWee points to no disclosure in the provisional of the specific functionality
`
`and features necessary to turn a generic “cellular phone” into the claimed species
`
`that is a “smartphone.” (Ex. 1018, ¶30). The bare disclosure of a genus is
`
`insufficient to demonstrate possession of an undisclosed species. See Ariad, 598
`
`F.3d at 1352.
`
III. THE PROPOSED AMENDED CLAIMS WOULD BE OBVIOUS
`The proposed amended claims are also obvious under 35 U.S.C. §103(a) in
`
view of U.S. Pat. Pub. US 2010/0312468 A1 (“Withanawasam”)(Ex. 1017) in

combination with U.S. Pat. No. 7,089,148 (“Bachmann”)(Ex. 1004). Withanawasam published on
`
`December 9, 2010, from an application filed June 3, 2009, and is thus prior art
`
under pre-AIA 35 U.S.C. § 102(e)(1). Bachmann, also cited in the petition, issued

on August 8, 2006, and is thus prior art under 35 U.S.C. § 102(b).
`
`A. Overview of the Combination
`Withanawasam teaches a device, such as a smartphone, that uses sensors to
`
`obtain data that are processed to produce an orientation, where the orientation data
`
can be displayed on a built-in display in a variety of ways. (Ex. 1018, ¶55).
`
`Withanawasam does not restrict the choice of sensors, and does not expressly teach
`
`a method of integrating sensor data (“sensor fusion”) to calculate a device
`
orientation. (Ex. 1018, ¶55). Bachmann, however, teaches a method for accurately
`
`calculating a device orientation from three-axis magnetic, three-axis gyroscopic and
`
three-axis acceleration sensors. (Ex. 1018, ¶55). It would have been obvious to
`
`use Bachmann’s choice of sensors and Bachmann’s method of calculating
`
`orientation by fusing magnetic, gyroscopic and acceleration sensor outputs to
`
implement Withanawasam’s device. (Ex. 1018, ¶55).
`
`B. Rationale and Motivation Supporting the Combination
There was ample motivation to combine Withanawasam with Bachmann. A
`
`person of ordinary skill in the relevant timeframe would have understood from
`
`Withanawasam that smartphones with multiple sensors (including magnetic,
`
`gyroscopic and acceleration sensors) existed, and that these sensors were typically
`
`used to calculate a device orientation. (Ex. 1017, ¶0001)(Ex. 1018, ¶56). A person
`
`of ordinary skill would also have understood that the sensors themselves do not
`
`
`
`7
`
`
`
`IPR2018-01257
`U.S. Pat. No. 8,552,978
`
`produce an orientation value as an output, but rather the sensors’ output data must
`
`be processed further, by Withanawasam’s processor 110, to calculate a device
`
`orientation. (Ex. 1017, ¶¶0011-0012)(Ex. 1018, ¶56). Withanawasam, however,
`
`leaves open the exact configuration of sensors. (Ex. 1017, ¶¶0009-0013, 0015,
`
`0017-0018, 0008, 00024)(Ex. 1018, ¶¶57-60). Withanawasam also does not
`
`expressly teach a method for mathematically fusing sensor data. Thus, a person of
`
`ordinary skill, seeking to implement the smartphones described by Withanawasam,
`
`would have had a reason to look to relevant art to pick a known sensor configuration
`
`and method for mathematically fusing sensor data. (Ex. 1018, ¶¶56, 61).
`
`A person of skill would have understood that using Bachmann’s nine-axis
`
`sensor in Withanawasam’s smartphone would have provided several advantages.
`
`(Ex. 1018, ¶61). First, the teachings of Bachmann would have allowed a person of
`
`skill to choose sensors and fuse the sensor data accurately. (Ex. 1018, ¶61). Second,
`
`Bachmann’s nine-axis sensor would have allowed Withanawasam’s smartphone to
`
`obtain the orientation of the device in all rotational degrees of freedom (roll, pitch
`
`and yaw). (Ex. 1018, ¶61). Third, a person of skill would have understood that
`
`Bachmann’s nine-axis sensor and sensor fusion method would have allowed for
`
greater precision through overdetermination (i.e., information beyond that necessary
`
`to determine orientation), which enables better error control. (Ex. 1018, ¶61).
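For illustration only (this arithmetic is not drawn from the exhibits, though the symbols match Bachmann’s notation discussed below), the overdetermination point reduces to a simple count: orientation has only three rotational degrees of freedom, while the nine-axis module reports nine simultaneous measurements.

```latex
\[
\underbrace{(h_1, h_2, h_3)}_{\text{3-axis accelerometer}}
\;\cup\;
\underbrace{(b_1, b_2, b_3)}_{\text{3-axis magnetometer}}
\;\cup\;
\underbrace{(p, q, r)}_{\text{3-axis angular rate}}
\;\;\Rightarrow\;\;
9 \text{ measurements} > 3 \text{ degrees of freedom (roll, pitch, yaw)},
\]
```

leaving six redundant measurements available for error checking and filtering.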
`
`Bachmann’s nine-axis sensors were well-known in the art in the relevant
`8
`
`
`
`
`
`IPR2018-01257
`U.S. Pat. No. 8,552,978
`
timeframe. (Ex. 1004, 14:37-57)(Ex. 1018, ¶62). Bachmann further states that its

sensors and filter processing can be used in hand-held devices (like

Withanawasam’s). (Ex. 1004, 13:42-48)(Ex. 1018, ¶63). A person of ordinary skill
`
`in the art would have been motivated and able to use nine-axis MARG sensors like
`
those in Bachmann to implement Withanawasam’s smartphone, and could have used

MARG sensors implemented as Withanawasam’s integrated sensor devices to do
`
`so. (Ex. 1018, ¶64). It would also have been obvious to use Bachmann’s specific
`
filter (sensor fusion) method, which in Bachmann is a quaternion-based filter
`
`processing method that is computationally more efficient than processing that uses
`
`spatial (e.g., Euler) angle calculations. (Ex. 1004, 5:33-7:31)(Ex. 1018, ¶65).
`
`Bachmann’s quaternion-based techniques also avoid singularities that might
`
`otherwise occur at certain sensor orientations. (Ex. 1004, 5:33-7:31)(Ex. 1018, ¶65).
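As general mathematical background (not taken from Bachmann or the exhibits), the singularity avoided by quaternion processing is the familiar gimbal lock of Euler-angle kinematics: in a standard roll-pitch-yaw (φ, θ, ψ) formulation, the yaw-rate equation divides by cos θ,

```latex
\[
\dot{\psi} \;=\; \frac{q \sin\phi + r \cos\phi}{\cos\theta},
\]
```

which is undefined at θ = ±90°, whereas the quaternion update \(\dot{\hat{q}} = \tfrac{1}{2}\,\hat{q} \otimes (0, p, q, r)\) remains well defined at every orientation.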
`
`The combination of Withanawasam and Bachmann presented here is nothing
`
`more than the combination of known elements to achieve an expected improvement.
`
`Withanawasam’s smartphone has a housing, sensors, and a processor that estimates
`
the device’s orientation using the sensor outputs. (Ex. 1018, ¶66). Bachmann’s
`
`device does the same thing, but with different sensors and specific processing.
`
`Bachmann’s sensors were available on the commercial market as of 2001, and
`
`Bachmann’s calculations were known at least as early as 2004. (Ex. 1018, ¶66).
`
`Bachmann’s functional blocks (sensors and calculations) could have been
`9
`
`
`
`
`
`IPR2018-01257
`U.S. Pat. No. 8,552,978
`
`substituted for the similar functional blocks in Withanawasam using only ordinary
`
`skill as discussed below. (Ex. 1018, ¶66). The results would have been the expected
`
`improvement described in Bachmann. (Ex. 1018, ¶66).
`
`Thus, a person of ordinary skill would have been motivated to use
`
`Bachmann’s quaternion-based filter processing with a nine-axis MARG sensor
`
`because (1) that was its intended use and (2) it performed better than the alternatives.
`
`C. Reasonable Expectation of Success
`A person of ordinary skill would have reasonably expected success in using
`
`Bachmann’s quaternion-based filter processing with a nine-axis sensor in a
`
`smartphone like Withanawasam’s. (Ex. 1018, ¶68). As explained by Dr.
`
`Sarrafzadeh, smartphones with requisite sensors were well-known in the relevant
`
`timeframe, and a person of ordinary skill could have implemented Bachmann’s
`
`sensors and filter method using ordinary skill. (Ex. 1018, ¶68). CyWee’s expert, Dr.
`
`LaViola, has also admitted that many of the requisite skills in the art existed in the
`
`relevant timeframe. (Ex. 1019, LaViola Tr., 91:9-93:15, 84:21-85:9, 88:8-89:25).
`
`D. Analogous Art
`Both Withanawasam and Bachmann are analogous art. Both references relate
`
`to the field of the ’438 and ’978 patents, namely that of generating device
`
`orientations based on sensor outputs. (Ex. 1017, ¶0001, 0011-0012)(Ex. 1004, 1:18-
`
`20, 13:42-48)(Ex. 1018, ¶69). Both the Withanawasam and Bachmann references
`
`
`
`10
`
`
`
`IPR2018-01257
`U.S. Pat. No. 8,552,978
`
`further would have been reasonably pertinent to the particular problem with which
`
`the inventor is involved. (Ex. 1018, ¶69). Withanawasam teaches the use of sensors
`
`within devices to determine orientation (Ex. 1017, ¶0011), while Bachmann teaches
`
`a specific selection of sensors and method for fusing sensor data to produce an
`
`orientation of a tracked device. (Ex. 1004, generally)(Ex. 1018, ¶69). These
`
`teachings are reasonably pertinent to the particular problem with which the inventors
`
`of the ’438 and ’978 patents were involved. (Ex. 1018, ¶69).
`
`E. Claim Mapping
`Proposed Amended Claim 19
`“[19(a)] A method for compensating rotations of a 3D pointing
`device, which is handheld, comprising:”
`
`Withanawasam teaches a three-dimensional (3D) pointing device, in the
`
`form of a portable navigation device that can be a smartphone running certain
`
`software. Withanawasam states:
`
`“FIG. 1 is one embodiment of a personal navigation device (PND)
`100 comprising an integrated MEMS and magnetic sensor 130. The
`PND 100 can be a mobile (hand-held) navigation device, a smart
`phone, or any similar mobile device configured to aid a user in
`navigation and applications requiring orientation information.”
`
`(Ex. 1017, ¶0011)(Emphasis added)(Ex. 1018, ¶71).
`
`Withanawasam’s smartphone, which functions as a personal navigation
`
`
`
`11
`
`
`
`IPR2018-01257
`U.S. Pat. No. 8,552,978
`
device, is a 3D pointing device according to CyWee’s proposed construction of
`
`that term,1 because it is “a handheld device that detects the motion of said device in
`
`three-dimensions and is capable of translating the detected motions to control an
`
`output on a display.” (Patent Owner Response, p. 19). First, as stated in the quote
`
`above, Withanawasam’s device is handheld. (Ex. 1017, ¶0011, claim 17)(Ex. 1018,
`
`¶¶72). Second, Withanawasam’s device “detects the motion of said device in three-
`
`dimensions and is capable of translating the detected motions to control an output
`
`on a display”. Withanawasam first describes detecting the motion of said device in
`
`three-dimensions. Withanawasam states:
`
`“Orientation information is information relating to the present
`orientation of the PND 100, and can be determined using the
`integrated MEMS and magnetic sensor 130 (also referred to herein
`as the integrated MEMS sensor). The integrated MEMS and
`magnetic sensor 130 provides information to the processor 110
`relating to acceleration, roll, and directional data (that is, relating to
`a compass direction). The PND 100 can use three axes of sensing for
`acceleration and gyroscope data in one single integrated MEMS
`sensor 130.”
`
(Ex. 1017, ¶0012, see also claim 15)(Ex. 1018, ¶73). Withanawasam also notes
`
`
`1 Google disputes CyWee’s proposed construction.
`
`
`
`12
`
`
`
`IPR2018-01257
`U.S. Pat. No. 8,552,978
`
`that smart phones typically have a “magnetic compass that have to work even when
`
`the device is not held level, which requires a micro-electromechanical systems
`
`(MEMS) accelerometer or a gyroscope to be integrated with the magnetic sensors.”
`
`(Ex. 1017, ¶0001)(Emphasis added)(Ex. 1018, ¶73).
`
`Third, the Withanawasam device is capable of translating the detected
`
`motions to control an output on a display. Specifically, Withanawasam’s device
`
`comprises “a display configured to present the positional information to a user.” (Ex.
`
`1017, claim 16). For example, Withanawasam states:
`
`“The PND 100 includes a processor 110 configured to run a
`navigation and orientation routine module 120. A display 140
`presents navigation information to the user, and can comprise a
`liquid crystal display (LCD), a digital display, or the like. Navigation
`information that can be displayed includes positional information,
orientation information, maps, compass directions, a
`predetermined path, or any other information useful in
`navigation.”
`
`(Ex. 1017, ¶0011)(Emphasis added)(Ex. 1018, ¶74).
`
` This indicates that
`
`Withanawasam’s device is capable of translating the detected motions to control an
`
`output on a display, because the motions that are detected by a sensor are translated
`
to a device orientation, which is used to control output on a display (by displaying the
`
`orientation, a compass direction, or a path). (Ex. 1018, ¶74). Dr. LaViola, CyWee’s
`
`
`
`13
`
`
`
`IPR2018-01257
`U.S. Pat. No. 8,552,978
`
`expert, would agree. His declaration cites a “navigation device” as an example of a
`
`“3D pointing device”. (Ex. 2011, ¶16).
`
`The combination also teaches a method for compensating rotations, as
`
described under the limitations below. (Ex. 1018, ¶75).
`
`“[19(b)] generating an orientation output associated with an
`orientation of the 3D pointing device associated with three
`coordinate axes of a global reference frame associated with Earth;”
`
The combination uses Bachmann’s filter to generate an orientation output,
`
`in the form of Bachmann’s orientation quaternion q̂ that represents the device’s
`
`orientation. (Ex. 1004, 7:59-61)(Ex. 1018, ¶76). The orientation quaternion is
`
`associated with an orientation of the 3D pointing device because it is generated
`
`from measurements by an accelerometer, magnetometer, and angular rate sensor
`
`made in the device’s frame of reference. (Ex. 1004, 10:10-14)(Ex. 1018, ¶77). The
`
`orientation quaternion q̂ is associated with three coordinate axes of a global
`
`reference frame associated with Earth because it represents the rotation between
`
`the spatial sensor frame of reference and a “flat Earth” or “earth fixed” reference
`
frame. (Ex. 1004, 8:61-67, 5:50-61)(Ex. 1018, ¶¶77-83). Put differently, q̂ represents
`
`the tracked body’s roll, pitch, and yaw between the sensor frame and Earth frame.
`
(Ex. 1018, ¶¶77-83).
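In standard notation (one common convention; Bachmann’s own sign and frame conventions are set out in the passages cited above), a unit orientation quaternion encodes exactly such a frame-to-frame rotation:

```latex
\[
\hat{q} = \left(\cos\tfrac{\theta}{2},\; \mathbf{u}\sin\tfrac{\theta}{2}\right),
\qquad \lVert \hat{q} \rVert = 1,
\qquad
v_{\text{body}} = \hat{q}^{*} \otimes v_{\text{earth}} \otimes \hat{q},
\]
```

where u is the unit rotation axis, θ is the rotation angle, and conjugation by q̂ maps a vector between the Earth-fixed frame and the sensor (body) frame.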
`
`“[19(c)] generating a first signal set comprising axial accelerations
`associated with movements and rotations of the 3D pointing device
`
`
`
`14
`
`
`
`IPR2018-01257
`U.S. Pat. No. 8,552,978
`
`
`in [the] a spatial reference frame;”
`
`
`
`The combination of Withanawasam and Bachmann would use a three-axis
`
`accelerometer that generates a first signal set (
`
`) representing measured
`
acceleration. (Ex. 1004, 8:12-16)(Ex. 1018, ¶¶84-85). Bachmann also represents the
`
`measured acceleration as a vector or set of axial acceleration signals along different
`
`axes. Bachmann denotes this vector as (h1, h2, h3) (shown in box 31 of Fig. 3) and
`
`derives it by low-pass filtering the measured acceleration to remove fast
`
`accelerations. (Ex. 1004, 8:13-42)(Ex. 1018, ¶86). The measured acceleration
`
`comprises axial accelerations associated with movements and rotations of the 3D
`
`pointing device because the signals comprise forced linear acceleration. (Ex. 1004,
`
`8:12-16)(Ex. 1018, ¶87). The measured acceleration is in the spatial reference
`
`frame because the accelerometer is part of the 3D pointing device. (Ex. 1005, ¶¶21
`
`and 25)(Ex. 1018, ¶88). As a result, moving or rotating the 3D pointing device causes
`
`the accelerometer to move or rotate in the exact same way. (Ex. 1018, ¶88).
`
`“[19(d)] generating a second signal set associated with Earth's
`magnetism;”
`
`The combination of Withanawasam and Bachmann would include a three-axis
`
magnetometer, implemented as three orthogonally mounted single-axis
`
`magnetometers. (Ex. 1018, ¶89). The magnetometers in Bachmann generate a
`
`
`
`15
`
`
`
`IPR2018-01257
`U.S. Pat. No. 8,552,978
`
`second signal set (b1, b2, b3), which is shown in box 32 of Fig. 3. (Ex. 1004, 7:63-
`
`65)(Ex. 1018, ¶¶89-90). The local magnetic field vector is associated with Earth’s
`
`magnetism because it points to magnetic north. (Ex. 1004, 5:11-20)(Ex. 1018, ¶90).
`
`“[19(e)] generating the orientation output based on the first signal
`set, the second signal set and the rotation output or based on the
`first signal set and the second signal set;”
`
`As shown in Fig. 3 and explained in detail by Dr. Sarrafzadeh (Ex. 1018, ¶¶91-
`
`97), Bachmann generates the orientation output (orientation quaternion q̂ ) both
`
`based on the first signal set (measured acceleration)(Ex. 1018, ¶92) and the second
`
`signal set (magnetometer outputs)(Ex. 1018, ¶93). The orientation output is also
`
`based on the rotation output, which is the angular rate information from the angular
`
`rate sensors. (Ex. 1004, 10:10-14)(Ex. 1018, ¶94).
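As a generic sketch only (these are not Bachmann’s actual filter equations; the error function ε and gain α are introduced here purely for illustration), a filter of this type propagates the orientation using the rotation output and corrects the result using the two signal sets:

```latex
\[
\hat{q}_{k+1} \;=\;
\underbrace{\hat{q}_k + \tfrac{\Delta t}{2}\, \hat{q}_k \otimes (0, p, q, r)}_{\text{propagation from the rotation output}}
\;-\;
\alpha\,
\underbrace{\nabla \varepsilon\!\left(\hat{q}_k;\, h_1, h_2, h_3,\, b_1, b_2, b_3\right)}_{\text{correction from the first and second signal sets}},
\]
```

where ε measures the disagreement between the orientation predicted by q̂ and the measured gravity and magnetic-field directions, and α is a filter gain.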
`
`“[19(f)] generating a rotation output associated with a rotation of
`the 3D pointing device associated with three coordinate axes of [a]
`the spatial reference frame associated with the 3D pointing device;”
`
`Bachmann uses angular velocity (rate) detectors 33 to generate a rotation
`
`output. (Ex. 1018, ¶98). This rotation output is a vector with components (p, q, r)
`
`and is associated with three coordinate axes of a spatial reference frame
`
`associated with the 3D pointing device. (Ex. 1018, ¶98). Specifically,
`
`Bachmann’s angular rate sensor is a “three-axis angular rate sensor”. (Ex. 1004,
`
`10:12)(Emphasis added)(Ex. 1018, ¶99). The three-axis angular rate sensor is
`
`
`
`16
`
`
`
`IPR2018-01257
`U.S. Pat. No. 8,552,978
`
`mounted or affixed to the 3D pointing device, so its axes match or are at least fixed
`
with respect to the axes of the 3D pointing device. (Ex. 1018, ¶99). As a result, rotating
`
`the 3D pointing device causes the three-axis angular rate sensor to rotate in the same
`
`way. Bachmann states that the angular rates are “measured in the sensor reference
`
`frame” (Ex. 1004, 10:17-30), which would be the same as or fixed with respect to
`
`the spatial reference frame because the sensor is fixed with respect to the 3D pointing
`
`device. (Ex. 1018, ¶99). Moreover, a person of ordinary skill in the art would
`
`readily recognize that p, q, and r each represent angular velocity around a different
`
`axis within the spatial reference frame. This is standard notation, with p as the roll
`
`rate, q as the pitch rate, and r as the yaw rate. (Ex. 1004, 10:30-32)(Ex. 1018, ¶100).
`
`“[19(g)] and using the orientation output and the rotation output to
`generate a transformed output associated with a fixed display
`reference frame associated with a display device built-in to and
`integrated with the 3D pointing device, wherein the orientation
`output and the rotation output is generated by a nine-axis motion
`sensor module;”
`
`As discussed above under limitations [19e] and [19f], Bachmann discloses
`
`generating an orientation output q̂ and a rotation output (p, q, r). (Ex. 1018, ¶101).
`
`These outputs are generated by a nine-axis motion sensor module with “a three-
`
`axis accelerometer (h1, h2, h3) 31, a three-axis magnetometer (b1, b2, b3) 32, and a
`
`three-axis angular rate sensor (p, q, r) 33.” (Ex. 1004, 10:10-14)(Ex. 1018, ¶101).
`
`This is the same type of nine-axis motion sensor module described in the ’978 patent.
`
`
`
`17
`
`
`
`IPR2018-01257
`U.S. Pat. No. 8,552,978
`
`(Ex. 1001, FIG. 4)(Ex. 1018, ¶101).
`
`It would have been obvious to generate a transformed output associated
`
`with a fixed reference frame associated with a display device. Bachmann
`
`suggests that its orientation output be transformed to the coordinate system of a
`
`display device (e.g., in connection with Fig. 4). (Ex. 1004, 14:20-29)(Ex. 1018,
`
`¶102).
`
`Withanawasam teaches that its smartphone has a built-in display device 140.
`
`(Ex. 1017, Fig. 1 RN 140, ¶0011)(Ex. 1018, ¶157). It further would have been well-
`
`understood and obvious in the relevant timeframe that the display 140 of
`
`Withanawasam, in a smartphone embodiment, would have been built-in to and
`
`integrated with the smartphone. (Ex. 1018, ¶158). This was typical for
`
`smartphones in the relevant timeframe, and would have been obvious to do, to
`
`protect the electrical circuits of the display and avoid forcing the user to carry the
`
`display as a separate element. (Ex. 1018, ¶158). A display built in to and integrated
`
`with the device would have inherently and obviously been “associated with” a
`
`display reference frame, which is simply a coordinate system that moves with the
`
`display. (Ex. 1018, ¶158).
`
`It would have been obvious to use the orientation output and the rotation
`
`output for this transformation. (Ex. 1018, ¶103). In the combination here, the
`
`orientation output q̂ represents the orientation of Withanawasam’s smartphone. (Ex.
`18
`
`
`
`
`
`IPR2018-01257
`U.S. Pat. No. 8,552,978
`
`