`U.S. Patent No. 10,664,143 B2
`
`
`
`
`UNITED STATES PATENT AND TRADEMARK OFFICE
`
`
`
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`
`
`
`
`META PLATFORMS, INC.,
`Petitioner
`
`v.
`
`IMMERSION CORPORATION,
`Patent Owner
`
`
`
`Case IPR2023-00945
`U.S. Patent No. 10,664,143 B2
`Issue Date: May 26, 2020
`
`
`PETITION FOR INTER PARTES REVIEW
`OF U.S. PATENT NO. 10,664,143 B2
`
`
`
`
`
`
TABLE OF CONTENTS

                                                                           Page

I.    MANDATORY NOTICES UNDER §42.8(A)(1) .......................................... 1
      A.   Real Party-In-Interest Under §42.8(b)(1) ................................ 1
      B.   Related Matters Under §42.8(b)(2) ....................................... 1
      C.   Lead and Back-Up Counsel under §42.8(b)(3) .............................. 1
      D.   Service Information ..................................................... 2
      E.   Power of Attorney ....................................................... 2
II.   FEE PAYMENT .................................................................. 3
III.  REQUIREMENTS UNDER §§ 42.104 AND 42.108 AND CONSIDERATIONS UNDER
      §§ 314(A) AND 325(D) ......................................................... 3
      A.   Standing ................................................................ 3
      B.   Identification of Challenge ............................................. 3
      C.   §§314(a) and 325(d) ..................................................... 3
IV.   OVERVIEW OF THE ’143 PATENT .................................................. 4
      A.   Level of Ordinary Skill in the Art ...................................... 4
      B.   Specification Overview .................................................. 5
V.    CLAIM CONSTRUCTION ........................................................... 6
VI.   THE CHALLENGED CLAIMS ARE UNPATENTABLE. ...................................... 7
      A.   Overview of the Ground of Unpatentability ............................... 7
      B.   Ground 1: Claims 1-3, 7, 8-10, 14, 15-17, and 20 Are Obvious
           Over Nogami. ............................................................ 8
           1.   Independent Claim 1 ................................................ 8
                (a)  “a position sensor;” (Limitation 1[a]) ....................... 11
                (b)  “a processor; and” (Limitation 1[b]) ......................... 13
                (c)  “a non-transitory computer-readable medium comprising
                     program code that is executable by the processor to
                     cause the processor to:” (Limitation 1[c]) ................... 13
                (d)  “output first interactive content to a display, the
                     first interactive content comprising a virtual
                     environment;” (Limitation 1[d]) .............................. 16
                (e)  “receive one or more sensor signals from the position
                     sensor;” (Limitation 1[e]) ................................... 24
                (f)  “determine a position of a peripheral in real space
                     based on the one or more sensor signals, the peripheral
                     configured to be worn on a user’s head;” (Limitation
                     1[f]) ........................................................ 25
                (g)  “output second interactive content to the display based
                     on the position of the peripheral in real space, the
                     second interactive content being different from the
                     first interactive content;” (Limitation 1[g]) ................ 29
                (h)  “determine a haptic signal based on the position of the
                     peripheral in real space and the second interactive
                     content; and” (Limitation 1[h]) .............................. 34
                (i)  “transmit the haptic signal to a haptic output device,
                     the haptic output device being configured to receive
                     the haptic signal and output haptic feedback.”
                     (Limitation 1[i]) ............................................ 38
           2.   Claim 2: “The system of claim 1, wherein the position
                sensor is positioned on the peripheral.” ......................... 41
           3.   Claim 3: “The system of claim 1, wherein the peripheral is
                a wearable device.” ............................................... 42
           4.   Claim 7: “The system of claim 1, wherein the peripheral is
                a first peripheral, and wherein the haptic output device is
                positioned on a second peripheral that is separate from the
                first peripheral.” ................................................ 42
           5.   Independent Claim 8 ............................................... 43
           6.   Claims 9, 10 and 14 ............................................... 44
           7.   Independent Claim 15 .............................................. 44
           8.   Claims 16, 17 and 20 .............................................. 45
VII.  CONCLUSION .................................................................. 45
CERTIFICATE OF SERVICE ............................................................ 47
`
`
`
`
`
`
List of Exhibits

Exhibit No.   Description of Document

1001          U.S. Patent No. 10,664,143 B2 to David M. Birnbaum, Danny A.
              Grant, and Robert W. Heubel (filed February 2, 2019 and issued
              May 26, 2020) (“’143” or “’143 patent”)

1002          Declaration of Jeremy Cooperstock, Ph.D. (“Cooperstock”)

1003          U.S. Patent Application Publication No. 2009/0066725 A1 to
              Atsushi Nogami and Naoki Nishimura (filed Sep. 9, 2008 and
              published March 12, 2009) (“Nogami”)

1004          Excerpts from Microsoft Computer Dictionary (5th ed. 2002)

1005          Excerpts from Chambers Dictionary of Science and Technology
              (2007)

1006          Joint Claim Construction Statement filed in Immersion Corp. v.
              Meta Platforms, Inc., No. 6:22-cv-00541-ADA (filed January 21,
              2023)

1007          Prosecution history of the ’143 patent

1008          Affidavit of Service, ECF No. 6, dated June 2, 2022, filed in
              Immersion Corp. v. Meta Platforms, Inc., No. 6:22-cv-00541-ADA
              (W.D. Tex.)
`
`
`
`
`
`
`I. MANDATORY NOTICES UNDER §42.8(A)(1)
`A. Real Party-In-Interest Under §42.8(b)(1)
`Meta Platforms, Inc. is the real party-in-interest to this IPR petition.
`
`B. Related Matters Under §42.8(b)(2)
`The ’143 patent is the subject of the following pending litigation involving
`
`Petitioner: Immersion Corp. v. Meta Platforms, Inc., No. 6:22-cv-00541-ADA
`
`(W.D. Tex.). Petitioner was served with the Complaint in that action on May 27,
`
`2022. (EX1008, p.001.)
`
`C. Lead and Back-Up Counsel under §42.8(b)(3)
`Petitioner provides the following designation of counsel.
`
LEAD COUNSEL

Heidi L. Keefe (Reg. No. 40,673)
hkeefe@cooley.com
COOLEY LLP
ATTN: Patent Group
1299 Pennsylvania Ave. NW, Suite 700
Washington, DC 20004
Tel: (650) 843-5001
Fax: (650) 849-7400

BACK-UP COUNSEL

Phillip E. Morton (Reg. No. 57,835)
pmorton@cooley.com
COOLEY LLP
ATTN: Patent Group
1299 Pennsylvania Ave. NW, Suite 700
Washington D.C. 20004
Tel: (202) 728-7055
Fax: (202) 842-7899

Andrew C. Mace (Reg. No. 63,342)
amace@cooley.com
COOLEY LLP
ATTN: Patent Group
1299 Pennsylvania Ave. NW, Suite 700
Washington D.C. 20004
Tel: (650) 843-5808
Fax: (650) 849-7400

Mark R. Weinstein (Admission pro hac vice to be requested)
mweinstein@cooley.com
COOLEY LLP
ATTN: Patent Group
1299 Pennsylvania Avenue NW, Suite 700
Washington D.C. 20004
Tel: (650) 843-5007
Fax: (650) 849-7400

Lowell D. Mead (Admission pro hac vice to be requested)
lmead@cooley.com
COOLEY LLP
ATTN: Patent Group
1299 Pennsylvania Avenue NW, Suite 700
Washington D.C. 20004
Tel: (650) 843-5007
Fax: (650) 849-7400
`
D. Service Information
`This Petition is being served by Federal Express to the attorneys of record for
`
`the ’143 patent, 34300 - Immersion / Kilpatrick Townsend and Stockton, Mailstop:
`
`IP Docketing – 22, 1100 Peachtree Street, Suite 2800, Atlanta, GA 30309. This
`
`Petition is also being served on litigation counsel identified in the Certificate of
`
`Service. Petitioner consents to electronic service at the addresses provided above
`
`for lead and back-up counsel.
`
E. Power of Attorney
`Filed concurrently per 37 C.F.R. § 42.10(b).
`-2-
`
`
`
`
`
`
`
`
`Petition for Inter Partes Review of
`U.S. Patent No. 10,664,143 B2
`
II. FEE PAYMENT
`Petitioner requests review of twelve (12) claims, with a $41,500 payment.
`
`III. REQUIREMENTS UNDER §§ 42.104 AND 42.108 AND CONSIDERATIONS
`UNDER §§ 314(A) AND 325(D)
A. Standing
`Petitioner certifies that the ’143 patent is available for IPR and that Petitioner
`
`is not barred or otherwise estopped.
`
B. Identification of Challenge
`Petitioner requests institution of IPR based on the following grounds:
`
Ground     Claims                              Basis under §103

1          1-3, 7, 8-10, 14, 15-17, 20         Nogami
`
`Submitted herewith is a Declaration of Jeremy Cooperstock, Ph.D. (EX1002)
`
`(“Cooperstock”), a qualified technical expert. (Cooperstock, ¶¶1-7, Ex. A.)
`
C. §§314(a) and 325(d)
`Petitioner respectfully submits that no basis exists under either §314(a) or
`
`§325(d) for discretionary denial of this Petition. Petitioner reserves its right to
`
`respond to any discretionary denial arguments later presented by Patent Owner.
`
`§ 314(a): Petitioner has not previously filed an IPR petition against the ’143
`
`patent, and is unaware of any other prior IPR petition filed against the ’143 patent
`
`by any party. In the pending litigation, the district court has continued the claim
`
`-3-
`
`
`
`
`
`
`Petition for Inter Partes Review of
`U.S. Patent No. 10,664,143 B2
`
`construction hearing several times, and as of the filing of this Petition, no claim
`
`construction or other substantive rulings have issued. A trial date has been set for
`
`February 20, 2024, but Petitioner has filed a motion to transfer the case to the
`
`Northern District of California (which remains pending as of the filing of this
`
`Petition). In the event of IPR institution, moreover, Petitioner intends to file a
`
`motion to stay the litigation pending the outcome of the IPR. In order to address any
`
`concerns regarding duplication of efforts, Petitioner hereby provides a Sotera
`
`stipulation that, in the event of IPR institution, Petitioner will not pursue in the
`
`district court any grounds of invalidity against the challenged claims that were raised
`
`or reasonably could have been raised in IPR.
`
`§ 325(d): The Board assesses §325(d) issues under the two-part Advanced
`
`Bionics framework: (1) whether the same or substantially the same art was
`
previously presented to the Office and, if so, (2) whether Petitioner has demonstrated

that the Examiner erred in a manner material to the patentability of the challenged
`
`claims. In this case, none of the prior art cited in the grounds identified in Part III.B
`
`above was previously presented during prosecution of the ’143 patent.
`
`IV. OVERVIEW OF THE ’143 PATENT
`A. Level of Ordinary Skill in the Art
`A person of ordinary skill in the art would have possessed a bachelor’s degree
`
`in electrical engineering or computer science (or an equivalent degree), and two
`
`-4-
`
`
`
`
`
`
`Petition for Inter Partes Review of
`U.S. Patent No. 10,664,143 B2
`
`years of practical or industry experience in the field of human computer interaction,
`
`including implementation of computer-based systems and software for providing
`
`haptic feedback effects to a user and determining the position and/or orientation of
`
`a device used by a user (such as a device worn on the human body). (Cooperstock,
`
`¶¶8-9.) A person could also have qualified as an ordinarily skilled artisan with more
`
`formal education and less practical or industry experience, or vice versa. (Id.)
`
B. Specification Overview
`The ’143 patent states that it “relates to haptically enhanced interactivity with
`
`interactive content being conveyed to a user by a content output appliance, wherein
`
`the content output appliance is under control of one or more processors that control
`
`the output of the interactive content based on one or more position parameters of a
`
`peripheral being manipulated by the user.” (’143, 1:25-31.)
`
`The “Background” of the ’143 patent acknowledges the existence of systems
`
`in the prior art in which “[u]sers may interact with virtual objects within a virtual
`
`environment in a number of manners,” in which users can also “manipulate a
`
`physical object in the real world in order to interact with a virtual object.” (’143,
`
`1:35-38.) The ’143 patent states that such interactions “may involve augmented
`
`reality technology” and that “visual and/or audio feedback may provide a sense of
`
`interaction with virtual objects to users.” (’143, 1:38-45.)
`
`
`
`
`
`-5-
`
`
`
`
`
`Petition for Inter Partes Review of
`U.S. Patent No. 10,664,143 B2
`
`
`The patent goes on to describe “a system configured to present interactive
`
`content to a user that is manipulating a peripheral.” (’143, 1:54-56.) The disclosed
`
`system may include a “haptics module” that may be “configured to determine haptic
`
`feedback to be provided to the user,” where the haptic feedback may be determined
`
`based on an “identification of the peripheral” and “one or more position parameters”
`
`that are “related to the position of the peripheral.” (’143, 1:59-2:5.) The ’143 patent
`
`explains that “[h]aptics may include tactile and/or kinesthetic (force) feedback
`
`technology that takes advantage of a user’s sense of touch by applying forces,
`
`vibrations, motions, and/or other touch feedback to the user.” (’143, 3:9-12.)
`
`V. CLAIM CONSTRUCTION
`No claim construction rulings have issued in the pending litigation. Claim
`
`constructions have been proposed by Petitioner and/or Patent Owner for the term
`
`“real space” as recited in the challenged claims. (EX1006, p.002.) Petitioner does
`
`not believe any term requires express construction at this time because, as shown
`
`below, the claims are obvious under the constructions proposed by both parties in
`
`the pending litigation, or any other reasonable construction. Nevertheless, Petitioner
`
reserves its right to respond to any new claim construction arguments that
`
`Patent Owner may make in these proceedings.
`
`
`
`
`
`-6-
`
`
`
`
`
`Petition for Inter Partes Review of
`U.S. Patent No. 10,664,143 B2
`
`VI. THE CHALLENGED CLAIMS ARE UNPATENTABLE.
`A. Overview of the Ground of Unpatentability
`The challenged claims attempt to lay claim to well-known interactive content
`
`techniques for providing haptic feedback to a user based on the position of a
`
`peripheral worn on the user’s head. All of the challenged claims are disclosed and
`
`rendered obvious by Nogami (EX1003). As discussed in detail below, Nogami
`
`describes a system remarkably similar to the one later described in the ’143 patent,
`
`in which a user wearing a head-mounted display can interact with virtual objects in
`
`a virtual space, and receive haptic feedback based on the position of the head-
`
`mounted display and the virtual objects with which the user is interacting.
`
`Nogami qualifies as prior art to the ’143 patent under at least 35 U.S.C. §
`
`102(b) (pre-AIA) because it is a patent application published on March 12, 2009,
`
`which is more than one year before the application filing date for the ’143 patent.
`
(EX1001, EX1003.) Nogami is an analogous reference in the same field as the ’143

patent: providing interactive content, including a virtual space with which the user

can interact and in response to which haptic stimulation can be provided.

(Cooperstock, ¶39.) Nogami would also have been
`
`reasonably pertinent to a number of problems facing the ’143 patent inventors. (Id.)
`
`Nogami’s teachings involve known image processing and computer graphics
`
`techniques that could have been implemented using conventional programming
`
`-7-
`
`
`
`
`
`
`Petition for Inter Partes Review of
`U.S. Patent No. 10,664,143 B2
`
`techniques, with at least a reasonable expectation of success. (Id.)
`
`Nogami also describes at least 10 “exemplary embodiments” with respect to
`
`its system. (Nogami, e.g., ¶¶0046, 0202, 0214, 0233, 0240, 0252, 0262, 0268, 0275,
`
`0278, 0284.) But Nogami’s separately-numbered “embodiments” merely describe
`
`additional features that could have been incorporated into Nogami’s system rather
`
`than different or separate systems altogether. (Cooperstock, ¶40.) Nogami confirms
`
`as much by stating, after discussing the “Tenth Exemplary embodiment,” that “the
`
`above-described exemplary embodiments can be used in combination with each
`
`other where appropriate.” (Nogami, ¶0283.) It therefore would have been obvious
`
`to include and/or combine the additional features described in Nogami’s variously
`
`numbered embodiments. (Cooperstock, ¶40.) See also Boston Scientific Scimed,
`
`Inc. v. Cordis Corp., 554 F.3d 982, 991 (Fed. Cir. 2009) (“Combining two
`
`embodiments disclosed adjacent to each other in a prior art patent does not require a
`
`leap of inventiveness.”). As explained below, each limitation of the challenged
`
`claims is disclosed by and rendered obvious over Nogami.
`
B. Ground 1: Claims 1-3, 7, 8-10, 14, 15-17, and 20 Are Obvious Over Nogami.

1. Independent Claim 1
`The preamble of claim 1 simply recites, “[a] system comprising….” Nogami
`
`discloses a “system” that comprises various components shown in Figures 1 and 2
`
`
`
`
`
`-8-
`
`
`
`
`
`Petition for Inter Partes Review of
`U.S. Patent No. 10,664,143 B2
`
`(reproduced below). Figure 2 of Nogami provides a block diagram of the system
`
`and these components (and respective subcomponents):
`
`
`(Nogami, Fig. 2 (highlighting added).) Figure 2 above shows stimulation generation
`
`
`
`units 10 and 11 mounted on the body of the user and head-mounted display (HMD)
`
`100 mounted on the user’s head. (Nogami, ¶0049.) Figure 2 also shows position
`
`and orientation sensor 400, and calculation processing apparatus 200. Figure 1 of
`
`Nogami shows an exemplary embodiment of the system including these
`
`components, when in actual use by a user:
`
`
`
`
`
`-9-
`
`
`
`
`
`Petition for Inter Partes Review of
`U.S. Patent No. 10,664,143 B2
`
`
`
`(Nogami, Fig. 1 (highlighting added); see also id., ¶¶0046-0059 (describing
`
`
`
`components of Figure 1).) Figure 1 shows stimulation generation units 10, 11, and
`
`12 worn on the user’s hand, elbow and knee, respectively, and HMD 100 worn on
`
`the user’s head. (Nogami, ¶¶0049, 0051.)
`
`Figure 1 also depicts position and orientation sensor 400, in this case in the
`
`form of a camera physically apart from HMD 100. (See, e.g., Nogami, ¶0074.) In
`
`this example, sensor 400 acquires position and orientation information for HMD 100
`
`by identifying a physical marker 40 attached to HMD 100. (Id.) But this is just one
`
`embodiment. As discussed in more detail below, Nogami discloses an alternative
`
`
`
`
`
`-10-
`
`
`
`
`
`Petition for Inter Partes Review of
`U.S. Patent No. 10,664,143 B2
`
`embodiment in which the position and orientation sensor takes the form of a
`
`magnetic sensor included as part of HMD 100. (Nogami, ¶0077.) Under this
`
`embodiment, “a transmitter for generating a magnetic field is provided in the real
`
`space, and the HMD 100… ha[s] a receiver for detecting a change in the magnetic
`
`field occurring accordingly as the HMD 100… change[s] [its] position and
`
`orientation within the magnetic field.” (Nogami, ¶0077.) As shown below,
`
`Petitioner has relied upon this magnetic sensor embodiment for the claimed
`
`“position sensor” of limitation 1[a] below.
`
Finally, Figures 1 and 2 show calculation processing apparatus 200. As shown

in Figure 2 above, it includes several components, including CPU 201, RAM 202,

ROM 203, and data recording unit 205, among others. (Nogami, ¶¶0083-0086.)
`
`“The calculation processing apparatus 200 includes a general-purpose personal
`
`computer (PC), for example.” (Nogami, ¶0082.) As explained below, Nogami’s
`
`system, which comprises the components described above, discloses and renders
`
`obvious each claim limitation.
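For illustration only, the arrangement of components described above can be summarized in a brief sketch. The class and field names below are hypothetical placeholders chosen for readability; they are not drawn from Nogami, the ’143 patent, or any exhibit of record.

```python
# Illustrative sketch only: a minimal data model of the components discussed above --
# an HMD-style peripheral, body-worn stimulation (haptic output) units, and a
# general-purpose processing apparatus. All names are hypothetical.
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class Pose:
    position: Tuple[float, float, float]      # x, y, z in real space
    orientation: Tuple[float, float, float]   # e.g., roll, pitch, yaw


@dataclass
class HeadMountedDisplay:
    pose: Pose                                 # tracked position/orientation of the HMD
    has_onboard_position_sensor: bool = True   # e.g., a magnetic receiver on the HMD


@dataclass
class StimulationUnit:
    body_location: str                         # e.g., "hand", "elbow", "knee"
    pose: Pose


@dataclass
class ProcessingApparatus:
    # Stands in for a general-purpose PC with a CPU, RAM, and mass storage.
    hmd: HeadMountedDisplay
    stimulation_units: List[StimulationUnit] = field(default_factory=list)


# Example composition mirroring the figures described above.
apparatus = ProcessingApparatus(
    hmd=HeadMountedDisplay(pose=Pose((0.0, 1.7, 0.0), (0.0, 0.0, 0.0))),
    stimulation_units=[StimulationUnit("hand", Pose((0.3, 1.1, 0.2), (0.0, 0.0, 0.0)))],
)
```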
`
(a) “a position sensor;” (Limitation 1[a])
`As just explained, Nogami discloses an embodiment for position and
`
`orientation sensor 400 in which the sensor takes the form of a magnetic sensor
`
included as part of HMD 100. (Nogami, ¶¶0077-0078.) As Nogami explains:
`
`
`
`
`
`-11-
`
`
`
`
`
`Petition for Inter Partes Review of
`U.S. Patent No. 10,664,143 B2
`
`
`For example, a method that uses a magnetic sensor can be used for
`acquiring the position and orientation information about the HMD 100
`and the stimulation generation units 10, 11, and 12. In this method, a
`transmitter for generating a magnetic field is provided in the real space,
`and the HMD 100 and the stimulation generation units 10, 11, and 12
`each have a receiver for detecting a change in the magnetic field
`occurring accordingly as the HMD 100 or the stimulation generation
`units 10, 11, and 12 change their position and orientation within the
`magnetic field.
`In addition, the method acquires the position and orientation
`information in a sensor coordinate system of each of the receivers based
`on a signal indicating a result of the detection by the receiver. Here, the
`sensor coordinate system has an origin point that denotes a position of
`the transmitter and an x-axis, a y-axis, and a z-axis orthogonal to one
`another at the origin point.
`
`(Nogami, ¶¶0077-0078.)1
`
`Nogami therefore discloses “a position sensor” in the form of a magnetic
`
`sensor that is part of HMD 100, i.e., a receiver that senses a magnetic field generated
`
`
1 Except as otherwise noted, all underlining and other emphasis in quotations has

been added by Petitioner.
`
`
`
`
`-12-
`
`
`
`
`
`Petition for Inter Partes Review of
`U.S. Patent No. 10,664,143 B2
`
`by a transmitter.2 This magnetic sensor qualifies as a “position sensor” because it
`
`detects a change in the magnetic field as HMD 100 changes its position and
`
`orientation within the magnetic field. (Nogami, ¶0077 (noting that magnetic sensor
`
`“can be used for acquiring the position and orientation information about the HMD
`
`100….”).) This is consistent with the ’143 patent, which similarly discloses a
`
`magnetic sensor (which the ’143 patent calls a “magnetometer”) as an example of a
`
`position sensor. (See ’143, 5:35-38; Cooperstock, ¶54.)
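For illustration only, the position-sensing data flow described above, in which a receiver on the HMD reports magnetic-field measurements that are resolved into a position and orientation in a coordinate system anchored at the transmitter, may be sketched as follows. All names are hypothetical, and the field-model computation that an actual magnetic tracker performs is abstracted behind a placeholder.

```python
# Illustrative sketch only (hypothetical names; the field-model math used by real
# magnetic trackers is abstracted behind resolve_pose()).
from dataclasses import dataclass
from typing import Tuple


@dataclass
class FieldSample:
    """One reading from the receiver mounted on the HMD."""
    bx: float
    by: float
    bz: float


def resolve_pose(sample: FieldSample) -> Tuple[Tuple[float, float, float],
                                               Tuple[float, float, float]]:
    """Map a field reading to (position, orientation) in a sensor coordinate
    system whose origin is the transmitter, as described above.

    A real tracker inverts a calibrated field model; this placeholder simply
    scales the reading so the sketch stays self-contained and runnable.
    """
    position = (sample.bx * 0.01, sample.by * 0.01, sample.bz * 0.01)
    orientation = (0.0, 0.0, 0.0)  # orientation estimation omitted in this sketch
    return position, orientation


# Usage: each new receiver reading yields an updated HMD pose.
reading = FieldSample(bx=12.0, by=-3.5, bz=40.2)
hmd_position, hmd_orientation = resolve_pose(reading)
```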
`
(b) “a processor; and” (Limitation 1[b])
(c) “a non-transitory computer-readable medium comprising program code
    that is executable by the processor to cause the processor to:”
    (Limitation 1[c])
Because these two limitations are closely related, Petitioner will address them
`
`together. The claimed “processor” and “non-transitory computer-readable
`
`
`2 As noted above for the preamble, Nogami discloses an alternative embodiment in
`
`which position and orientation sensor 400 includes one or more cameras that are
`
`physically apart from HMD 100. (Nogami, ¶0044, Fig. 1.) Petitioner has not relied
`
`upon that embodiment as the claimed “position sensor” for this Petition because of
`
`challenged dependent claim 2, discussed below, which recites that the position
`
`sensor must be “positioned on the peripheral,” which has been mapped to HMD
`
`100 of Nogami as explained for limitation 1[f] below.
`
`
`
`
`-13-
`
`
`
`
`
`Petition for Inter Partes Review of
`U.S. Patent No. 10,664,143 B2
`
`medium” are disclosed by calculation processing apparatus 200 of Nogami, as
`
`shown in Figure 2:
`
`
`(Nogami, Fig. 2 (excerpt; highlighting added); see also id., ¶¶0088-0090.)
`
`
`
`Figure 2 confirms that Nogami discloses “a processor,” i.e., central
`
`processing unit (CPU) 201, and “a non-transitory computer-readable medium,”
`
which can correspond to either data recording unit 205 or random access memory
`
`(RAM) 202, or both. (Cooperstock, ¶56.) Nogami explains that data recording unit
`
`205 is “a mass storage device, such as a hard disk drive (HDD).” (Nogami, ¶0088.)
`
`CPU 201, as discussed below, uses the program and data loaded on data
`
`recording unit 205 and/or RAM 202 in order to perform the processing tasks
`
`described in Nogami. Nogami therefore discloses “program code that is
`
`
`
`
`
`-14-
`
`
`
`
`
`Petition for Inter Partes Review of
`U.S. Patent No. 10,664,143 B2
`
`executable by the processor [i.e., CPU 201] to cause the processor to” perform
`
`various operations, including the steps recited in the limitations described below.
`
`For example, Nogami explains that its methods can be “achieved by providing
`
`a system or a device with a storage medium (or a recording medium) which stores
`
`program code of software implementing the functions of the embodiments and by
`
`reading and executing the program code stored in the storage medium with a
`
`computer of the system or the device (a CPU or an MPU).” (Nogami, ¶0284; see
`
`also id., ¶¶0285-0287.) In one embodiment, data recording unit 205 stores “an
`
`operating system (OS) and a program and data” used by the CPU 201, which
`
`includes “a program for executing each processing according to a flow of processing
`
`to be executed by the CPU 201.” (Nogami, ¶¶0088-0089.)
`
`Nogami further explains that “[t]he program and the data stored in the data
`
`recording unit 205 are loaded therefrom to the RAM 202 under the control of the
`
`CPU 201.” (Nogami, ¶0090.) “The CPU 201 uses the program and data loaded on
`
`the RAM 202 to perform each [sic] processing.” (Nogami, ¶0091; see also id., ¶0083
`
`(“[E]ach of the following processing operations performed by the calculation
`
`processing apparatus 200 is performed with the CPU 201 as the main unit of
`
`performing the processing.”).) It would have been obvious that Nogami’s
`
`“processor” (CPU 201) executes “program code” to cause the processor to perform
`
`
`
`
`
`-15-
`
`
`
`
`
`Petition for Inter Partes Review of
`U.S. Patent No. 10,664,143 B2
`
`each of the functions in Nogami that correspond to the steps of claim 1 addressed
`
`below. (Cooperstock, ¶¶59, 68, 70, 72, 74, 84, 88.)
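For illustration only, the following sketch shows the kind of control flow that program code executed by such a processor might implement, tracking the steps recited in limitations 1[d]-1[i] discussed below. The class and function names are hypothetical placeholders, not code from Nogami or the ’143 patent.

```python
# Illustrative sketch only: schematic classes and a single pass through the claimed
# steps, with made-up names and toy values so the example is runnable.
from dataclasses import dataclass
from typing import Optional, Tuple

Position = Tuple[float, float, float]


@dataclass
class Scene:
    name: str


class Display:
    def show(self, scene: Scene) -> None:
        print(f"display: {scene.name}")


class PositionSensor:
    def read(self) -> Position:
        # A real sensor would return measured signals; fixed values keep the sketch runnable.
        return (0.1, 0.2, -0.1)


class HapticDevice:
    def send(self, signal: float) -> None:
        print(f"haptic output: {signal:.2f}")


def determine_haptic(position: Position, scene: Scene) -> Optional[float]:
    # Toy rule: emit a stronger effect the closer the peripheral is to the scene origin.
    distance = sum(c * c for c in position) ** 0.5
    return max(0.0, 1.0 - distance) or None


def run_once(display: Display, sensor: PositionSensor, haptic: HapticDevice) -> None:
    display.show(Scene("first interactive content: virtual environment"))   # limitation 1[d]
    position = sensor.read()                                                # 1[e], 1[f]
    second = Scene(f"second interactive content for position {position}")   # 1[g]
    display.show(second)
    signal = determine_haptic(position, second)                             # 1[h]
    if signal is not None:
        haptic.send(signal)                                                 # 1[i]


run_once(Display(), PositionSensor(), HapticDevice())
```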
`
(d) “output first interactive content to a display, the first interactive
    content comprising a virtual environment;” (Limitation 1[d])
`Limitations 1[d] and 1[g] recite “first interactive content” and “second
`
`interactive content,” respectively. The ’143 patent does not appear to expressly
`
`define “interactive content” but identifies various examples including (among
`
`others) “a virtual object” that may be output visually and/or haptically to a user.
`
`(’143, 3:2-6 (“Interactive content may include a representation of the real world, a
`
`representation of a user, a representation of a real-world object, a virtual object, real
`
`and/or artificial sounds, other content, and/or combinations thereof.”), 4:51-52
`
`(“Interactive content may be outputted visually, audibly and/or haptically.”).) As
`
`explained below, Nogami similarly discloses the claimed “interactive content.”
`
`With respect to limitation 1[d], Nogami discloses “first interactive content”
`
`in the form of a collection of one or more “background virtual objects” combined
`
`with images of a user’s real-world surroundings, which are output for display to the
`
`user through HMD 100. (Cooperstock, ¶¶61-64.)
`
`By way of background, the system in Nogami includes two types of “virtual
`
`objects”: (1) “region virtual objects” and (2) “background virtual objects.”
`
`
`
`
`
`-16-
`
`
`
`
`
`Petition for Inter Partes Review of
`U.S. Patent No. 10,664,143 B2
`
(Nogami, e.g., ¶¶0111-0114.) Nogami describes a “region virtual object” as a virtual
`
`representation of a stimulation generation unit – for example, a user’s glove-shaped
`
`stimulation generation unit could have a corresponding glove-shaped “region virtual
`
`object.” (Nogami, e.g., ¶0268 (“[T]he region virtual object has a shape similar to
`
`that of the stimulation generation unit in the real space. However, the shape is not
`
`limited to this.”); see also id., ¶¶0269-0270, 0141 (“[R]egion virtual objects 311 and
`
`312 is a virtual object generated by imitating a shape of the stimulation generation
`
`units 10 and 11 corresponding thereto.”).) In one embodiment, as discussed below,
`
`a region virtual object can be displayed to the user as a virtual work tool such as a
`
`screwdriver or wrench. (Nogami, ¶¶0270, 0272-0273.) Petitioner has mapped the
`
region virtual object in Nogami to the “second interactive content” of limitation 1[g],
`
`and as such, Petitioner will discuss it separately below.
`
`A “background virtual object” in Nogami, on the other hand, corresponds to
`
`a virtual representation of another object with which the user can interact in virtual
`
`space. (Nogami, e.g., ¶0114.) As explained below, background virtual objects in
`
`Nogami, combined with images of a user’s surroundings, provide an example of
`
`“first interactive content” comprising at least “a virtual environment.”
`
`Figure 1 of Nogami illustrates an example of a virtual space including a
`
`background virtual object 300 in the form of “a virtually expressed imitation image
`
`
`
`
`
`-17-
`
`
`
`
`
`Petition for Inter Partes Review of
`U.S. Patent No. 10,664,143 B2
`
`of an automobile in an actual size” (Nogami, ¶50), highlighted in yellow below:
`
`
`(Nogami, Fig. 1 (highlighting added).) Figure 6 of Nogami provides another
`
`
`
`example of background virtual objects, shapes 302 and 303 (highlighted below):
`
`
`
`
`
`-18-
`
`
`
`
`
`Petition for Inter Partes Review of
`U.S. Patent No. 10,664,143 B2
`
`
`
`
`(Nogami, Fig. 6 (highlighting added); see also id., ¶¶0140 (“[B]ackground virtual
`
`objects 302 and 303 are disposed in the virtual space.”), 0137 (“FIG. 6 illustrates an
`
`example of a virtual space when the state of the real space is in the state illustrated
`
`[in] FIG. 5. Accordingly, the state that is a combination of the real space (FIG. 5)
`
`and the virtual space (FIG. 6) is presented to the user 1 who wears the HMD 100.”).)
`
`Nogami makes clear that a background virtual object provides “interactive
`
`content” because “the user 1 can experience a virtual space including a virtual object
`
`300 with both a visual sense and a tactile sense.” (Nogami, ¶0049.) The user can
`
`interact with a background virtual object in a number of ways. (Cooperstock, ¶64.)
`
`
`
`
`
`-19-
`
`
`
`
`
`Petition for Inter Partes Review of
`U.S. Patent No. 10,664,143 B2
`
`
`For example, a user can visually interact with a background virtual object by
`
`changing their position relative to the object, thereby changing how the object is
`
`displayed. (Nogami, e.g., ¶¶0005 (“In the case of using an HMD in such a
`
`conventional system, a user can feel a greater sense of immersion into a virtual space
`
`because the virtual space is displayed according to a viewpoint of the user and a
`
`direction of a line of sight (visual axis) of the user.”), 0027 (“…generate an image
`
`in a virtual space including a virtual object based on position and orientation
`
`information about a viewpoint of a user…”), 0140-0147 & Fig. 6 (describing how
`
`virtual objects including background virtual objects are generated for display based
`
`in part on current position and orientation information).) Additionally, a user can
`
`also interact with a background virtual object by touch. For example, a stimulation
`
`generation unit (via its corresponding region virtual object) can make contact with
`
`the background virtual object. (Nogami, e.g., ¶0097 (“[T]he calculation processing
`
`apparatus 200 detects whether the stimulation generation unit 10, 11, or 12 contacts
`
`a virtual object.”); see also id., ¶¶0098-0103, Fig. 4 (e.g., steps S4004 – S4007),
`
`0117-0123 (describing steps S4004 – S4007).) This can in turn result in a haptic
`
`stimulation being generated for the user. (Nogami, ¶¶0122-0123, Fig. 4 (Step
`
`S4006, 4007).) Both of these examples confirm that a background virtual object is
`
`an example of “interactive content,” as claimed.
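For illustration only, the contact-driven haptic response described above, in which a region virtual object touching a background virtual object results in a stimulation being generated, may be sketched as follows. The sphere-based overlap test and all names are simplified, hypothetical placeholders, not Nogami’s actual algorithm.

```python
# Illustrative sketch only: a simple sphere-overlap test standing in for the contact
# detection described above; names and geometry are hypothetical.
from dataclasses import dataclass
from math import dist
from typing import Optional, Tuple

Point = Tuple[float, float, float]


@dataclass
class VirtualObject:
    name: str
    center: Point
    radius: float


def in_contact(a: VirtualObject, b: VirtualObject) -> bool:
    return dist(a.center, b.center) <= a.radius + b.radius


def stimulation_command(region_obj: VirtualObject,
                        background_obj: VirtualObject) -> Optional[Tuple[str, float]]:
    """Return a (unit, intensity) command if the objects touch, else None."""
    if in_contact(region_obj, background_obj):
        return (region_obj.name, 1.0)   # drive the unit corresponding to this region object
    return None


# Usage: a glove-shaped region virtual object brushing a background virtual object.
glove = VirtualObject("stimulation_unit_10", center=(0.30, 1.10, 0.50), radius=0.05)
car_body = VirtualObject("background_object_300", center=(0.32, 1.12, 0.52), radius=0.02)
print(stimulation_command(glove, car_body))   # -> ('stimulation_unit_10', 1.0)
```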
`
`
`
`
`
`-20-
`
`
`
`
`
`Petition for Inter Partes Review of
`U.S. Patent No. 10,664,143 B2
`
`
`Nogami discloses that the first interactive content also “compris[es] a virtual
`
`environment,” as claimed. This is because images of the virtual objects may also
`
`be combined with images of the user’s real-world surroundings (captur