Case 6:21-cv-00755-ADA Document 70-4 Filed 06/10/22 Page 1 of 21
EXHIBIT KK

Gentex Corporation and Indigo Technologies, LLC (collectively, “Gentex”) presently contend that Facebook, Inc. and Facebook Technologies, LLC (collectively, “Facebook”) infringe claim 1 (the “Asserted Claim”) of U.S. Patent No. 8,224,024, directly and/or indirectly, either literally or under the doctrine of equivalents. This chart sets forth Gentex’s preliminary infringement contentions relating to the Asserted Claim and the accused products, i.e., the Oculus Rift S, Oculus Quest, and Oculus Quest 2 (collectively, the “Accused Products”). In the event Facebook releases new products or services that infringe the ’024 patent, or further investigation reveals that other products or services infringe the ’024 patent, Gentex reserves the right to update these contentions as appropriate under the Order Governing Proceedings.

These contentions articulate the structure and acts that constitute direct and/or indirect infringement of the ’024 patent and identify specifically where each element of each asserted claim is found within each Accused Product. Exemplary references to publicly available information concerning the Accused Products are provided where appropriate. Exemplary references to specific Accused Products are not intended and should not be read to exclude Accused Products not exemplified. On information and belief, the Accused Products are materially the same with respect to the claims of the ’024 patent discussed below. This disclosure is not intended to describe all acts of direct, induced, or contributory infringement Facebook has committed and continues to commit by making, using, selling, providing, developing, installing, testing, deploying, and/or directing the use of the Accused Products by customers and end users. The parties have not engaged in any discovery. The parties also have not discussed proposed constructions for, and the Court has not yet construed, any of the claims of the ’024 patent. As a result, and consistent with the Order Governing Proceedings, Gentex reserves the right to modify, amend, or otherwise supplement these initial infringement contentions as discovery and the pre-trial phase of the litigation proceed and as additional information comes to light, including with respect to which claims Gentex is asserting, the infringement analysis for one or more of the claims, and whether and how limitations of one or more claims are met literally or under the doctrine of equivalents.

U.S. Patent 8,224,024

Claim 1

Claim Limitation:
(1pre) A method comprising obtaining a camera image from a camera and processing said camera image in a data processor by computing the spatial location and azimuth of an object from the locations, in said camera image, of exactly two points on the object, and information about an orientation of the object,

Accused Products:
Facebook encourages, directs, or promotes users to use the Accused Products to carry out the claimed method, and Facebook performs the claimed method, as set forth below. For example, Facebook obtains a camera image, and encourages users to use the Accused Products to obtain a camera image from a camera and process said camera image in a data processor (e.g., in the Oculus Quest or Quest 2 headset processor or in a desktop processor for the Oculus Rift S) by computing the spatial location and azimuth of an object from the locations, in said camera image, of exactly two points on the object (e.g., the locations of two LED markers on an Oculus Touch Controller), and information about an orientation of the object (e.g., information about orientation from an Oculus Touch Controller’s inertial measurement unit, or “IMU”), and generating one or more signals representative of the location and azimuth of the object, wherein computing the azimuth of the object comprises the steps below. The Accused Products are especially adapted to carry out this method, which is a material part of the claimed invention, and have no substantial noninfringing uses.
See also From the Lab, Sensor Placement at 0:23.

Claim Limitation:
(1a) receiving coordinate information for images, on an imaging device of a camera, of two points on an object,

Accused Products:
Facebook encourages, directs, or promotes users to use the Accused Products to receive coordinate information for images, on an imaging device of a camera (e.g., the cameras on the headset), of two points on an object (e.g., two LED markers on an Oculus Touch Controller (for example, “it is fairly common for cameras to only see 3, 2 or even 1 LED(s) at a time,” LED Matching)), and Facebook performs such step itself. The Accused Products are especially adapted to carry out this method, which is a material part of the claimed invention, and have no substantial noninfringing uses. Further, on information and belief, Facebook conditions a user’s use of the Accused Products, and therefore the user’s receipt of the benefits of the Accused Products, upon this method and establishes the manner or timing of that use (e.g., through its software and/or user instructions, which have not been provided at this stage of the litigation).

`
To the extent this limitation is not met literally, the Accused Products also satisfy this limitation under the doctrine of equivalents. Any difference between the Accused Products and the claim element is insubstantial.

See, e.g., Oculus Quest 2 Instructions.

See also From the Lab.

[Image: FB Sensor Placement, Tech@Facebook]

See also Lang.

[Image courtesy BadVR, Jad Meouchy]

Around the mainboard we can also see the headset's four cameras mounted at very purposeful angles at the corners. The cameras are essential to enabling 6DOF tracking on both the headset and the controllers; their views are also merged together to allow a pass-through vision mode on the headset which is used to trace the boundary of your playspace.

See also Heaney.

A function for infrared LED calibration exists, suggesting this controller is optically tracked in the same way as the current Touch — cameras on the headset follow the movement of the LED constellation, and this is fused with the accelerometer readings to achieve sub-mm precision.

See also Heaney.

The driver also reveals the series model number of the controller's inertial measurement unit (IMU) - the chip within all VR controllers which contains the accelerometer.

Teardowns and the FCC filings for the current Touch showed it uses TDK’s ICM-20601 IMU from late 2015.

See also ICM-20601 Specification.

FEATURES

• 3-Axis Gyroscope with Programmable FSR of ±500dps, ±1000dps, ±2000dps and ±4000dps
• 3-Axis Accelerometer with Programmable FSR of ±4g, ±8g, ±16g, and ±32g
• User-programmable interrupts
• Wake-on-motion interrupt for low power operation of applications processor
• 512 byte FIFO buffer enables the applications processor to read the data in bursts
• On-Chip 16-bit ADCs and Programmable Filters
• Host interface: 8 MHz SPI or 400 kHz Fast Mode I²C
• Digital-output temperature sensor
• VDD operating range of 1.71 to 3.45V
• MEMS structure hermetically sealed and bonded at wafer level
• RoHS and Green compliant

See also LED Matching.

Computing more with less data

Theoretically, given only one camera image, you need at least three LEDs to be in view to solve for the controller's pose. However, utilizing only three points leads to multiple possible solutions, therefore we require at least four correct matches to robustly solve for the pose.

It is fairly common for cameras to only see 3, 2 or even 1 LED(s) at a time, so we designed solvers that could use other information and work with fewer LEDs in view. In turn, this included the following new solvers which allow us to track in those particularly challenging orientations:

P2P pose solver
• Uses 2 matches and prior pose orientation information to solve for position component of the pose.
• Reduces minimum matching requirement to 3 matches (2 hypothesis generating matches and 1 validating match)

P1P pose solver
• Uses the predicted pose to validate matches directly, instead of validating via statistical or nearest neighbor predictions.
• Reduces minimum matching requirement to 2 matches (Ensures translation and scale is constrained properly for stereo-pose optimization)
• Uses position-only stereo-pose optimization in the case of ≤ 4 LEDs to avoid under-constraining orientation.

After extensive experimentation, we discovered that both P2P and P1P solvers require very accurate prior information (good tracking state and accurate prediction), as they rely on the predicted pose as a hard constraint in solving the problem.

See also Andrew Melim, Increasing Fidelity with Constellation-Tracked Controllers, https://developer.oculus.com/blog/increasing-fidelity-with-constellation-tracked-controllers/ (Sept. 20, 2019).

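For illustration only, the P2P idea described in the quoted passage (solving the position component of a pose from two image matches, given a prior orientation such as one from an IMU) can be sketched as follows. This is not Facebook's code; the function, its parameters, and the pinhole-camera setup are illustrative assumptions.

```python
import numpy as np

def p2p_position(R, model_pts, pixels, K):
    """Illustrative P2P-style solve: recover the camera-frame position t of
    an object from exactly two image points, given a known orientation R
    (e.g., from an IMU prior).  Each observed point gives
    cross(ray_i, R @ P_i + t) = 0, which is linear in t; two points
    over-determine t, solved here by least squares."""
    K_inv = np.linalg.inv(K)
    A_rows, b_rows = [], []
    for P, uv in zip(model_pts, pixels):
        ray = K_inv @ np.array([uv[0], uv[1], 1.0])   # back-projected viewing ray
        rx = np.array([[0.0, -ray[2], ray[1]],        # cross-product matrix [ray]x
                       [ray[2], 0.0, -ray[0]],
                       [-ray[1], ray[0], 0.0]])
        A_rows.append(rx)                             # [ray]x @ t = -[ray]x @ (R @ P)
        b_rows.append(-rx @ (R @ P))
    A = np.vstack(A_rows)
    b = np.concatenate(b_rows)
    t, *_ = np.linalg.lstsq(A, b, rcond=None)
    return t
```

With noise-free synthetic projections of two model points, the least-squares system is exactly consistent and recovers the true translation.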
Challenges

The key underlying principle that drives Constellation tracking is the detection and triangulation of infrared LEDs within camera images which have extremely short exposure times. For each controller, the tracking system attempts to solve for the 3D pose, which is the 3D position and orientation. Within each frame, the system executes the following steps:

• Search camera images for bright volumes of infrared light
• Determine a matching scheme between image projections and the underlying 3D model of the controller
• Compute the 3D pose of the controller with respect to the headset and fuse with inertial data
See also From the Lab.

There are other complications, too. The infrared LEDs in the two hand controllers drastically change appearance when they move closer or farther away from the headset as you swing a virtual sword or maneuver a virtual spaceship. Oculus Insight also uses other sensors, drawing acceleration and velocity data from the inertial measurement units (IMUs) located in the headset and controllers. The system must process all of these data points in real time and, in the case of Quest, on a mobile chipset.

See also From the Lab.

Taking SLAM technology...

The foundation of Oculus Insight's inside-out tracking is simultaneous localization and mapping, or SLAM, which uses computer vision (CV) algorithms to essentially fuse incoming data from multiple sensors in order to fix the position of an object within a constantly updated digital map. SLAM has been used in robotics and in AR camera effects on smartphones and was demoed in the Oculus Santa Cruz VR headset prototype in 2016. But Oculus Insight required an unprecedented level of precision and efficiency, and that meant adapting the latest research on tracking and computer vision.

“A lot of these technologies really start in academia — inside the lab," Kozminski notes. It's no coincidence, then, that she’s part of Facebook's Zurich-based team of engineers, many of whom came from Zurich Eye — a joint program from the prestigious ETH University and University of Zurich that researched self-navigating systems.

To build a new, more advanced version of SLAM, the engineering team drew from Facebook's years of AI research and engineering work, building systems to understand the objects and actions that appear in videos and creating highly efficient computer vision algorithms that work well on mobile devices.

See also Compare Headsets, https://www.oculus.com/compare/?products=quest%2Cquest-2.

Oculus Quest
All-In-One VR Gaming

TRACKING

Six Degrees of Freedom

With 6DOF, the headset tracks the movement of both your head and body, then translates them into VR with realistic precision. No external sensors required.

CONTROLLERS

Touch Controllers

Oculus Touch controllers transport your hands, gestures and interactions into VR with intuitive accuracy.

Oculus Quest 2
Advanced All-In-One VR Gaming

Starting At $299 USD

TRACKING

Six Degrees of Freedom

With 6DOF, the headset tracks the movement of both your head and body, then translates them into VR with realistic precision. No external sensors required.

CONTROLLERS

Redesigned Touch Controllers

Quest 2 Touch controllers have been upgraded with improved ergonomics. A new thumb rest adds stability when needed.

See also Oculus Rift S.

[Oculus Rift S product page: PC-Powered VR Gaming; $399 USD; View PC requirements; Improved Optics; Ergonomic Design; Oculus Touch Controllers - "Your slashes, throws and grabs appear in VR with realistic precision."]

See also From the Lab.

“We wanted to create a system that lets you move and explore a VR world just as naturally and easily as you would in real life,” says Kozminski.

Kozminski joined a team whose mission was to create the first full-featured “inside-out” tracking system for a consumer VR device. The technology would have to track the full range of a person’s movements (known as six degrees of freedom) and be able to pinpoint the location of the two handheld controllers as well as the headset.

Previously, VR devices relied on external sensors to track these movements. These cameras attach to a PC, and while they work well, they make VR less portable and more complicated to set up.

“With inside-out tracking in the headset, VR becomes as easy as putting on headphones to listen to music,” says Kozminski.

See also Powered by AI.

See also From the Lab, Sensor Placement at 0:23.

Claim Limitation:
(1b) receiving pitch information from a sensor on the object,

Accused Products:
Facebook encourages, directs, or promotes users to use the Accused Products to receive pitch information from a sensor on the object (e.g., to receive pitch information from the Oculus Touch Controller IMU), and Facebook performs such step itself. The Accused Products are especially adapted to carry out this method, which is a material part of the claimed invention, and have no substantial noninfringing uses. Further, on information and belief, Facebook conditions a user’s use of the Accused Products, and therefore the user’s receipt of the benefits of the Accused Products, upon this method and establishes the manner or timing of that use (e.g., through its software and/or user instructions, which have not been provided at this stage of the litigation).

To the extent this limitation is not met literally, the Accused Products also satisfy this limitation under the doctrine of equivalents. Any difference between the Accused Products and the claim element is insubstantial.

See, e.g., LED Matching.

See also Andrew Melim, Increasing Fidelity with Constellation-Tracked Controllers, https://developer.oculus.com/blog/increasing-fidelity-with-constellation-tracked-controllers/ (Sept. 20, 2019).

See also From the Lab.

There are other complications, too. The infrared LEDs in the two hand controllers drastically change appearance when they move closer or farther away from the headset as you swing a virtual sword or maneuver a virtual spaceship. Oculus Insight also uses other sensors, drawing acceleration and velocity data from the inertial measurement units (IMUs) located in the headset and controllers. The system must process all of these data points in real time and, in the case of Quest, on a mobile chipset.

See also Compare Headsets.

Six Degrees of Freedom

With 6DOF, the headset tracks the movement of both your head and body, then translates them into VR with realistic precision. No external sensors required.

CONTROLLERS

Touch Controllers

Oculus Touch controllers transport your hands, gestures and interactions into VR with intuitive accuracy.

TRACKING

Six Degrees of Freedom

With 6DOF, the headset tracks the movement of both your head and body, then translates them into VR with realistic precision. No external sensors required.

CONTROLLERS

Redesigned Touch Controllers

Quest 2 Touch controllers have been upgraded with improved ergonomics. A new thumb rest adds stability when needed.

See also Oculus Rift S.

[Diagram from Powered by AI: Oculus Rift S - Headset, Two Touch Controllers]

See also Powered by AI.

Academic research has been done on SLAM techniques for several decades, but the technology has only recently become mature enough for consumer applications, such as driverless cars and mobile AR apps. Facebook previously released a version of SLAM for AR on mobile devices which uses a single camera and inertial measurement unit (IMU) to track a phone's position and enable world-locked content — content that's visually anchored to real objects in the world. Oculus Insight is the second generation of this library, and it incorporates significantly more information from a combination of multiple IMUs and ultra-wide-angle cameras, as well as infrared LEDs to jointly track the 6DoF position of a VR headset and controllers.

The Oculus Insight system uses a custom hardware architecture and advanced computer vision algorithms — including visual-inertial mapping, place recognition, and geometry reconstruction — to establish the location of objects in relation to other objects within a given space. This novel algorithm stack enables a VR device to pinpoint its location, identify aspects of room geometry (such as floor location), and track the positions of the headset and controllers with respect to a 3D map that is generated and constantly updated by Insight. The data used for this process comes from three types of sensors built into the Quest and Rift S hardware:

1. Linear acceleration and rotational velocity data from IMUs in the headset and controllers are integrated to track the orientation and position of each with low latency.

2. Image data from cameras in the headset helps generate a 3D map of the room, pinpointing landmarks like the corners of furniture or the patterns on your floor. These landmarks are observed repeatedly, which enables Insight to compensate for drift (a common challenge with IMUs, where even tiny measurement discrepancies build up over time, resulting in inaccurate location tracking).

3. Infrared LEDs in the controllers are detected by the headset cameras, letting the system bound the controller position drift caused by integrating multiple IMUs.

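The interplay of sensor types 1 and 3 described above (IMU integration that drifts without bound, kept in check by optical LED fixes) can be illustrated with a deliberately simplified one-dimensional sketch. The gain value and bias model are assumptions for illustration, not a description of Facebook's algorithm.

```python
def integrate_imu(pos, vel, accel, dt):
    """Dead-reckon 1-D position/velocity from a (possibly biased) IMU
    acceleration sample; a constant bias makes the error grow without bound."""
    vel = vel + accel * dt
    pos = pos + vel * dt
    return pos, vel

def optical_correction(pos, led_fix, gain=0.2):
    """Pull the integrated position toward an optical (LED-based) fix,
    bounding the drift accumulated by IMU integration."""
    return pos + gain * (led_fix - pos)

def simulate(steps=1000, dt=0.01, bias=0.01, use_leds=True):
    """Controller is truly stationary at 0; the IMU reports only its bias."""
    pos, vel = 0.0, 0.0
    for _ in range(steps):
        pos, vel = integrate_imu(pos, vel, bias, dt)
        if use_leds:
            pos = optical_correction(pos, led_fix=0.0)
    return pos
```

Running the simulation with and without the optical fix shows the qualitative effect: pure integration drifts roughly quadratically, while the corrected estimate stays near the true position.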
See also id.

At last year’s Oculus Connect event we shared some details about Oculus Insight, the cutting-edge technology that powers both Quest and Rift S. Now that both of those products are available, we're providing a deeper look at the AI systems and techniques that power this VR technology. Oculus Insight marks the first time that fully untethered six-degree-of-freedom (6DoF) headset and controller tracking has shipped in a consumer AR/VR device. Built from the ground up, the Insight stack leverages state-of-the-art computer vision (CV) systems and visual-inertial simultaneous localization and mapping, or SLAM.

See also From the Lab.

“We wanted to create a system that lets you move and explore a VR world just as naturally and easily as you would in real life,” says Kozminski.

Kozminski joined a team whose mission was to create the first full-featured "inside-out" tracking system for a consumer VR device. The technology would have to track the full range of a person’s movements (known as six degrees of freedom) and be able to pinpoint the location of the two handheld controllers as well as the headset.

Previously, VR devices relied on external sensors to track these movements. These cameras attach to a PC, and while they work well, they make VR less portable and more complicated to set up.

“With inside-out tracking in the headset, VR becomes as easy as putting on headphones to listen to music,” says Kozminski.

See also Powered by AI.

[Diagram: Headset tracking compute architecture - mapper thread, tracker thread, IMU thread]

Oculus Insight processes multiple threads of data at once, in real-time — the mapper thread modifies the map, sending updated copies to the tracker thread, which uses camera frames to estimate poses in the mapper-provided frames, while the IMU thread uses measurements from the IMUs to update the latest SLAM state.

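The threading model in the caption above (a mapper publishing map copies to a tracker that estimates a pose per camera frame) can be illustrated with a minimal queue-based sketch. This is illustrative only: the thread names and message types are assumptions, and the IMU thread and DSP pipeline of the real system are not modeled.

```python
import threading
import queue

def mapper(map_updates, n_versions):
    """Mapper thread: periodically publishes updated copies of the map."""
    for version in range(1, n_versions + 1):
        map_updates.put(version)

def tracker(frames, map_updates, poses):
    """Tracker thread: estimates a pose per camera frame using the newest
    mapper-provided map it has received so far."""
    latest_map = 0
    while True:
        frame = frames.get()
        if frame is None:                 # sentinel: shut down
            break
        while not map_updates.empty():    # adopt the newest published map
            latest_map = map_updates.get()
        poses.append((frame, latest_map))

def run(n_frames=5, n_versions=3):
    frames, map_updates, poses = queue.Queue(), queue.Queue(), []
    t_map = threading.Thread(target=mapper, args=(map_updates, n_versions))
    t_trk = threading.Thread(target=tracker, args=(frames, map_updates, poses))
    t_map.start()
    t_trk.start()
    for f in range(n_frames):
        frames.put(f)
    frames.put(None)
    t_map.join()
    t_trk.join()
    return poses
```

Because the mapper is the single producer of map versions, the version used by the tracker can only move forward regardless of how the threads interleave.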
See also From the Lab, Sensor Placement at 0:30.

[Image: FB Sensor Placement, Tech@Facebook - Controller IMU]

See also Heaney.

The driver also reveals the series model number of the controller's inertial measurement unit (IMU) - the chip within all VR controllers which contains the accelerometer.

Teardowns and the FCC filings for the current Touch showed it uses TDK’s ICM-20601 IMU from late 2015.

See also ICM-20601 Specification.

Claim Limitation:
(1c) using the coordinate information and the pitch information to obtain candidate values for the azimuth of the object,

Accused Products:
Facebook encourages, directs, or promotes users to use the Accused Products to use the coordinate information and the pitch information to obtain candidate values for the azimuth of the object, and Facebook performs such step itself. For example, on information and belief, and subject to discovery which has not yet occurred, the coordinate information from the LED markers and pitch information from the Oculus Touch Controller IMU are fused at the headset to obtain candidate values for the azimuth of the Oculus Touch Controller (e.g., using the “P2P pose solver”). The Accused Products are especially adapted to carry out this method, which is a material part of the claimed invention, and have no substantial noninfringing uses. Further, on information and belief, Facebook conditions a user’s use of the Accused Products, and therefore the user’s receipt of the benefits of the Accused Products, upon this method and establishes the manner or timing of that use (e.g., through its software and/or user instructions, which have not been provided at this stage of the litigation).

To the extent this limitation is not met literally, the Accused Products also satisfy this limitation under the doctrine of equivalents. Any difference between the Accused Products and the claim element is insubstantial.

See, e.g., LED Matching.
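For illustration of what "candidate values for the azimuth" can mean geometrically, the following sketch assumes a simplified orthographic camera looking along the y-axis and two object points separated by a known distance. With pitch known from the IMU, the measured image-plane x-separation of the two points leaves exactly two candidate azimuths, +a and -a. The camera model, names, and parameters are illustrative assumptions, not a description of Facebook's implementation.

```python
import math

def candidate_azimuths(dx_image, length, pitch):
    """Two object points separated by `length` lie along the direction
    (cos(pitch)*cos(az), cos(pitch)*sin(az), sin(pitch)) and are viewed by an
    orthographic camera whose image plane is x-z.  The measured x-separation
    satisfies dx = length * cos(pitch) * cos(az), so the coordinate and pitch
    information yield two candidate azimuth values, +az and -az."""
    c = dx_image / (length * math.cos(pitch))
    c = max(-1.0, min(1.0, c))   # clamp numerical noise into acos's domain
    az = math.acos(c)
    return az, -az
```

A later step (such as the validating match described in the P2P excerpt above) would be needed to select between the two candidates.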