`
`Exhibit 5
`
`
`
`Case 2:21-cv-00040-JRG Document 145-4 Filed 12/03/21 Page 2 of 17 PageID #: 5778
`
`IN THE UNITED STATES DISTRICT COURT
`FOR THE EASTERN DISTRICT OF TEXAS
`MARSHALL DIVISION
`
`GESTURE TECHNOLOGY
`PARTNERS, LLC,
`
`Plaintiff,
`
v.
`
`HUAWEI DEVICE CO., LTD.,
HUAWEI DEVICE USA, INC.,
`
`CASE NO. 2:21-cv-00040-JRG
`(Lead Case)
`
JURY TRIAL DEMANDED
`
`Defendants.
`
`
`
`GESTURE TECHNOLOGY
`PARTNERS, LLC,
`
`-
`Plaintiff,
`
`¥:
`
SAMSUNG ELECTRONICS CO., LTD.
AND SAMSUNG ELECTRONICS
AMERICA, INC.,
`
CASE NO. 2:21-cv-00041-JRG
(Member Case)
`
`JURY TRIAL DEMANDED
`
`Defendants.
`
`
`CORRECTED EXPERT REPORT OF DR. ROBERT STEVENSON REGARDING
`INVALIDITY
`
`
`Robert Stevenson, Ph.D.
`
`Date
`
`
`
`
reflections of laser light from an object for purposes of controlling a device were known since the 1970s. Pryor Tr. Vol. 1 at 121:6-130:3:
`
`-60-
`The Corrected Expert Report of Robert Stevenson, Ph.D. on the Invalidity of U.S. Patent Nos.
`7,933,431, 8,194,924, 8,553,079, and 8,878,949
`
`
`
`
`
`
158. Additionally, the following technologies and references were part of the state of the art in 1998-1999, and would have been known to a person of skill in the art at the time of the alleged inventions:

• Cellular phones, e.g., Nokia 5160

• Cellular camera phones, e.g., Novarro, Today’s Phones Do Everything, San Diego Union-Tribune (Aug. 23, 1998) (describing the Sony Cosm)

• Cameras of varying resolution, including CCD and CMOS detectors, e.g., Nokia 5160, Motorola STAR TAC

• Electro-optical sensors, e.g., ’924 Patent, 11:57-64

• TV cameras, e.g., ’924 Patent, 11:57-64

• Stereo and 3D cameras, e.g., MDScope, Speech/Gesture Interface to a Visual Computing Environment for Molecular Biologists

• Video cameras, e.g., MERL, Computer Vision for Interactive Computer Graphics

• Microprocessors, e.g., Intel 80286; see also MERL, Computer Vision for Interactive Computer Graphics

• Handheld processing personal assistant devices, e.g., the Apple Newton and Palm Pilot

• Handheld electronic gaming devices, e.g., Nintendo Gameboy

• Display screens, e.g., on an Apple Newton, Palm Pilot, Nintendo Gameboy
`
`
`
`
`
`
`
`
`
• User interface devices, e.g., personal computer, Apple Newton, Palm Pilot, Nintendo Gameboy

• Data transceivers, e.g., Nokia 5160 data modem

• Data transfer and exchange, e.g., over the internet

• Light sources, including LEDs, e.g., U.S. Patent 6,144,366 (“Numazaki”)

• Computer vision techniques, including object detection and recognition, e.g., MERL, Computer Vision for Interactive Computer Graphics

• Hand and finger gesture detection and recognition techniques, e.g., MERL, Computer Vision for Interactive Computer Graphics

• Computer imaging and digital image capturing, e.g., MERL, Computer Vision for Interactive Computer Graphics

• Hand gestures to control camera zooming, e.g., U.S. Patent No. 6,351,222 (“Swan”)

• Hand gestures that control a CD player, a computer video game, a computer mouse, or a robot, e.g., MERL, Visual Interpretation of Hand Gestures for Human-Computer Interaction: A Review, Tbl. 1

• Digital cameras capable of detecting gestures, e.g., U.S. Patent Publication 2003/0189658 (“Morris”)

• Control of mobile/cellular phones using gestures, e.g., JPH1144703A (“Fujimoto”)

• Triggering the start of camera recording using an open eye gesture, e.g., JPH0736610A (“Kishi”)

• Fingerprint readers, including portable fingerprint readers, e.g., U.S. Patent No. 5,222,152 (“Fishbine”), CA 2,304,560 (“Diehl”)

• Iris recognition for security or payment authorization, e.g., U.S. Patent No. 4,641,349 (“Flom”)

• Wearable devices with cameras and sensors for use in home automation, medical monitoring, and sign language recognition, e.g., Starner et al., The Gesture Pendant: A Self-illuminating, Wearable, Infrared Computer Vision System for Home Automation Control and Medical Monitoring.
`
`
`-62-
`The Corrected Expert Report of Robert Stevenson, Ph.D. on the Invalidity of U.S. Patent Nos.
`7,933,431, 8,194,924, 8,553,079, and 8,878,949
`
`
`
`
`
`
asserted patents. A POSITA would not have understood from the disclosures of the asserted patents that they cover the specific accused technology described in these patents, such as recognizing an iris using infrared sensors, recognizing a face, keeping a display on based on the presence of a face or eyes, or replicating the movements or facial expressions of a person in a 3D avatar. This evidence further shows that the disclosures of using cameras in the ’949 Patent do not sufficiently describe the use of a more general “electro-optical sensor,” unlike the use of infrared sensors in Samsung’s patented iris scanning technology. It also shows that the disclosures in the ’431 Patent do not sufficiently describe a handheld device that can itself be controlled based upon a detected movement or position of a user of the device or an object being held or positioned by a user of the device, unlike what is alleged to satisfy the claims in the various accused Samsung applications. This evidence shows that the ’924 Patent fails to disclose the wide range of control functions for a handheld device based on any camera output that is claimed, unlike what is alleged to satisfy the claims in the various accused Samsung applications. That GTP now accuses such technology as within the scope of the claims is evidence that the broad scope of the claims is incommensurate with what is disclosed in the asserted patents.
`
`XIV. INVALIDITY UNDER 35 U.S.C. § 101
`
`328.
`
`I have reviewed Defendants’ Subject Matter Eligibility Contentions and agree with
`
`their characterization of the focus of the claims for each Asserted Patent. It is also my opinion that
`
`the asserted claims do not recite any technological improvement, but instead recite than well-
`
`known, conventional, and routine elements. It is therefore my opinion that the Asserted Claims of
`
`the Asserted Patents are invalid under 35 U.S.C. § 101.
`
329. With respect to the ’431 Patent, a POSITA would have understood the focus of the claims to be taking action based on an observed movement or position. The claims recite a generic handheld computer apparatus with a generic sensor or camera and a computer for analyzing the output of the sensor or camera to determine position or movement information. The patent does not purport to invent a new handheld device. Rather, the patent explains that the handheld device may be a cellphone, a device that was well-known in the art. ’431 Patent at 11:62-67. The patent itself explains that calculating the position or movement information of an object based on camera output was well-known. For example, the patent explains that “a subtraction of successive images can aid in identifying zones in an image having movement of features as is well known.” ’431 Patent at 6:66-7:1. The patent also describes using other known algorithms for calculating the position of an object based on the output of a camera:

    For example, the system described above for FIGS. 1 and 2 involving the
    photogrammetric resolution of the relative position of three or more known
    target points as viewed by a camera is known and is described in a paper
    entitled “A Single Camera Method for the 6-Degree of Freedom Sprung Mass
    Response of Vehicles Redirected by Cable Barriers” presented by M. C. van
    Wijk and H.F.L. Pinkney to The Society of Photo-optical Instrumentation
    Engineers.

’431 Patent at 4:20-28.
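The frame-subtraction approach the patent characterizes as well known can be sketched in a few lines. This is an illustrative reconstruction, not code from the patent or any cited reference; the brightness threshold is an invented value.

```python
import numpy as np

def motion_zones(prev_frame, curr_frame, threshold=25):
    """Subtract successive grayscale frames and flag pixels whose
    brightness changed by more than a threshold -- the "subtraction
    of successive images" described as a well-known way to identify
    zones of movement. The threshold of 25 is an assumption."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > threshold  # boolean mask of zones with movement

# Toy example: one pixel brightens sharply between two 4x4 frames.
prev = np.zeros((4, 4), dtype=np.uint8)
curr = prev.copy()
curr[1, 2] = 200
mask = motion_zones(prev, curr)
print(int(mask.sum()))  # -> 1
```

The signed 16-bit cast avoids unsigned-integer wraparound when the current frame is darker than the previous one.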
`
`
`330.
`
`In overcoming a written description rejection during prosecution of the parent
`
`application, the applicant admitted that the principles behind the patent’s handheld image analysis
`
`technology had been known since 1980:
`
`In particular, the examiner asserted that the steps of ‘processing said
`camera image to determine location of points in said image’ and
`‘using said determined location, determining the orientation of said
`handheld device’ did not comply with the written description
`requirement. … It is believed that the first patent disclosing the
`principles behind the above noted steps was USP 4219847 issued
`August 1980 to Pinkney et al., a patent which is incorporated by
`reference in the present application (see page 3, line 8). The
`principles disclosed therein (having been used with the robot arm of
`the NASA space shuttle!) are now well known
`in
`the
`
`
`-129-
`The Corrected Expert Report of Robert Stevenson, Ph.D. on the Invalidity of U.S. Patent Nos.
`7,933,431, 8,194,924, 8,553,079, and 8,878,949
`
`
`
`Case 2:21-cv-00040-JRG Document 145-4 Filed 12/03/21 Page 8 of 17 PageID #: 5784
`
`
`
`photogrammetry art, or more particularly in the field of computer
`vision, fields to which the present application applies.
`
`
`’710 App. File History, Applicant Remarks at 1-2 (July 20, 2009).
`
`
331. The patentee acknowledged that the ’431 Patent did not purport to invent any new technology for analyzing a camera image to determine position or movement information in the context of a handheld device, but instead admitted that technology that had been known for nearly two decades could be used. Indeed, the only handheld device embodiments do not provide any detail as to the image analysis used to determine position or movement information, leaving the reader to assume that the admittedly well-known image analysis technology was suitable. ’431 Patent at FIGS. 8A, 8B, 11:53-13:44. The claims amount to nothing more than applying well-known technology onto a generic handheld device (examples of which were well-known, e.g., cell phones, PDAs, etc.) for purposes of controlling the device. Thus, the focus of the claims is not on any specific technology, but rather on the idea of taking action based on an observed movement or position using well-known technology.
`
`332.
`
`Indeed, as I explained above in Section VI, each claimed element in the asserted
`
`claims of the ’431 Patent was well-known. As explained above, handheld devices with a computer
`
`and camera were well-known, and of camera images to determine position and movement of an
`
`object, including fingers were well-known. Using a camera to acquire images of a portion of a
`
`person has been a well-known and fundamental use of cameras since their advent. Sensing
`
`movement of two fingers or of one finger relative to another was well-known as was done in
`
`analyzing finger gesture. See, e.g., Numazaki at FIGS. 28-29, 31:11-26; US 20020057383A1 at
`
`FIG. 12, [0041]. As explained with the ’079 Patent, using a light source to assist with analysis of
`
`camera images was also well-known. Mack at 2:67-3:8; US5227985A at 5:10-34; Zeller et al, A
`
`
`-130-
`The Corrected Expert Report of Robert Stevenson, Ph.D. on the Invalidity of U.S. Patent Nos.
`7,933,431, 8,194,924, 8,553,079, and 8,878,949
`
`
`
`Case 2:21-cv-00040-JRG Document 145-4 Filed 12/03/21 Page 9 of 17 PageID #: 5785
`
`
`
`Visual Computing Environment for Very Large Scale Biomolecular Modeling, 1997 at 8.
`
`Controlling display functions and virtual images therein based on the analysis of a camera image
`
`was well-known. US20020057383A1 at [0033]; US5594469A at 5:6-23; US6498628B2 at 4:61-
`
`5:17; Freeman et al, Computer Vision for Interactive Computer Graphics, IEEE Computer
`
`Graphics and Applications, May/June 1998 at 42, 47, 61; Freeman et al, Television Control by
`
`Hand Gestures, TR94-24, December 1994 at 1-2; BECKMAN00000003 at pp. 7, 10;
`
`DEFTS_PA_00004994 at p. 964; DEFTS_PA_00005316 at p. 864. Obtaining three dimensional
`
`position and movement information of an object was also well-known. US6198485B1 at 5:27-35;
`
`US5454043A at 6:21-29; BECKMAN00000003 at p. 10; Gavrila, The Visual Analysis of Human
`
`Movement: A Survey, Gavrila, 73:1 Computer Vision and Image Understanding, pp. 82-998, Jan.
`
`1999 at p. 91; Wu & Huang, Vision-Based Gesture Recognition: A Review, 1739 Lecture Notes
`
`in Computer Science, Gesture-Based Communication in Human-Computer Interaction 103 at pp.
`
`106-08. Transmitting information from a handheld device to another device was well-known, and
`
`indeed, was a conventional feature of cell phones at the time. WO9726744 at p. 2; Game
`
`BoyColor_InstructionBooklet at ¶ 10; Game BoyNormal_TetrisInstructionBooklet at p. 7;
`
`Motorola_Accessories Guide: Cellular Modems. That cameras could operate at 30 frames per
`
`second was well-known. US5666157A at 4:19-38; US5982853A at 4:60-64. Determining
`
`pinching gestures or other expressions was also well-known. US6002808A at 6:30-41;
`
`US6351222B1
`
`at 3:15-26; US6128003A
`
`at 10:5-9; US6498628B2
`
`at 6:26-34;
`
`BECKMAN00000732 at p. 35; Wu & Huang, Vision-Based Gesture Recognition: A Review, 1739
`
`Lecture Notes in Computer Science, Gesture-Based Communication in Human-Computer
`
`Interaction 103 at pp. 108-09. The ordered combination of these well-known and conventional
`
`elements amounts to taking action based on an observed movement or position using well-known
`
`
`-131-
`The Corrected Expert Report of Robert Stevenson, Ph.D. on the Invalidity of U.S. Patent Nos.
`7,933,431, 8,194,924, 8,553,079, and 8,878,949
`
`
`
`Case 2:21-cv-00040-JRG Document 145-4 Filed 12/03/21 Page 10 of 17 PageID #: 5786
`
`
`
`technology.
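The kind of two-finger relative-position sensing discussed above, a pinch, for example, reduces to comparing tracked fingertip coordinates. The sketch below is a hypothetical illustration that assumes fingertip positions have already been extracted from camera images; the pixel threshold is an invented value, not one taken from any cited reference.

```python
import math

def is_pinch(thumb_xy, finger_xy, touch_distance=12.0):
    """Report a pinch when two tracked fingertips come within a
    (hypothetical) touch distance, in pixels, of each other."""
    dx = thumb_xy[0] - finger_xy[0]
    dy = thumb_xy[1] - finger_xy[1]
    return math.hypot(dx, dy) < touch_distance

print(is_pinch((100, 100), (105, 104)))  # fingertips nearly touching -> True
print(is_pinch((100, 100), (180, 60)))   # fingers far apart -> False
```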
`
333. And the claims provide no specific detail as to how the position and movement information is used to control the handheld device. Therefore, the claims of the ’431 Patent do not recite any specific technological solution.
`
334. With respect to the ’924 Patent, a POSITA would have understood the focus of the claims to be taking action based on an observation. The claims recite a generic handheld device with two cameras, oriented to view a user or an object other than the user and having non-overlapping fields of view, and a computer for controlling the device based on at least one of the first camera output and the second camera output. I understand GTP is contending that features that use only one of the camera outputs infringe the claims. Thus, according to GTP, the claims do not require using the second camera, in which case the claims amount to nothing more than controlling a device based on the output of a camera, where the device is a generic handheld device with well-known and generic components. As I explained above in Section VI, each of the claimed elements was well-known, and the ordered combination of the claim elements amounts to nothing more than observing a gesture using generic technology that was well-known at the time, without any specifics on how to observe a gesture from the output of a camera. Handheld devices comprising a computer and cameras were well-known in the art, and it was well-known that handheld devices could include two or more cameras. For example, as explained in further detail in the accompanying chart on the invalidity of the ’924 Patent over Mann, Mann discloses handheld and wearable devices with two cameras having non-overlapping fields of view. CA Patent No. 2,237,939A1 at FIGS. 1, 3, pp. 12-13, 18, 21.
`
335. The dependent claims add nothing further other than well-known and conventional technology. Cell phones were well-known in the art (e.g., Nokia 5160), including for determining the position or movement of an object. Liebermann at FIG. 6, 4:21-22, 5:62-67; US5982853A at 5:62-6:6, 6:42-52; Pryor Dep., 451:8-12, 453:21-23; Novarro, Today’s Phones Do Everything, San Diego Union-Tribune (Aug. 23, 1998) (describing the Sony Cosm); JPH1144703A. Using a camera to acquire images of a portion of a person or of an object has been a well-known and fundamental use of cameras since their advent. It was well-known in the art that a camera could be used to capture video of an object, as with well-known and conventional video cameras, including those in handheld devices. Mann at 11-14; Robb at 1, 5, 12. Determining gestures and facial expressions was also well-known in the art, whether performed by the user or another person, including on handheld devices. US5982853A at 5:62-6:6, 6:42-52; US5821922A at 5:1-40. Recognizing objects based on camera images was also well-known in the art. US5982853A at 5:62-6:10, 6:42-52; US5821922A at 5:61-6:17; WO9726744 at pp. 12-13. Determining the reference frame of an object was well-known in the art. US5856844A at 15:55-67; US6335985B1 at 3:41-50; US6351222B1 at 6:51-55. And transmitting information over an internet connection was well-known in the art, including in handheld devices. Mann at 13; Doi at FIGS. 13A, 13B, 13C, 11:60-67; US6401085 at Abstract, FIG. 17, 1:35-40, 2:19-39, 3:14-28.
`
336. The patent does not purport to invent a new handheld device or new computer vision technology, but instead simply uses existing handheld devices and computer vision technology to implement the idea of taking action based on an observation. And the claims do not provide any specific detail as to how to control the handheld device using camera outputs, and thus do not claim a specific technological improvement.
`
337. With respect to the ’079 Patent, a POSITA would have understood the focus of the claims to be observing a gesture. The claims recite a generic computer apparatus or method that uses a camera to observe gestures illuminated by a light source. As I explained above in Section VI, each of the claimed elements was well-known, and the ordered combination of the claim elements amounts to nothing more than observing a gesture using generic technology that was well-known at the time, without any specifics on how to observe a gesture from the output of a camera. Indeed, detecting gestures from the output of a camera using a computer was already conventional at the time of the invention. US20020057383A1 at [0032]; US6002808A at 4:57-5:11; US5982853A at 4:60-64; US6128003A at 5:8-12; US6351222B1 at 3:15-26; US6498628B2 at 4:49-60; US5594469A at 4:66-5:7; US5821922A at 5:61-6:17; Freeman et al., Computer Vision for Interactive Computer Graphics, IEEE Computer Graphics and Applications, May/June 1998 at pp. 42-45; Freeman et al., Computer Vision for Computer Games, TR96-35, October 1996 at 2; Freeman et al., Orientation Histograms for Hand Gesture Recognition, TR94-03, October 1994 at 1; Freeman et al., Television Control by Hand Gestures, TR94-24, December 1994 at 1; BECKMAN00000003 at 3; BECKMAN00000732 at p. 32; DEFTS_PA_00004610 at p. 678; Gavrila, The Visual Analysis of Human Movement: A Survey, 73:1 Computer Vision and Image Understanding, pp. 82-98, Jan. 1999 at 91; Wu & Huang, Vision-Based Gesture Recognition: A Review, 1739 Lecture Notes in Computer Science, Gesture-Based Communication in Human-Computer Interaction 103 at 106-07; Crowley, Vision for man-machine interaction, 19 Robotics and Autonomous Systems 347-358 (1997) at 31-32. The ’079 Patent itself provides no specifics as to how to determine a gesture from the output of a camera, but rather assumes it was well-known to a POSITA. And it was well-known that using a light source to illuminate the gesture would help the system detect the gesture from the camera output. Numazaki at Abstract, FIGS. 2, 74, 11:9-19, 50:29-62; US6198485B1 at 2:67-3:8; US5594469 at 3:7-9; BECKMAN00000003 at p. 8; BECKMAN00000732 at p. 32; Freeman Tr. 74:2-20; Freeman01 at or around 4:27; Freeman et al., Television Control by Hand Gestures, TR94-24, December 1994 at 4, 7. It was well-known that the light source could be positioned in different fixed locations. Freeman Tr. 74:2-20.
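The orientation-histogram representation used in the Freeman TR94-03 reference cited above can be sketched as follows. This is a minimal sketch of the core idea only (gradient orientations, weighted by edge strength, binned into a histogram) and omits the blurring and normalization details of the actual paper.

```python
import numpy as np

def orientation_histogram(image, bins=8):
    """Histogram of local gradient orientations, weighted by gradient
    magnitude -- the hand-gesture feature described in Freeman's
    orientation-histogram work. Minimal sketch, not the full pipeline."""
    gy, gx = np.gradient(image.astype(float))
    angles = np.arctan2(gy, gx)        # orientation in [-pi, pi]
    weights = np.hypot(gx, gy)         # weight by edge strength
    hist, _ = np.histogram(angles, bins=bins,
                           range=(-np.pi, np.pi), weights=weights)
    total = hist.sum()
    return hist / total if total else hist

# A vertical edge yields gradients concentrated in a single orientation bin.
img = np.zeros((8, 8))
img[:, 4:] = 1.0
h = orientation_histogram(img)
print(int(h.argmax()))  # -> 4: the bin containing orientation 0
```

Because zero-gradient pixels carry zero weight, flat background regions do not pollute the histogram.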
`
338. The dependent claims add nothing further other than well-known and conventional technology. For example, using light-emitting diodes as a light source, including a plurality of light-emitting diodes, was well-known in the art. Numazaki ’863 at FIG. 4, 14:63-64, 16:36-57, Claim 2; US5227985 at 4:19-31, 5:8-13, 8:38-42. It was well-known that sequential images of a camera had to be analyzed in order to detect movement, as is necessary to detect a gesture. US5821922A at 5:61-6:4; US5982853A at 4:60-64; US6128003A at 5:8-12; US6351222B1 at 6:29-3; US6498628B2 at 5:18-43; Freeman et al., Computer Vision for Interactive Computer Graphics, IEEE Computer Graphics and Applications, May/June 1998 at 42-45; Freeman et al., Orientation Histograms for Hand Gesture Recognition, TR94-03, October 1994 at 1; BECKMAN00000003 at p. 10; BECKMAN00000732 at p. 34; DEFTS_PA_00004610 at p. 688; DEFTS_PA_00004994 at p. 967; DEFTS_PA_00005316 at p. 865; Gavrila, The Visual Analysis of Human Movement: A Survey, 73:1 Computer Vision and Image Understanding, pp. 82-98, Jan. 1999 at 91-93. It was well-known that the three-dimensional position of a point on a user’s hands and fingers could be determined, and as explained above, the asserted patents do not recite any new technology for determining the position of an object, including three-dimensional position. US6198485B1 at 5:27-35; US5454043A at 6:21-29; BECKMAN00000003 at p. 10; Gavrila, The Visual Analysis of Human Movement: A Survey, 73:1 Computer Vision and Image Understanding, pp. 82-98, Jan. 1999 at 91; Wu & Huang, Vision-Based Gesture Recognition: A Review, 1739 Lecture Notes in Computer Science, Gesture-Based Communication in Human-Computer Interaction 103 at 106-08. And it was also well-known that a camera and light source could have been positioned in fixed relation relative to a keypad, and in fact such an implementation would have been necessary in the case of a laptop, where components would have to be fixed within the housing to maintain portability. Numazaki at FIG. 74, 50:29-43, 50:53-62; Mack at 3:9-10; US6580448B1 at 10:66-11:11; WO9921122 at p. 5.
`
339. The claims of the ’079 Patent do not recite any specific technological solution, but instead recite nothing more than observing a gesture using well-known and conventional technology, with no specifics as to how the technology is applied to observe a gesture.
`
340. With respect to the ’949 Patent, a POSITA would have understood the focus of the claims to be capturing an image based on an observed gesture. The claims recite a portable or image capture device with a processor that determines a gesture from the output of an electro-optical sensor and controls a digital camera to capture an image in response to the gesture. As explained above in Section VI, each of the claim elements was well-known at the time, and the ordered combination of the claim elements amounts to nothing more than capturing an image based on an observed gesture using generic technology that was well-known at the time, without any specifics on how to observe a gesture from an electro-optical sensor output or to control a digital camera to take a picture in response.
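The claim structure just described (sensor output, gesture determination, triggered capture) reduces to a short control loop once the gesture detector and camera are assumed. Everything below is a hypothetical stand-in; `detect_gesture` and `take_picture` are placeholders supplied by the caller, not APIs from any cited reference.

```python
def capture_on_gesture(sensor_frames, detect_gesture, take_picture):
    """Watch an electro-optical sensor feed and trigger the digital
    camera whenever the gesture detector fires. Both callables are
    hypothetical placeholders; the claims supply no specifics for them."""
    captured = []
    for frame in sensor_frames:
        if detect_gesture(frame):
            captured.append(take_picture())  # capture an image in response
    return captured

# Toy run: the "gesture" is any frame equal to 1; the "camera" returns a tag.
shots = capture_on_gesture([0, 1, 0, 1], lambda f: f == 1, lambda: "photo")
print(len(shots))  # -> 2
```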
`
341. Portable devices with a camera and a separate electro-optical sensor were well-known in the art at the time. US5856844A at Abstract, 18:41-49; US6580448B1 at 10:11-25. Taking a picture in response to a gesture was already conventional at the time. The patent explains that it was already conventional in the art to take pictures based on poses, and a POSITA would have known that it was similarly conventional to take pictures based on gestures: “Today, in a conventional context, one can as a photographer, choose to shoot a fashion model or other subject, and when you see a pose you like record the picture.” ’949 Patent at 7:57-59. The patent goes on to explain that it was desirable for one to do this on their own. ’949 Patent at 7:59-61. The patent then explains, in reference to Figure 3, an embodiment taking pictures in response to computer-detected gestures and poses. ’949 Patent at 7:64-8:38. The patent goes on to explain that the technology necessary for analyzing the image to determine if a gesture was performed was conventional: “This being the case, conventional 2D machine vision type image processing (e.g. ‘Vision Bloks’ software from Integral Vision Corp.) can be used to extract object features and their locations in the images retained.” ’949 Patent at 10:40-44. The patent also explains that the image analysis techniques required for the alleged invention were already known in the art, e.g., “as described in the Lobo et al patent reference.” ’949 Patent at 8:13-18. Thus, it is clear from the patent that no new technology is being advanced. Rather, the patent seeks only to allow users to act as their own photographer by claiming well-known technology for capturing an image based on an observed gesture.
`
342. The dependent claims add nothing further other than well-known and conventional technology. Claims 2 and 3 say nothing about the technology used, but simply specify the type of gesture detected. It was well-known in the art that gestures including hand motions or poses could be detected. US20020057383A1 at [0033]; US5982853A at 4:60-64; US6128003A at 5:8-12; US6351222B1 at 3:15-26; US6498628B2 at 4:61-64; US5454043A at 10:57-11:8; US5594469A at 5:6-10; JPH473631 at pp. 1-10; US6002808A at 6:7-20; US5821922A at 4:55-67; Freeman et al., Computer Vision for Interactive Computer Graphics, IEEE Computer Graphics and Applications, May/June 1998 at 42-43; Freeman et al., Computer Vision for Computer Games, TR96-35, October 1996 at 2; Freeman et al., Orientation Histograms for Hand Gesture Recognition, TR94-03, October 1994 at 1; Freeman et al., Television Control by Hand Gestures, TR94-24, December 1994 at 1; BECKMAN00000003 at p. 3; BECKMAN00000732 at pp. 29, 32, 34-35; DEFTS_PA_00004610 at pp. 678, 686-88; Gavrila, The Visual Analysis of Human Movement: A Survey, 73:1 Computer Vision and Image Understanding, pp. 82-98, Jan. 1999 at 82, 84; Wu & Huang, Vision-Based Gesture Recognition: A Review, 1739 Lecture Notes in Computer Science, Gesture-Based Communication in Human-Computer Interaction 103 at 107-11; Crowley, Vision for man-machine interaction, 19 Robotics and Autonomous Systems 347-358 (1997) at 31. It was well-known in the art that an electro-optical sensor could be fixed in relation to a digital camera, including to take a picture of a subject in the same area as that in which a gesture is performed. Sears at 18:9-18, 22:1-3; US6198485B1 at 2:58-65; US6498628B2 at 8:24-37; US20020057383A1 at [0048-0049]; BECKMAN00000732 at p. 29; DEFTS_PA_00004994 at p. 967; DEFTS_PA_00005316 at p. 865. It was well-known in the art to include a light source facing the same direction as the electro-optical sensor and camera to illuminate the gesture being performed and the subject whose picture was being taken. US6198485B1 at 2:67-3:8; US6580448B1 at 10:11-25; BECKMAN00000003 at p. 8; BECKMAN00000732 at p. 31. Using a lower-resolution sensor for detecting gestures than the camera used to take a picture was also well-known in the art. Sears at 17:12-17, 18:39-45, 21:10-14, 27:39-46; Freeman et al., Computer Vision for Interactive Computer Graphics, IEEE Computer Graphics and Applications, May/June 1998 at 44-45; Freeman et al., Computer Vision for Computer Games, TR96-35, October 1996 at 1-3. Electro-optical sensors comprising CCD or CMOS detectors were well-known in the art. US5227985A at 5:25-35; US5666157A at 5:14-25; US5686942A at 4:62-5:1, 5:40-45; US6191773B1 at 4:31-45; US6580448B1 at 12:5-8; US5856844A at 2:56-3:5; WO9921122 at p. 4; Freeman et al., Computer Vision for Interactive Computer Graphics, IEEE Computer Graphics and Applications, May/June 1998 at 44-45; Freeman et al., Computer Vision for Computer Games, TR96-35, October 1996 at 1; Crowley, Vision for man-machine interaction, 19 Robotics and Autonomous Systems 347-358 (1997) at 31.
`
`
`-138-
`The Corrected Expert Report of Robert Stevenson, Ph.D. on the Invalidity of U.S. Patent Nos.
`7,933,431, 8,194,924, 8,553,079, and 8,878,949
`
`
`
`
`
`
343. The claims of the ’949 Patent do not recite any specific technological solution, but instead recite nothing more than capturing an image based on an observed gesture using well-known and conventional technology, with no specifics as to how the technology is applied to determine a gesture and to then capture an image based on that observed gesture.
`
344. Further evidence showing that each of the claim elements of each of the asserted patents was well-known in the art can be found in the invalidity charts accompanying my report.
XV. IMPROPER INVENTORSHIP
`
`345.
`
`I understand that if a patent does not reflect its true inventorship, it is invalid. I
`
`further understand that inventorship is a legal conclusion premised on underlying factual findings,
`
`and depends on claim construction, and that who should be listed on the face of a patent varies
`
`depending on what is claimed and what the Court determines the claim scope to be.
`
`346.
`
`I understand that the Patent Act allows a listing of inventors to be corrected either
`
`upon petition to the Director under 35 U.S.C. § 256(a) or upon court order, pursuant to 35 U.S.C.
`
`§ 256(b). I further understand that a patent cannot be invalidated if inventorship can be corrected
`
`instead.
`
`347.
`
`I have been advised that when an invention is made jointly, the joint inventors need
`
`not contribute equally to its conception. All that is required is that the joint inventor made a
`
`significant contribution to the conception or reduction to practice of the invention.
`
`348.
`
`
`
`
`
`
`
`It is my opinion that, absent a correction, the Asserted Claims of the Asserted Patents are invalid
`
`under § 102(f) for naming the wrong people as inventors.
`
`
`-139-
`The Corrected Expert Report of Robert Stevenson, Ph.D. on the Invalidity of U.S. Patent Nos.
`7,933,431, 8,194,924, 8,553,079, and 8,878,949
`
`