UNITED STATES PATENT AND TRADEMARK OFFICE
____________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
____________

APPLE INC.
Petitioner

v.

GESTURE TECHNOLOGY PARTNERS LLC
Patent Owner
____________

Inter Partes Review Case No. IPR2021-00923
U.S. Patent No. 8,194,924

SUPPLEMENTAL DECLARATION OF DR. BENJAMIN B. BEDERSON
I, Benjamin B. Bederson, hereby declare the following:

1. My name is Benjamin B. Bederson, Ph.D., and I am over 21 years of age and otherwise competent to make this Declaration. I make this Declaration based on facts and matters within my own knowledge and on information provided to me by others.
2. I submitted an initial declaration in support of Apple’s petition for Inter Partes Review of U.S. Patent No. 8,194,924 (“the ’924 Patent”). I understand that the PTAB instituted the requested review and that the proceeding involves the full scope of the proposed grounds addressed in my initial declaration. I have been asked to address a few additional issues in response to Patent Owner’s Response (Paper 12) and Patent Owner’s expert’s declaration (Ex. 2002).
I. A POSITA would have been motivated to combine Mann and Numazaki
3. I understand Patent Owner and its expert argue that it would not have been obvious to implement Numazaki’s no-touch gesture recognition technology upon Mann’s device because “physically interacting with a watch or PDA is what would be expected,” noting the long history of users physically interacting with those types of devices in order to control them. Paper 12, 11. The argument continues, alleging Mann’s touch-based gestures actually “provide a cover for the user to trigger the recording” while no-touch gesture recognition would be “more likely to intrigue the [target] subject and draw their attention.” Paper 12, 13-14; Ex. 2002, ¶¶ 49, 54, 55. I disagree.
4. Just the opposite is true. In my initial declaration, I explained that Mann’s goal is to capture a recording of a subject without drawing the subject’s attention. Ex. 1003, ¶ 47. Mann’s goal is to avoid “creat[ing] a visual disturbance to others and attract[ing] considerable attention on account of the gesture of bringing the camera up to the eye.” Ex. 1004, 1-2. I explained that physically touching Mann’s device “runs the risk of being noticed by the subject.” Ex. 1003, ¶ 47. Whether touching Mann’s device or bringing it up to the eye, in both circumstances the user risks drawing the subject’s attention by performing an action that the subject may recognize as interacting with the device. For example, when a user brings a camera to the eye, it is unavoidable that the subject will assume she is being recorded. Similarly, when the user physically interacts with the watch or PDA, it risks the subject recognizing that the user has in fact interacted with the device and may have initiated some process within the device (e.g., a recording). This is one of the key reasons a POSITA would have understood no-touch gestures draw less attention than Mann’s native touch-based gestures. Touch-based gestures are easily recognizable as the user interacting with the device, which is precisely why they draw more attention. When a user seeks to initiate a recording on a device without the subject knowing, she should avoid actions that suggest a function has been initiated on the device.
5. From the perspective of avoiding attention, Mann’s touch-based gestures improve upon raising a camera to one’s eye to capture video. But Mann’s touch-based gestures still depend on physical actions that a subject is likely to associate with the user interacting with and controlling the device. As Patent Owner and its expert admit, the proposed no-touch gestures have no such association. Paper 12, 11; Ex. 2002, ¶ 49 (“physically interacting with a watch or PDA is what would be expected, while no-touch gesture recognition is much more recent”). Accordingly, contrary to Patent Owner’s argument, a POSITA would have understood that using no-touch gestures as proposed is less likely to draw the subject’s attention to the fact that the user is interacting with an electronic device.
6. I understand Patent Owner argues Numazaki’s lighting unit would flicker when detecting gestures, drawing attention to it and undermining Mann’s intention to record covertly. Paper 12, 12; Ex. 2002, ¶ 51. I disagree. Mr. Occhiogrosso’s argument assumes the emitted light is visible to the human eye. Numazaki is unequivocal that it is not. Numazaki discloses “it is preferable to use a device that can emit the near infrared light which is invisible to the human eyes, as the lighting unit 101 . . . so that the target human being will not sense the glare of the light.” Ex. 1005, 12:1-6. Indeed, Mr. Occhiogrosso did not later dispute that Numazaki’s emitted light could be invisible to the human eye. Ex. 1019, 46:14-47:4. Accordingly, Numazaki’s use of invisible infrared light would not cause the lighting unit to flicker in a way perceptible to the recorded subject.
7. In response to an opinion I expressed in my initial declaration that Mann’s touch-based gestures would result in the user’s finger inadvertently touching the glass over the camera, reducing its fidelity over time, I understand Mr. Occhiogrosso opines that the user-facing camera would be left “untouched” because it is “separate” from and “above” the portion of the watch face with which the user interacts. Ex. 2002, ¶¶ 56-57. I disagree. Although Mann does illustrate physical separation between the camera and the area within which a user interacts, a POSITA would have understood that a number of factors support my conclusion. The close proximity of the camera and the area within which the user interacts means a user’s finger would need to stay precisely within the designated touch-based gesture area to avoid touching the camera. Given the very small space available, as discussed in detail below, I would expect a user’s finger to regularly extend beyond the gesture area, which means it will often touch the very nearby camera. Indeed, Mr. Occhiogrosso’s opinion assumes a level of precision with which a user interacts with the screen that is simply not realistic. Mann teaches that display 400—an area that contains the circle within which a user performs gestures—is only “0.7 inches on the diagonal.” Ex. 1004, 14. Accordingly, Mr. Occhiogrosso assumes that a user can precisely swipe her finger in this very small space, starting and stopping with sufficient precision to avoid going beyond the boundaries and touching the glass overlaying the camera. I disagree that such precision is possible.
8. This becomes apparent when considering entry of the symbol “#” on the clock face as described by Mann, which is entered as a vector using a finger stroke. Following the same examples Mann describes for entering the number “3” with a “stroke from left to right,” entering “#” requires starting a finger stroke at the 5 o’clock position and swiping across the clock face toward the 11 o’clock position. Ex. 1004, 15. The following thus depicts Mann’s proposed entry of the “#” symbol:

[Mann Figs. 3 and 4, combined and annotated to show the touch-based clock face entry]

Id. at Figs. 3, 4 (combined and annotated to show effect of touch-based clock face entry). That a user must target what would appear to cover less than half the available watch face real estate means those gestures will often be off target and will often touch nearby portions of the watch face, including user-facing camera 350. Accordingly, as I stated in my initial declaration, such inadvertent touches will “ultimately result in a loss of fidelity over time due to grease and grime from the user’s finger (or at least require regular cleanings to avoid such fidelity loss).” Ex. 1003, ¶ 48.
9. Mr. Occhiogrosso’s testimony raises a related problem with Mann’s native touch-based finger strokes. He stresses that only a small area of an already small watch face is designated for touch-based finger strokes. Given a confined space that cannot exceed 0.7 inches on the diagonal as described above, a POSITA would understand that performing such touch-based gestures with sufficient precision would require an elevated level of focus from the user. The user would not be able to casually swipe her finger in a general direction across the watch face, but must instead start and stop that finger swipe in precise locations within a very small portion of the watch face. Such movements would not be “very brief” and “of little concern” as Mr. Occhiogrosso opines. Ex. 2002, ¶ 53. Rather, this motivates the proposed combination. With Numazaki’s no-touch gestures, the user’s finger can simply be swiped above the watch face in a particular direction. No starting target or stopping target is required for the gesture to be correctly identified, which means much less focus is demanded of the user in contrast to the deliberate, space-confined entry I described above. Given Mann’s goal of drawing minimal attention to the fact that the user is initiating a recording, simplifying the gesture process by allowing the user to casually swipe a finger with significantly less precision (and focus) than Mann’s native process would be a significant improvement.
10. I understand Patent Owner and its expert also argue that it would not have been obvious to combine Numazaki’s gesture recognition technology with Mann’s PDA or wristwatch devices because Numazaki’s gesture-recognition hardware creates a reflected light image that is “not a regular image that would be displayed to a person, either as a single still image or within a video (e.g., documentary)” and “the average person would be confused upon viewing the ‘reflected light image.’” Paper 12, 18-19; Ex. 2002, ¶ 60. I disagree.
11. Numazaki teaches a system that captures images of an illuminated object in order to detect gestures performed by that object. As I stated in my initial declaration at ¶¶ 39-40, Numazaki uses its controlled lighting and two-camera arrangement to illuminate the target object (e.g., the user’s hand) in a controlled manner such that a precise image of the user’s hand and hand movement can be ascertained. Ex. 1005, 11:9-23. A timing control unit turns lighting unit 101 on to illuminate a target object while Numazaki’s first camera unit 109 is active, then off when the second camera unit 110 is active. Id. at 11:20-32. Using this lighting control, the first camera captures an image of a target object illuminated by both natural light and directed light from lighting unit 101, while the second camera captures an image of the target object illuminated by only natural light. Id. at 11:33-39. The difference calculation unit 111 extracts and outputs a “reflected light image” created by subtracting the second camera’s captured natural light-only image information from the first camera’s captured aggregate of natural and reflected light image information. Id. at 11:43-51.
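For illustration only, the following is a minimal software sketch of that subtraction principle. It assumes hypothetical 8-bit grayscale frames and a software subtraction standing in for Numazaki’s difference calculation unit 111; it is not a depiction of Numazaki’s actual hardware implementation.

```python
# Illustrative sketch only (not Numazaki's actual hardware): a hypothetical
# software analogue of the difference calculation, assuming 8-bit grayscale
# frames from the two camera units described above.
import numpy as np

def reflected_light_image(lit_frame: np.ndarray, ambient_frame: np.ndarray) -> np.ndarray:
    """Subtract the ambient-only frame (second camera unit, lighting off) from
    the lit frame (first camera unit, lighting on). Light that appears in both
    frames cancels, leaving mainly the nearby illuminated object."""
    lit = lit_frame.astype(np.int16)          # widen to avoid unsigned underflow
    ambient = ambient_frame.astype(np.int16)
    diff = np.clip(lit - ambient, 0, 255)     # negative pixels floor at zero
    return diff.astype(np.uint8)

# Made-up example values: external light contributes ~60 everywhere, while the
# nearby hand adds ~120 of reflected light only where it is present.
ambient = np.full((4, 4), 60, dtype=np.uint8)
lit = ambient.copy()
lit[1:3, 1:3] += 120
print(reflected_light_image(lit, ambient))    # nonzero only where the hand reflects light
```

Because the external light contribution appears in both captures, it largely cancels in the difference, which is the property that lets the system isolate the nearby target object.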
12. Numazaki’s two-sensor structure improves upon a single-sensor structure by ensuring that the resulting image reflects only the illuminated target object, excluding extraneous image information and negating environmental light effects, to create a precise image. Uniquely, Numazaki’s lighted two-camera configuration is capable of “extracting a specific target object at high precision easily even under an environment [with] . . . external light fluctuation” in an “optimum state even when a distance with respect to the target object is changing.” Id. at 4:31-40 (emphasis added). This enables Numazaki’s feature data generation unit to use that precise image in a myriad of applications that include, but are not limited to, determining gestures, pointing, etc. of the target object, as I describe in my initial declaration at ¶ 40 (citing Ex. 1005 at 10:57-66).
13. Accordingly, Numazaki’s two-sensor structure improves upon a single-sensor structure by ensuring that the produced image captures the illuminated object while excluding extraneous image information. In the context of Mann’s two-sided recording, this improved fidelity benefits the functionality, contrary to Mr. Occhiogrosso’s testimony.
14. I also understand Patent Owner argues no-touch gestures would be more cumbersome to use in conjunction with Mann’s wristwatch because, according to Mr. Occhiogrosso, no-touch gestures would be cumbersome “if the user-facing camera on the wristwatch has a narrow field of view.” Ex. 2002, ¶ 62. However, Mr. Occhiogrosso never explains why a POSITA would use such a narrow field of view. In fact, a POSITA would not have implemented the user-facing camera with so narrow a field of view that slight wrist motions make the gesture detection ineffective. Instead, the field of view would have been configured such that a user could inconspicuously swipe a finger across the screen in a predetermined direction. So long as the finger passes over the face of the device, the gesture is recognized. A POSITA would understand this is far simpler and less conspicuous than Mann’s native touch-based gestures, which require high precision and significant focus from the user as I explained above.
II. A POSITA would have been motivated to combine Mann and Numazaki with Amir
15. I understand Patent Owner and its expert argue “Amir’s pupil-detection functionality is unnecessary and redundant,” suggesting that Mann’s native video-based functionality is sufficient because a recipient can simply search the received video for a frame in which the subject’s eyes are open. Paper 12, 32-33; Ex. 2002, ¶ 83. I disagree. This argument ignores the benefits of communicating still images rather than video. As I explained in my initial declaration at ¶ 59, Mann expressly contemplates taking pictures and sending those to remote locations. I explained that, given its investigative journalism focus, Mann’s system would benefit from functionality that ensures captured images are of high quality and capture the subject’s facial features. The proposed combination with Amir does just this, ensuring that any captured and subsequently transmitted image captures important facial features of the subject, including the subject’s open eyes.
16. A captured image requires far less bandwidth to transmit than video. Amir’s functionality ensures that whatever image is ultimately captured and transmitted will be useful for subject identification, at least because it ensures the subject’s eyes are open. Further, sending an isolated image requires far less work on the receiving end than the alternative Mr. Occhiogrosso proposes—transmitting high-bandwidth video and forcing the recipient to search, frame by frame, for an image where the subject’s eyes are open.
III. A POSITA would have been motivated to combine Mann and Numazaki with Aviv
17. I understand Patent Owner and its expert argue Aviv teaches away from a combination with Mann because the “scenarios contemplated by Aviv are very different from those disclosed by Mann.” Paper 12, 34-36; Ex. 2002, ¶¶ 85-86. Mr. Occhiogrosso also argues Aviv teaches away from Mann because Aviv intends to mount the camera high so as to “minimize occlusion between the [] camera and the movement of individuals.” Ex. 1007, 5:55-58; Ex. 2002, ¶ 85. I disagree.
18. In my original declaration, I explained that, given Mann’s focus on surreptitious monitoring, a POSITA would have recognized the applicability and benefit of Aviv’s functionality that automatically detects criminal conduct and triggers image capture. Ex. 1003, ¶¶ 65-67. A POSITA would have also recognized that Aviv’s goal to mitigate occlusion aligns with, rather than works against, Mann’s device. Indeed, Mann expressly recommends a user use the device in a manner that avoids drawing the attention of a subject. Ex. 1004, Abstract, 13.
IV. Conclusion
19. I declare that all statements made herein of my knowledge are true, and that all statements made on information and belief are believed to be true, and that these statements were made with the knowledge that willful false statements and the like so made are punishable by fine or imprisonment, or both, under Section 1001 of Title 18 of the United States Code.
Date:

By: _______________________________
Dr. Benjamin B. Bederson
