UNITED STATES PATENT AND TRADEMARK OFFICE
___________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
___________________


APPLE INC.,
Petitioner,

v.

IMMERSION CORPORATION,
Patent Owner.
___________________

Case IPR2016-01372
Patent No. 8,659,571
___________________


DECLARATION OF YON VISELL, PH.D.
IN SUPPORT OF IMMERSION CORPORATION’S
PATENT OWNER PRELIMINARY RESPONSE


Immersion Ex 2001-1
Apple v Immersion
IPR2016-01372

1. I, Yon Visell, declare as follows:

I. INTRODUCTION

2. My name is Yon Visell. I am an Assistant Professor in the Department of Electrical and Computer Engineering and Media Arts and Technology Graduate Program at the University of California, Santa Barbara (“UCSB”).

3. I have been engaged by Immersion Corporation (“Immersion”) as an expert in connection with matters raised in the Petition for Inter Partes Review (“Petition”) of U.S. Patent No. 8,659,571 (the “’571 patent”) filed by Apple Inc. (“Apple” or “Petitioner”).

4. This declaration is based on the information currently available to me. To the extent that additional information becomes available, I reserve the right to continue my investigation and study, which may include a review of documents and information that may be produced, as well as testimony from depositions that have not yet been taken.
II. SUMMARY OF OPINIONS

5. The ’571 patent is entitled “Interactivity Model for Shared Feedback on Mobile Devices.” The ’571 patent is directed to a novel way of producing haptic effects in electronic devices. The fundamental insight that is described and claimed in the ’571 patent is that the user’s gesture interactions with the device need to be tracked and analyzed in order to properly synchronize haptic feedback with a user’s input. Reflecting this focus, the claims specify that both a first and a second gesture signal (each based on a user’s gestural inputs) are used to generate something called a “dynamic interaction parameter.” The petition challenges claims 1-7, 12-18, and 23-29 of the ’571 patent.
6. The petition raises three grounds, each based on obviousness under pre-AIA 35 U.S.C. § 103(a). Ground 1 argues that claims 1-7, 12-18, and 23-29 of the ’571 patent are obvious in light of U.S. Patent Application Publication No. US 2010/0156818 (“Burrough”), Ex. 1005. Based on studying the petition and the exhibits cited in the petition as well as other documents, it is my opinion that claims 1-7, 12-18, and 23-29 are not rendered obvious by Burrough.
7. Ground 2 argues that claims 1, 2, 4-6, 12, 13, 15-18, 23, 24, and 26-29 are obvious in light of U.S. Patent No. 5,734,373 (“Rosenberg ’373”), Ex. 1004. Based on studying the petition and the exhibits cited in the petition as well as other documents, it is my opinion that claims 1, 2, 4-6, 12, 13, 15-18, 23, 24, and 26-29 are not rendered obvious by Rosenberg ’373.
8. Finally, Ground 3 argues that claims 3, 14, and 25, which concern an “on-screen signal,” are obvious under the combination of Rosenberg ’373 and U.S. Patent No. 6,429,846 (“Rosenberg ’846”), Ex. 1006. Based on studying the petition and the exhibits cited in the petition as well as other documents, it is my opinion that claims 3, 14, and 25 are not rendered obvious by the combination of Rosenberg ’373 and Rosenberg ’846.
III. QUALIFICATIONS AND EXPERIENCE

9. I obtained my Ph.D. degree in Electrical and Computer Engineering from McGill University in 2011. Before that, I received my MA in Physics from the University of Texas at Austin in 1999, and my BA in Physics from Wesleyan University in 1995.

10. Since 2015, I have worked as an Assistant Professor at UCSB. From 2013 to 2015, I worked as an Assistant Professor in the Department of Electrical and Computer Engineering at Drexel University.

11. At UCSB, I lead the RE Touch Lab as its Director and Principal Investigator. The RE Touch Lab includes six Ph.D. students and numerous affiliated researchers and undergraduate students. Some of the topics that my teams at the RE Touch Lab have explored include computational perception, such as how the mechanical signatures of contact elicit conscious perception of touch, and the creation of novel haptic devices for simulating the feel of touched objects.

12. My personal research focuses on haptic engineering, robotics, and the mechanics and neuroscience of touch. My work is motivated by creative applications in haptic human-computer interaction, sensorimotor augmentation, and interaction in virtual reality.
13. In addition to my research at the RE Touch Lab, I also teach classes, including linear and nonlinear control systems, haptics, human-computer interaction, interactive arts, artificial intelligence, and robotics.
14. I am the author of over 60 articles in journals and conference proceedings. I hold one issued patent, U.S. Patent No. 9,041,521 (“Floor-Based Haptic Communication System”), and one pending patent application (“Stretchable Tactile Sensing Array”), both pertaining to haptic technology. I am the editor of two books on virtual reality, including Human Walking in Virtual Reality. I have received several awards and honors, including the Google Faculty Research Award in 2016, and several best paper awards at haptics symposia. I have chaired and edited several conferences and symposia.

15. I also have experience working in industry. Before receiving my Ph.D., I worked for several years as the principal DSP developer for audio at Ableton, a renowned music software company. Before that, I worked for several years as a Research Scientist investigating speech recognition at Loquendo Inc., which is now part of Nuance.

16. My curriculum vitae is attached as Exhibit 2007.
17. I am being compensated by Immersion for my time spent in developing this declaration at a rate of $400 per hour, and for any time spent testifying in connection with this declaration at a rate of $500 per hour. My compensation is not contingent upon the substance of my opinions, the content of this declaration or any testimony I may provide, or the outcome of the inter partes review or any other proceeding.

18. I have no financial interest in Immersion, and have financial interests of less than $3000 in Apple through long-term mutual fund investments representing less than 1% of my portfolio.
19. My opinions expressed in this declaration are based on the petition and exhibits cited in the petition, and other documents and materials identified in this declaration, including the ’571 patent and its prosecution history, the prior art references and materials discussed in this declaration, and any other references specifically identified in this declaration.

20. I am aware of information generally available to, and relied upon by, persons of ordinary skill in the art at the relevant times, including technical dictionaries and technical reference materials (including, for example, textbooks, manuals, technical papers, articles, and relevant technical standards).

21. I reserve the right to supplement my opinions to address any information obtained, or positions taken, based on any new information that comes to light throughout this proceeding.
IV. LEVEL OF ORDINARY SKILL IN THE ART

22. It is my understanding that the ’571 patent should be interpreted based on how it would be read by a person of ordinary skill in the art at the time of the effective filing date of the application. It is my understanding that factors such as the education level of those working in the field, the sophistication of the technology, the types of problems encountered in the art, the prior art solutions to those problems, and the speed at which innovations are made may help establish the level of skill in the art.
23. I am familiar with the technology at issue and the state of the art at the earliest priority date of the ’571 patent.

24. It is my opinion, based upon a review of the ’571 patent, its file history, and my knowledge of the field of the art, that a person of ordinary skill in the art for the field of the ’571 patent would have at least: (1) a Bachelor of Science degree in an engineering discipline such as Mechanical Engineering or Computer Science, or (2) at least two years’ experience working with human-machine interface systems, graphical user interfaces, haptic feedback systems, robotics, biomechanics, or mobile devices or equivalent embedded systems. This level of skill is commensurate with the interdisciplinary nature of the ’571 patent, which combines knowledge of computer software and user interface design with knowledge of electrical and/or mechanical systems for producing haptic effects.
25. I have considered the issues discussed in the remainder of this declaration from the perspective of a person of ordinary skill in the art. Although I use this perspective, I do not believe that any of my opinions would change if a slightly higher or lower level of skill were assumed. For example, I understand that Apple contends that a person of ordinary skill in the art would have both a Bachelor’s degree and two to three years of professional experience. Petition at 6. My opinions concerning the validity of the ’571 patent would not change under Apple’s proposed level of ordinary skill in the art. Similarly, it is my opinion that Immersion’s proposed claim constructions should be adopted under either Immersion’s or Apple’s proposed level of ordinary skill in the art.
V. LEGAL PRINCIPLES

A. Claim Construction

26. I am not a patent attorney, and my opinions are limited to what I believe a person of ordinary skill in the art would have understood, based on the patent documents. I use the principles below, however, as a guide in formulating my opinions.

27. My understanding is that a primary step in determining the validity of patent claims is to properly construe the claims to determine claim scope and meaning.
28. In an inter partes review proceeding, as I understand from Immersion counsel, claims are to be given their broadest reasonable construction (“BRC”) in light of the patent’s specification. 37 C.F.R. § 42.100(b). In other forums, such as the federal courts, different standards of proof and claim interpretation control, which are not applied by the patent office in inter partes review. Accordingly, I reserve the right to argue for a different interpretation or construction of the challenged claims in other proceedings, as appropriate.

29. It is my understanding that in determining whether a patent claim is anticipated or obvious in view of the prior art, the patent office must construe the claim by giving the claim its broadest reasonable construction consistent with the specification. For the purposes of this review, I have construed each claim term in accordance with its plain and ordinary meaning under the required broadest reasonable construction.
B. Anticipation

30. It is my understanding that a claim is anticipated under 35 U.S.C. § 102 if each and every element and limitation of the claim is found either expressly or inherently in a single prior art reference. I understand that anticipation is a question of fact. I further understand that the requirement of strict identity between the claim and the reference is not met if a single element or limitation required by the claim is missing from the applied reference.
C. Obviousness

31. It is my understanding that a claim is unpatentable under 35 U.S.C. § 103 if the claimed subject matter as a whole would have been obvious to a person of ordinary skill in the art at the time of the alleged invention. I understand that the determination of obviousness is made with respect to the subject matter as a whole, not separate pieces of the claim. I understand that obviousness is a question of law based on underlying factual issues. I also understand that an obviousness analysis takes into account the scope and content of the prior art, the differences between the claimed subject matter and the prior art, the level of ordinary skill in the art at the time of the invention, and the existence of secondary considerations such as commercial success or long-felt but unresolved needs.
VI. THE ’571 PATENT

32. I have read and reviewed the ’571 patent and have an understanding of its background as well as its particular improvements over the prior art. I understand that the ’571 patent is entitled “Interactivity Model for Shared Feedback on Mobile Devices.” In my opinion, the ’571 patent is directed to a novel way of producing haptic effects in electronic devices. A person of ordinary skill in the art would recognize that the fundamental insight that is described and claimed in the ’571 patent is that the user’s gestural interactions with the device need to be tracked and analyzed in order to properly synchronize haptic feedback with a user’s input. Reflecting this focus, the claims specify that both a first and a second gesture signal (each based on a user’s gestural inputs) are used to generate something called a “dynamic interaction parameter.” Ex. 1001 at claim 1 (“receiving a first gesture signal; receiving a second gesture signal; generating a dynamic interaction parameter using the first gesture signal and the second gesture signal”). I understand that given the format of the claims, a single gesture signal is insufficient to form the dynamic interaction parameter.
33. In my opinion, the dynamic interaction parameter is meant to accurately and responsively track the user’s behavior. As such, a person of ordinary skill in the art would understand that the dynamic interaction parameter changes or reacts in real time to the user’s interactions, and is used to alter the haptic effects produced by the device. This allows the device to provide responsive haptic feedback to the user. Ex. 1001 at 1:29-33 (“[V]ibrotactile haptic effects . . . may be useful in providing cues to users of electronic devices to alert the user to specific events, or provide realistic feedback to create greater sensory immersion within a simulated or virtual environment.”). A person of ordinary skill in the art would recognize that the approach of the ’571 patent is an improvement over the prior art because the ’571 patent’s techniques can improve the timing and/or nature of haptic feedback: “[B]ecause these user gestures and system animations have variable timing, the correlation to haptic feedback [in the prior art] may be static and inconsistent and therefore less compelling to the user.” Id. at 1:49-56.
34. Other ingredients may be used in addition to a first gesture signal and a second gesture signal to generate the dynamic interaction parameter. For example, additional device sensor signals may be used. Id. at claim 7. A person of ordinary skill in the art would recognize that using these additional ingredients is another improvement over the prior art. E.g., id. at 1:56-60 (“Further, device sensor information is typically not used in combination with gestures to produce haptic feedback.”). The various ingredients may be combined and processed in several different ways to generate the dynamic interaction parameter. See, e.g., id. at Table 2 (listing 14 different example “methods of synthesis” that may be employed). In my opinion, the dependent claims of the ’571 patent show that the generation of the dynamic interaction parameter using both a first gesture signal and a second gesture signal, including the selection and processing of the ingredients, is the inventive focus. A person of ordinary skill in the art reading the patent would understand that the claims require specific ingredients in specific numbers to be used to generate the dynamic interaction parameter. E.g., id. at claim 7 (“receiving a first device sensor signal; receiving a second device sensor signal; and wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and the first device sensor signal and the second device sensor signal”).
35. Once the dynamic interaction parameter has been generated using a first gesture signal, a second gesture signal, and potentially other ingredients, it is used to modify the haptic output of the system. Id. at 15:8-9 (“At 1313, a drive signal is applied to a haptic actuator according to the interaction parameter.”); see also claim 1 (“applying a drive signal to a haptic output device according to the dynamic interaction parameter”). For example, in one embodiment, a user may scroll between different film frames on an electronic device with a touchscreen, and may receive haptic feedback for that interaction. Id. at 13:56-61 (“By using gestures or device sensor data, a user may scroll the filmstrip from left to right or right to left, and the filmstrip application may then dynamically provide a haptic effect for a first photograph 1101 which is different from a haptic effect for a second photograph 1103 based upon the gestures or device sensor data.”).
VII. CLAIM CONSTRUCTION

36. I have reviewed Apple’s and Immersion’s proposed claim constructions and evaluated them from the perspective of a person of ordinary skill in the art at the time of the invention. It is my opinion that Immersion’s proposed claim constructions best reflect the broadest reasonable construction of the relevant terms, as would be understood by a person of ordinary skill in the art.
A. “gesture signal” (claims 1-7, 12-18, 23-29)

37. In my opinion, the broadest reasonable construction of the term “gesture signal” in view of the relevant evidence is “an electronic signal, representing a recognized movement of the body that conveys meaning or user intent.” It is also my opinion that all relevant claim construction factors, including the plain meaning of the term, the specification, the prosecution history, and the extrinsic evidence, support this construction.

38. A person of ordinary skill in the art would draw a distinction between the claimed “gesture signal” and the claimed “device sensor signal.” Further, a person of ordinary skill in the art would understand that the inclusion of the word “gesture” within “gesture signal” indicates that the gesture signal is associated with a user gesture.
39. A person of ordinary skill in the art would consult the specification for guidance, and would utilize the disclosures in column 3 of the ’571 patent to craft the appropriate construction. See Ex. 1001 at 3:34-35 (“A gesture is any movement of the body that conveys meaning or user intent.”); 3:56-59 (“A gesture can also be any form of hand movement recognized by a device having an accelerometer, gyroscope, or other motion sensor, and converted to electronic signals.”). It is my opinion that these two quotations from the specification effectively set forth a definition of “gesture signal,” which is appropriately captured by Immersion’s proposed construction, and not by Apple’s proposed construction.

40. In my opinion, the specification consistently refers to movements of the body as the genesis of a gesture signal, which agrees with the plain and ordinary meaning of “gesture”:

    It will be recognized that simple gestures may be combined to form more complex gestures. For example, bringing a finger into contact with a touch sensitive surface may be referred to as a “finger on” gesture, while removing a finger from a touch sensitive surface may be referred to as a separate “finger off” gesture. If the time between the “finger on” and “finger off” gestures is relatively short, the combined gesture may be referred to as “tapping”; if the time between the “finger on” and “finger off” gestures is relatively long, the combined gesture may be referred to as “swiping”; if the distance between the two-dimensional (x,y) positions of the “finger on” and “finger off” gestures is relatively small, the combined gesture may be referred to as “smearing”, “smudging” or “flicking”.

Id. at 3:36-52.
41. It is also my opinion that there are movements of the body that are not full gestures and could not comprise an entire gesture signal. A person of ordinary skill in the art would recognize that the ability of a particular system to parse gestures would depend on the implementation of that system. For example, in one embodiment disclosed in the specification, swiping a finger across a touch screen does produce multiple position signals (i.e., “device sensor signals”) from the touchscreen hardware at different times, but a person of ordinary skill in the art would understand that those position signals are together the constituents of a single “swipe” gesture:

    FIG. 9B shows a screen view of a user gesture using a single index finger being swiped across the touch sensitive display from right to left in order to display the next photograph. Multiple inputs from the index finger are received from the single gesture. Each of the multiple inputs may occur at a different time and may indicate a different two dimensional position of the contact point of the index finger with the touch sensitive display.

Id. at 10:36-43; see also id. at 14:18-22 (a single “scrolling gesture” is detected based on moving a finger across the touchscreen over time). Therefore, it is my opinion that recording the position of a contact of a user’s finger at one point in time is not sufficient in this embodiment to recognize a full gesture. A person of ordinary skill in the art would not think that position data, standing alone, has enough information to “convey meaning or user intent.” Id. at 3:35-36.
42. It is further my opinion that the number and type of gesture signals that are present in a system varies based on how the system is implemented. The same movements of the body may result in one, two, or more gesture signals, depending on the system. E.g., Ex. 1001 at 3:37-52 (explaining that “finger on” and “finger off” gestures could be interpreted as a single, more complex gesture such as “tapping,” “long tapping,” “swiping,” “smearing,” “smudging,” or “flicking”); 14:18-22 (describing a single “scrolling gesture” that implicitly involves subsidiary “finger on,” “finger off,” and “swipe” actions).
43. A person of ordinary skill in the art would identify the common element among these embodiments as the fact that the gestures that are recognized by a given system and encapsulated in the claimed “gesture signals” depend on how the system is programmed and architected. For example, a person of ordinary skill in the art would realize that the specification discloses that it is possible to develop a system that recognizes separate “finger on” and “finger off” gestures and has no concept of “compound” gestures. Cf. id. at 3:34-62. Similarly, a person of ordinary skill in the art would recognize from this that a different system could use the same hardware as the first, could be manipulated with the same user hand movements, and could process the same underlying sensor signals from the touchscreen, but instead may be configured to recognize only a “swiping” gesture. Id. at 3:46-49. In sum, it is my opinion that if the software in this different system is not equipped to track and package the underlying sensor signals into distinct “finger on” and “finger off” gestures, then the system does not detect a “gesture signal” for those gestures. In my opinion, Immersion’s proposed construction preserves this aspect of the claimed “gesture signals,” while Apple’s construction does not.
44. It is also my opinion that this distinction between simple position information and a gesture signal was made by the applicant during prosecution. See Ex. 2003 at 9 (August 2, 2012 Applicant Remarks in prosecution of U.S. Patent No. 8,279,193, a prior patent in the same family as the ’571 patent) (“[Prior art reference raised by the examiner] Marvit describes gestures in the context of motion sensor engagement for a handheld device. For example, input movement may be in the form of translation and/or gestures. Translation-based input focuses on a beginning point and endpoint of a motion and difference between such beginning points and endpoints.”). A person of ordinary skill in the art reading this section of the prosecution history would understand that the applicant’s comments, in addition to the disclosure of the Marvit reference, show that a translation-based input is not necessarily a gesture.
45. It is also my opinion that Immersion’s interpretation is consistent with the use of the term “gesture” in the art of human-computer interaction. In this field, with which I am very familiar, a “gesture” represents a movement of the body that expresses a full interaction with or command to a device—not merely a partial or inchoate interaction. See Ex. 2002 at 1 (http://www.dictionary.com/browse/gesture?s=t) (“gesture . . . 4. Digital Technology. a particular movement of the body, typically the fingers or hand, used to control or interact with a digital device (often used attributively): a gesture command; Use a two-finger pinching gesture on your touchscreen to zoom in or out.”). The design of real products follows this definition. For example, electronic devices typically include haptic feedback that is delivered once a gesture has been recognized. Providing feedback based on intermediate changes in position or pressure is certainly possible, but a person of ordinary skill in the art would not automatically characterize that as based on the recognition or processing of a “gesture signal”—it would depend on how the particular system treated those intermediate changes.
46. In my opinion, Apple’s construction of “gesture signal” (“a signal indicating user interaction with a user interface device”) is inappropriate and casts too wide a net. First, Apple’s construction is not limited to a movement of the hand, or even of the body. For example, a user could “interact” with a “user interface device” via electrodes attached to the scalp that measure voltage fluctuations resulting from ionic current within the neurons of the brain (i.e., using electroencephalography (EEG) to measure thoughts). Ex. 1001 at 11:13, 11:29. But a person of ordinary skill in the art would not think that the ’571 patent contemplates referring to that as a “gesture.” A person of ordinary skill in the art would instead turn to the section of the specification that clarifies that a gesture is a “movement of the body.” Ex. 1001 at 3:35-36. In my opinion, Apple’s proposed construction goes beyond the specification to encompass any user interaction, including interactions that do not involve hand or body movement. A person of ordinary skill in the art would not consider these interactions to be gestures, nor would they consider signals based on those interactions to be “gesture signals.”
47. It is further my opinion that Apple’s proposed construction does not require that the gesture signal correspond to a particular recognized gesture—i.e., to an action that demonstrates “meaning or user intent.” Ex. 1001 at 3:35-36 (“A gesture is any movement of the body that conveys meaning or user intent.”). A person of ordinary skill in the art would understand that not every movement of the body has an ascertainable meaning. Therefore, monitoring movement over time may be necessary to determine the user’s intent. In my opinion, Apple’s construction does not preserve this important aspect of the term “gesture signal.”
48. It is additionally my opinion that Apple’s proposed construction conflates a “device sensor signal,” which is separately described and claimed (e.g., id. at claim 7), with a “gesture signal.” A person of ordinary skill in the art would recognize that there are many user interactions that could result in a “device sensor signal” that would also qualify as a “gesture signal” under Apple’s construction, but which have no direct relationship to a purposeful, gestural movement of the body. See, e.g., id. at 11:11-14 (device sensor signals that are based on some form of user interaction include, in addition to EEG signals, “any type of bio monitor such as skin or body temperature, blood pressure (BP), heart rate monitor (HRM), . . . or galvanic skin response (GSR)”). Further, in the case where a device sensor signal is used in the recognition of a gesture, a person of ordinary skill in the art would recognize that the specification draws a distinction between that signal and the gesture itself. Ex. 1001 at 3:16-18 (“A device sensor signal may be generated by any means, and typically may be generated by capturing a user gesture with a device.”).
49. Moreover, it is my opinion that Apple’s construction does not respect the claim requirements that both a “first gesture signal” and a “second gesture signal” be received by the system. Under Apple’s construction, several underlying sensor signals that corresponded to only a single user gesture could be interpreted as the “first gesture signal” and the “second gesture signal.” In my opinion, one inventive focus of the ’571 patent was the recognition and processing of two separate underlying gestures, which is reflected in the claim requirements for a “first gesture signal” and “second gesture signal.”
50. In sum, it is clear to me that the practical effect of Apple’s proposed construction is to replace “gesture signal” with “interaction signal” in the claim language. But a person of ordinary skill in the art would understand