UNITED STATES PATENT AND TRADEMARK OFFICE
________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
________________

APPLE, INC.
Petitioner,

v.

UNIVERSAL SECURE REGISTRY LLC
Patent Owner
________________

Case CBM2018-00024
U.S. Patent No. 8,577,813
________________

PATENT OWNER’S EXHIBIT 2004
DECLARATION OF ALLAN M. SCHIFFMAN IN SUPPORT
OF PATENT OWNER’S PRELIMINARY RESPONSE

USR Exhibit 2004
1. I have been retained on behalf of Universal Secure Registry LLC (“Patent Owner”) in connection with the above-captioned covered business method review (CBM). I have been retained to provide my opinions in support of USR’s Preliminary Response. I am being compensated for my time at the rate of $475 per hour. I have no interest in the outcome of this proceeding.

2. In preparing this declaration, I have reviewed the Petition for Covered Business Method Review, CBM2018-00024, U.S. Patent No. 8,577,813 and its file history, the declaration of Dr. Victor Shoup, and the materials cited herein.

3. The statements made herein are based on my own knowledge and opinion. This Declaration represents only the opinions I have formed to date. I may consider additional documents as they become available or other documents that are necessary to form my opinions. I reserve the right to revise, supplement, or amend my opinions based on new information and on my continuing analysis.
I. QUALIFICATIONS

4. My qualifications can be found in my Curriculum Vitae, which includes my detailed employment background, professional experience, and list of technical publications and patents. Ex. 2005.

5. I am a principal at CommerceNet, a non-profit organization with the mission of supporting innovation, standards-setting, and entrepreneurship in applying the Internet to public benefit. My current work primarily involves advising new companies on product development. I am President of CommerceNet, and I have held that position since 2005.
6. I received a Master of Science degree in Computer Science from Stanford University in 1986.

7. In my early career (through 1991), I held a variety of hardware and software engineering positions, eventually leading to Vice-President of Technology at ParcPlace Systems, a company specializing in software development tools, which ultimately became a successful public company.

8. In early 1991, I co-founded Enterprise Integration Technologies (EIT) and served as Chief Technical Officer. I continued as CTO through mid-1995, when EIT was acquired by Verifone, then (and now) the global market leader in point-of-sale payment systems. At EIT I led projects in secure distributed network computer systems, focusing on the then-novel web. Many of those projects involved the design and development of new network security protocols as well as analysis and application of existing protocols. This work had a common theme of applying cryptography to the problems of identification, authentication, confidentiality, non-repudiability, and privacy in Internet electronic commerce.

9. In 1994, EIT chartered the non-profit organization CommerceNet to create Internet electronic commerce standards, receiving a Federal research award from the Technology Reinvestment Program; I served as Principal Investigator. Our accomplishments included the first secure Internet credit-card transactions and use of XML-based messaging.
10. From 1993 through 1998, I was an active participant in multiple standards bodies beyond CommerceNet, including the World-Wide Web Consortium (W3C) and the Internet Engineering Task Force (IETF).

11. I was a member of the team that designed the Secure Electronic Transactions payment card protocol commissioned by MasterCard and Visa. This protocol included novelties such as dual signatures, use of OAEP, in-band certificate revocation, and employment of ASN.1v3 certificate attributes.

12. In 1995, I founded Terisa Systems and served as CTO and acting CEO, with backing from Verifone, RSA Data Security, Netscape, IBM, and AOL. Terisa’s mission was to standardize Internet communications security protocols and provide implementations of them. Our products included officially commissioned “reference” implementations of the SSL and SET protocols. Terisa was acquired by Spyrus, Inc. in 1997.

13. In 2007, I co-founded Usable Security Systems as CTO to provide a “plug-in” multi-factor authentication system for the web. In the process of product development I investigated and experimented with the integration of hardware for user authentication (such as fingerprint readers) into our product. Usable was acquired by Webroot, Inc. in 2010.
14. In addition to the standardization activities already mentioned, I have served as a member of several security-oriented committees: the W3C Security Advisory Board, Netscape’s Security Advisory Board, and the National Research Council Committee on Information Systems Trustworthiness.

15. I have authored six peer-reviewed papers, I am a named inventor on two US patents, and I am a co-author of two IETF RFCs.

16. I have over twenty-five years of research and development experience in the areas of secure distributed systems, secure network protocols, application of cryptography/encryption, authentication, electronic commerce transactions, and online payment systems.
II. LEGAL UNDERSTANDING

A. The Person of Ordinary Skill in the Art

17. I understand that a person of ordinary skill in the relevant art (also referred to herein as “POSITA”) is presumed to be aware of all pertinent art, thinks along the lines of conventional wisdom in the art, and is a person of ordinary creativity, not an automaton.

18. I understand that it is Patent Owner’s position that a person of ordinary skill in the art (“POSITA”) relevant to the ’813 patent at the time of the invention would have a Bachelor of Science degree in electrical engineering, computer science, or computer engineering, and three years of work or research experience in the fields of secure transactions and encryption; or a Master’s degree in electrical engineering, computer science, or computer engineering, and two years of work or research experience in related fields.
19. I have reviewed the declaration of Dr. Victor Shoup, including his opinions regarding the POSITA. Ex. 1202 at ¶ 38. I understand that it is Petitioner’s position that a POSITA relevant to the ’813 patent at the time of the invention would have a Bachelor of Science degree in electrical engineering, computer science, or a related scientific field, and approximately two years of work experience in the computer science field, including, for example, operating systems, database management, encryption, security algorithms, and secure transaction systems, though additional education can substitute for less work experience and vice versa.

20. The opinions set forth in this Declaration would be the same under either Patent Owner’s or Petitioner’s proposal.
21. I was a person of at least ordinary skill in the art as of the priority date of the ’813 patent in 2006. Even where I do not explicitly state that my statements below are based on this timeframe, all of my statements are to be understood as reflecting what a POSITA would have understood as of the priority date of the ’813 patent.
B. Legal Principles

22. I am not a lawyer and will not provide any legal opinions. Though I am not a lawyer, I have been advised that certain legal standards are to be applied by technical experts in forming opinions regarding the meaning and validity of patent claims.
1. Obviousness

23. I understand that to obtain a patent, a claimed invention must have, as of the priority date, been nonobvious in view of prior art in the field. I understand that an invention is obvious when the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art.

24. I understand that to prove that prior art, or a combination of prior art, renders a patent obvious, it is necessary to: (1) identify the particular references that, singly or in combination, make the patent obvious; (2) specifically identify which elements of the patent claim appear in each of the asserted references; and (3) explain how the prior art references could have been combined to create the inventions claimed in the asserted claim.
25. I understand that a patent composed of several elements is not proved obvious merely by demonstrating that each of its elements was, independently, known in the prior art, and that obviousness cannot be based on the hindsight combination of components selectively culled from the prior art to fit the parameters of the patented invention.

26. I understand that a proposed modification to a reference is improper if such a modification would render the reference inoperable for its intended purpose.
2. My Understanding of Claim Construction Law

27. I understand that in this CBM review the claims must be given their broadest reasonable interpretation, but that interpretation must be consistent with the patent specification. In this Declaration, I have used the broadest reasonable interpretation (“BRI”) standard when interpreting the claim terms.

28. I understand Petitioner identifies five terms that purportedly require construction. Pet. at 19-24. The constructions of these terms do not impact my opinions in this declaration.
III. OVERVIEW OF JAKOBSSON (EXHIBIT 1214)

29. Jakobsson discloses a user device (the authentication device) that uses a “combination function 230” to generate authentication codes from inputs such as a “Secret (K),” a “Dynamic Value (T),” an “Event State (E),” and “User Data (P),” as shown below:

[Jakobsson Fig. 2: combination function 230 combining the inputs (K), (T), (E), and (P) to generate an authentication code 290]

Jakobsson at Fig. 2. The authentication code that is generated by the authentication device (called AD) is transmitted to a back-end (called the “verifier”). Id. ¶ 118.
30. On the back-end, the verifier uses the combination function and the inputs it has stored in its system to generate an authentication code (Av). Id. ¶ 117 (“[T]he authentication code generated by the authentication device 120 is represented by (Ad) in order to distinguish it from the authentication code generated by the verifier…which will be referred to as (Av).”), ¶ 60 (“Referring to FIG. 2, in one embodiment of the user authentication device 120 and verifier 105 of FIG. 1, various values are combined by a combination function 230 to generate an authentication code 290.”). The verifier compares the value of the authentication code it generates (A1V) “with the value of the received authentication code (AD) generated by the authentication device 120.” Id. ¶ 118. “The authentication code (AD) is accepted by the verifier if (AD) is equal to (A1V).” Id.
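The accept/reject mechanism described above can be sketched as follows. This is a minimal illustration, not Jakobsson’s actual implementation: HMAC-SHA256 is used as a hypothetical stand-in for combination function 230, and all keys and values are invented for the example. It shows that when every input is exactly determinable by the verifier, the verifier’s recomputed code matches the device’s.

```python
import hashlib
import hmac

def combination_function(K: bytes, T: int, E: int, P: bytes) -> str:
    """Stand-in for Jakobsson's combination function 230: combines the
    secret (K), dynamic value (T), event state (E), and user data (P)
    into a short authentication code. HMAC-SHA256 is an illustrative
    choice; Jakobsson does not prescribe this particular function."""
    msg = T.to_bytes(8, "big") + E.to_bytes(1, "big") + P
    return hmac.new(K, msg, hashlib.sha256).hexdigest()[:8]

# Shared, exactly-reproducible inputs (hypothetical values).
K = b"device-secret"   # secret stored in the device and known to the verifier
T = 28928160           # time-interval number, roughly synchronized
E = 0                  # one-bit event state
P = b"1234"            # precise user data, e.g. a PIN

A_D = combination_function(K, T, E, P)  # computed by the authentication device
A_V = combination_function(K, T, E, P)  # recomputed by the verifier
assert A_D == A_V  # verifier accepts: every input was exactly determinable
```

Because every input has a single precise value on both sides, the comparison succeeds; the sections that follow turn on what happens when one input does not.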
VI. THERE IS NO MOTIVATION TO COMBINE BECAUSE THE COMBINED SYSTEM WOULD BE INOPERABLE

31. I understand that each of the independent claims of the ’813 patent requires generating a specific type of encrypted authentication information. For example, claim limitation 1[f] of the ’813 patent recites:

    the processor also being programmed such that once the electronic ID device is activated the processor is configured to generate a non-predictable value and to generate encrypted authentication information from the non-predictable value, information associated with at least a portion of the biometric input, and the secret information.

Ex. 1001 at 51:65-52:29. Claim limitations 16[e] and 24[b] include similar limitations. Id. at 53:25-48, 54:24-46.
32. I understand that Petitioner argues that limitations 1[f], 16[e], and 24[b] are “obvious in view of Maes and Jakobsson.” I disagree. A person of ordinary skill in the art would not be motivated to combine Maes and Jakobsson in the manner Petitioner proposes because the combined system would be inoperable.
33. In my opinion, one skilled in the art would not be able to implement the combination of Maes and Jakobsson in the manner Petitioner proposes. As explained in the prior section, Jakobsson discloses that the back-end system (the verifier) authenticates the user by generating an authentication code A1V and comparing it to the authentication code of the user device (AD) to determine if there is a match. For the verifier to generate an authentication code A1V that matches AD (i.e., for the verifier to be able to authenticate the authentication code of the user device AD), it is essential that the authentication device use inputs that can be exactly determined by the verifier. Unless all inputs that are supplied to the combination function can be determined by the verifier, the verifier will be unable to produce a matching authentication code with the combination function. Put another way, if the authentication device is using inputs that are unknown to, or not synchronized with, the verifier, the verifier will be unable to use its combination function and stored (or computed) values to match the authentication code generated by the authentication device. The differing inputs will result in different authentication codes being produced, as shown below:

[Illustration: the authentication device and the verifier, supplied with differing inputs, produce authentication codes that do not match]
34. I understand that Petitioner contends that it would be obvious to combine Jakobsson’s combination function with the authentication system of Maes, such that the combination would generate “an authentication code A (K, T, E, P) 292.” Pet. at 42-50. I further understand that Petitioner contends that in its combination “user data (P)” can be the PIN and biometric data provided by the user. Pet. at 46 (citing paragraph [0072] of Jakobsson as disclosing that “[t]he user data (P) can be the actual PIN, password, biometric data, etc. that is provided by the user”); id. (citing paragraph [0074] (“[i]t should be understood that there can be more than one item of user data (P), for example, provided by PIN entry and a fingerprint reader.”)). I disagree because the combination would not render an operable device.

35. Jakobsson’s combination function and its inputs are reproduced below:

[Jakobsson Fig. 2: combination function 230 with inputs Secret (K), Dynamic Value (T), Event State (E), and User Data (P)]
36. Jakobsson teaches that the values of “K” and “E” are precise values. Jakobsson ¶ 64 (“The stored secret (K) is a unit of information such as a numerical value,” for example a value “manufactured into and stored inside the device.”), ¶ 83 (“event state (E) is one bit, and there are two secrets…event state of 0 selects one secret … event state of 1 selects the other secret”). Similarly, a PIN is also a string with a precise value (e.g., “1234”). Jakobsson also teaches that “T” is a time value that is “synchronized between the device 120 and the verifier 105.” Jakobsson ¶ 66 (“The dynamic value (T) is uniquely associated with a particular pre-determined time interval demarcated by a particular starting time and ending time….only requirement is that the time interval schedule be roughly synchronized between the device 120 and the verifier 105.”). Each of these items is a precise, reproducible value.
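The rough time synchronization Jakobsson describes in paragraph 66 can be illustrated with a short sketch. The interval length and clock values below are hypothetical; the point is only that mapping clock readings to coarse intervals makes the dynamic value (T), like (K), (E), and a PIN, a precise and reproducible input even when the two clocks differ slightly.

```python
# Illustrative interval length; Jakobsson requires only that the time
# interval schedule be roughly synchronized between device and verifier.
INTERVAL_SECONDS = 60

def dynamic_value(epoch_seconds: int) -> int:
    """Map a clock reading to the dynamic value (T) for its time interval.
    Clocks that disagree by a few seconds usually land in the same
    interval, so only rough synchronization is needed."""
    return epoch_seconds // INTERVAL_SECONDS

# Device and verifier clocks five seconds apart still yield the same (T).
t_device = 1_000_000_000
t_verifier = t_device + 5
assert dynamic_value(t_device) == dynamic_value(t_verifier)
```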
37. But a person of skill in the art would understand that biometric inputs are necessarily approximate (one could say “fuzzy”), such that two sensor readings (samples) are extremely unlikely to produce the same result. Indeed, a person of skill in the art would understand that biometric devices, such as fingerprint sensors, are known to produce different outputs for each sample. For example, if the same user provides two biometric inputs, it is almost certain that the user’s finger will have been placed on the fingerprint sensor at a slightly different orientation each time, and therefore different scanned fingerprints will be generated. Similarly, since fingerprint readers read the valleys and the ridges of the fingerprint, they are very sensitive to changing pressure, causing readings to change, as a user cannot help but apply different finger pressure to the sensor at each reading. The image below, for instance, shows three readings of the same fingerprint, each with a different result:

[Illustration: three readings of the same fingerprint, each producing a different scanned image]
38. Other types of biometrics, such as voice or facial recognition, also produce different outputs per sample for similar reasons (e.g., the voice and facial samples will differ somewhat in each instance).

39. In fact, a major component of using biometrics to recognize a user’s identity is addressing the difficult problem of necessarily approximate (as opposed to exact) matching of noisy, varying sensor sample data against a verifier’s stored parameters (“templates”), and making the engineering trade-offs between false-negative and false-positive recognition. When a biometric input is used to verify a user’s identity, there is no need for the biometric input to be exactly equal to the stored parameters or templates. Instead, a complex computation is used that measures the goodness of fit (i.e., how well the sample matches) between the sample and the stored template, accepting approximate matches if they are within an allowed tolerance.
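The tolerance-based matching described above can be sketched as follows. The feature vectors, similarity measure, and threshold are all hypothetical stand-ins for a real biometric matcher; the sketch only illustrates that a biometric verifier accepts an approximate match within a tolerance, rather than requiring exact equality.

```python
def similarity(sample: list[float], template: list[float]) -> float:
    """Toy goodness-of-fit measure between a fresh biometric sample and a
    stored template (hypothetical feature vectors, not a real matcher):
    1.0 means identical, lower values mean a worse fit."""
    return 1.0 - sum(abs(s - t) for s, t in zip(sample, template)) / len(sample)

# Engineering trade-off: a higher threshold means fewer false positives
# but more false negatives.
TOLERANCE = 0.9

template = [0.20, 0.55, 0.80]  # features enrolled at the verifier
sample   = [0.22, 0.53, 0.79]  # today's noisy sensor reading

assert sample != template                         # exact equality fails...
assert similarity(sample, template) >= TOLERANCE  # ...but fuzzy matching succeeds
```

Note that this kind of matching works only because the verifier compares the raw features directly; it has no analogue once the fuzzy sample is folded into a combination function that demands exact inputs.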
40. Since we cannot expect an exact value for biometric data, a person of skill in the art would not be motivated to use the biometric input “P” for the combination function in the authentication device as Petitioner proposes in its combination of Maes and Jakobsson. Indeed, if “P” is a biometric input, the value of P at Maes’ PDA is extremely unlikely to be exactly the same as any biometric value previously captured and stored by the verifier. For example, in the image below the value of P entered into the combination function of the authentication device varies each time (e.g., because the user cannot avoid placing their finger on the fingerprint sensor at a slightly different orientation each time), such that the authentication code generated by the authentication device will not match or be the one expected by the verifier:

[Illustration: varying biometric values of P produce varying authentication codes at the authentication device]

And given that the verifier will not know the exact value of the biometric, since it is variable, the verifier could not produce an authentication code A1V that matches AD.
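The mismatch described above can be sketched as follows. HMAC-SHA256 stands in for the combination function, and the fingerprint encodings are hypothetical; the sketch illustrates that even a tiny variation in the biometric input (P) yields an entirely different authentication code, which the verifier cannot reproduce.

```python
import hashlib
import hmac

def auth_code(K: bytes, P: bytes) -> str:
    """Stand-in combination function (HMAC-SHA256 is an illustrative
    choice); the (T) and (E) inputs are omitted for brevity."""
    return hmac.new(K, P, hashlib.sha256).hexdigest()[:8]

K = b"device-secret"
P_enrolled = b"ridge-map:10,42,87"  # hypothetical reading stored by the verifier
P_today    = b"ridge-map:11,42,86"  # today's reading, off by a small amount

A_D  = auth_code(K, P_today)     # what the authentication device sends
A_1V = auth_code(K, P_enrolled)  # what the verifier can reproduce
assert A_D != A_1V  # a tiny input difference yields entirely different codes
```

Unlike the tolerance-based biometric matching of a conventional verifier, the combination function gives the verifier no “close enough” comparison to fall back on: the codes either match exactly or authentication fails.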
41. I have reviewed the Petition and the declaration of Dr. Shoup. Neither provides any explanation of how the proposed combination would operate, including how the biometric value could be used to generate an authentication code that can be matched by the verifier.
42. Nor does Jakobsson provide this disclosure. Indeed, Jakobsson does not teach one skilled in the art how a biometric can be used as an input to the combination function. For example, Jakobsson does not provide any disclosure of how a biometric value can be generated that is reproducible and not fuzzy. It does not disclose how the system (e.g., the combination function on the verifier) could account for the fuzziness of the biometric input. In fact, Jakobsson does not include any disclosure of how the verifier would use the combination function to generate a matching authentication code A1V using a biometric input. Nor does Jakobsson explain how it would deal with the fact that biometric devices, such as fingerprint scanners, produce different outputs for each sample. One skilled in the art would not know how to use biometric data as an input to the combination function such that the verifier could predict its exact value.

43. Accordingly, in my opinion, a person of skill in the art would not be motivated to make the combination Petitioner proposes because the combination would lead to an inoperable device. A person of skill in the art would not know how to use a biometric as an input to the combination function, since doing so would lead to authentication codes that could not be matched between the authentication device and the verifier.
VII. CONCLUSION

44. In signing this declaration, I recognize that the declaration will be filed as evidence in a contested case before the Patent Trial and Appeal Board of the United States Patent and Trademark Office. I also recognize that I may be subject to cross-examination in the case and that cross-examination will take place within the United States. If cross-examination is required of me, I will appear for cross-examination within the United States during the time allotted for cross-examination.

45. I hereby declare that all statements made herein of my own knowledge are true and that all statements made on information and belief are believed to be true; and further that these statements were made with the knowledge that willful false statements and the like so made are punishable by fine or imprisonment, or both, under Section 1001 of Title 18 of the United States Code.

Executed: August 29, 2018

Allan M. Schiffman
