UNITED STATES PATENT AND TRADEMARK OFFICE
____________________________________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
____________________________________________

EMC Corporation
Petitioner

v.

ActivIDentity, Inc.
Patent Owner

&

Intellectual Ventures I, LLC,
Exclusive Licensee.

Case IPR2017-00338

DECLARATION OF B. CLIFFORD NEUMAN, PH.D.
REGARDING U.S. PATENT NO. 9,098,685
CLAIMS 1, 3, 5, 7-9, 11, 13, 15, 16, and 19

EMC v. IV
IPR2017-00338
Ex. 1002
TABLE OF CONTENTS

I.     BACKGROUND
II.    LEGAL PRINCIPLES
III.   PERSON OF ORDINARY SKILL IN THE ART
IV.    OVERVIEW OF THE ’685 PATENT
       A.  The Claimed Invention
V.     SUMMARY OF THE PROSECUTION HISTORY
VI.    CLAIM CONSTRUCTION
       A.  “security policy” (claims 1, 9, and 19)
       B.  “authorization method” (claims 1, 9, and 19)
VII.   THE CHALLENGED CLAIMS ARE UNPATENTABLE
       A.  Background
           1.  Overview of Wood
           2.  Overview of the Neuman 1999 IETF Draft
       B.  Ground 1: Claims 1, 3, 5, 7-9, 11, 13, 15, 16, and 19 are anticipated by Wood
           1.  Claim 1
           2.  Claim 9
           3.  Claim 19
           4.  Claims 3 and 11
           5.  Claims 5 and 13
           6.  Claims 7 and 15
           7.  Claims 8 and 16
       C.  Ground 2: Claims 1, 3, 5, 7-9, 11, 13, 15, 16, and 19 are obvious over Wood in view of the Neuman 1999 IETF Draft
           1.  Claims 1, 9, and 19
           2.  Claims 3 and 11
           3.  Claims 5 and 13
           4.  Claims 7 and 15
           5.  Claims 8 and 16
           6.  Motivation to Combine Wood and Neuman
VIII.  AVAILABILITY FOR CROSS-EXAMINATION
IX.    RIGHT TO SUPPLEMENT
X.     JURAT
I declare as follows:

1.    My name is B. Clifford Neuman.

I.    BACKGROUND

2.    I received a Ph.D. in Computer Science in 1992 and an M.S. in Computer Science in 1988 from the University of Washington, and a B.S. in Computer Science and Engineering in 1985 from the Massachusetts Institute of Technology.

3.    Since receiving my doctorate, I have devoted my professional career to the research, design, development, study, and teaching of numerous aspects of computer systems. I have studied, taught, practiced, and researched in the field of computer science for over thirty years.

4.    I am currently an Associate Professor of Computer Science Practice in the Department of Computer Science at the University of Southern California (USC), where I have taught since 1992. I am also the Director of the Center for Computer Systems Security and Associate Director of the Informatics Program at USC and a Research Scientist at USC’s Information Sciences Institute.

5.    I teach and have taught numerous courses at USC, including advanced courses in computer science for upper-level undergraduates and graduate students, on topics such as distributed systems and computer and network security.
6.    As part of my research at USC, I have worked in a number of areas, including research in distributed computer systems with emphasis on scalability and computer security, especially in the areas of authentication, authorization, policy, electronic commerce, and protection of cyber-physical systems and critical infrastructure such as the power grid. I have worked on the design and development of scalable information, security, and computing infrastructure for the Internet. I am also the principal designer of the Kerberos system, an encryption-based authentication system used, among other things, as the primary authentication method for most versions of Microsoft’s Windows, as well as many other systems. I developed systems which used Kerberos as a base for more comprehensive computer security services supporting authorization, accounting, and audit. My research includes managing computer security policies in systems that span multiple organizations and the use of policy as a unifying element for integrating security services, including authorization, audit, and intrusion detection, with systems and applications.

7.    In addition to my academic experience, I have many years of practical experience designing computer security systems. For example, from 1985-1986, I worked on Project Athena at MIT to produce a campus-wide distributed computing environment. I also served as Chief Scientist at CyberSafe Corporation from 1992-2001. I have designed systems for network payment which build upon security infrastructure to provide a secure means to pay for services provided over the Internet. For example, I designed the NetCheque and NetCash systems, which are suitable for micropayments (payments on the order of pennies, where the cost of clearing a credit card payment would be prohibitive). I am also the principal designer of the Prospero system, which is used to organize and retrieve information distributed on the Internet. At one time the Prospero system was embedded in several commercial products, including early Internet services provided by America Online.
8.    I have authored or co-authored over 50 academic publications in the fields of computer science and engineering. In addition, I have been a referee or editor for ACM Transactions on Information and System Security and the International Journal of Electronic Commerce. My curriculum vitae includes a list of publications on which I am a named author.

9.    I am also a member of IEEE, the Association for Computing Machinery (ACM), and the Internet Society (ISOC), among others. I have also served as program and/or general chair of the following conferences: Internet Society Symposium on Network and Distributed System Security, ACM Conference on Computer and Communications Security, and Internet Society Symposium on Network and Distributed System Security.

10.   A copy of my curriculum vitae is attached as Appendix A, which contains further details regarding my experience, education, publications, and other qualifications to render an expert opinion in connection with this proceeding.
11.   I have reviewed the specification, claims, and file history of U.S. Patent No. 9,098,685 (“the ’685 patent,” Ex. 1001), as well as the specification and claims of U.S. Patent No. 7,137,008, to which the ’685 patent claims priority (“the ’008 patent,” Ex. 1031).

12.   I have also reviewed the following references, all of which I understand to be prior art to the ’685 patent:

      • U.S. Patent No. 6,691,232 to Wood et al., filed Aug. 5, 1999 (“Wood,” Ex. 1011).

      • Tatyana Ryutov and Clifford Neuman, “Access Control Framework for Distributed Applications,” Internet-Draft published with the Internet Engineering Task Force (IETF) on June 23, 1999 (“Neuman 1999 IETF Draft” or “Neuman,” Ex. 1005).

13.   I have also reviewed the exhibits and references cited in this Declaration.

14.   I am being compensated at my normal consulting rate for my work.

15.   My compensation is not dependent on and in no way affects the substance of my statements in this Declaration.

16.   I have no financial interest in the Petitioners. I similarly have no financial interest in the ’685 patent.
II.   LEGAL PRINCIPLES

17.   I have been informed that a claim is invalid as anticipated under 35 U.S.C. § 102(a) if “the invention was known or used by others in this country, or patented or described in a printed publication in this or a foreign country, before the invention thereof by the applicant for patent.” I have also been informed that a claim is invalid as anticipated under 35 U.S.C. § 102(b) if “the invention was patented or described in a printed publication in this or a foreign country or in public use or on sale in this country, more than one year prior to the date of the application for patent in the United States.” Furthermore, I have been informed that a claim is invalid as anticipated under 35 U.S.C. § 102(e) if “the invention was described in … an application for patent, published under section 122(b), by another filed in the United States before the invention by the applicant for patent ….” It is my understanding that for a claim to be anticipated, all of the limitations must be present in a single prior art reference, either expressly or inherently.

18.   I have been informed that a claim is invalid as obvious under 35 U.S.C. § 103(a):

      “if the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which [the] subject matter pertains.”

19.   I understand that a claimed invention would have been obvious, and therefore not patentable, if the subject matter claimed would have been considered obvious to a person of ordinary skill in the art at the time that the invention was made. I understand that when there are known elements that perform in known ways and produce predictable results, the combination of those elements is likely obvious. Further, I understand that when there is a predictable variation and a person would see the benefit of making that variation, implementing that predictable variation is likely not patentable. I have also been informed that obviousness does not require absolute predictability of success, but that what does matter is whether the prior art gives direction as to what parameters are critical and which of many possible choices may be successful.
III.  PERSON OF ORDINARY SKILL IN THE ART

20.   The ’685 patent relates to the field of computer systems security. At the time the ’685 patent was filed, a person of ordinary skill in this field would have had at least a bachelor’s degree in computer science or electrical engineering and 3-5 years of professional experience in computer systems security, or a master’s or doctorate and 1-2 years of professional experience in computer systems security, or equivalent academic experience. Such a person would have been familiar with designing and implementing computer systems security, and would have been aware of design trends relating to selecting and applying security policies, and methods of authenticating and authorizing users.
IV.   OVERVIEW OF THE ’685 PATENT

A.    The Claimed Invention

21.   The ’685 patent claims to describe an improved method of authorizing a user to access a workstation or secured data. (Ex. 1001, 1:13-19; id. at 2:64-3:2.) The patent recognizes that security systems based on pre-set codes, passwords, biometric identification, and “predetermined combinations” of these measures were well known in the art. (Ex. 1001, 1:22-53; 2:48-50.) The ’685 patent also admits that organizations typically included additional security processes for remote access to their sites. (Ex. 1001, 2:54-63.) However, the patent criticizes these prior art systems as “fixed” and “predetermined.” (Ex. 1001, 2:46-63; see also 1:40-45 and 2:22-29.)

22.   The ’685 patent claims to solve these shortcomings by using a “flexible” approach to authorization that varies based on “computing conditions.” These computing conditions can include any one or more of: (1) the type of communication link being used, (2) the geographical location of the workstation, and/or (3) the time of access. In the claimed invention, a “security policy” is determined from a set of predetermined security policies based on previously stored policy data and the computing conditions. An authorization method is then determined from this security policy and the computing conditions. (Ex. 1001, claim 1; 3:19-34; 5:55-6:2.)
23.   According to the patent, this approach “provid[es] a method of user authorization that is flexible enough to work on different workstations and to accommodate user needs of different users at those different workstations” (Ex. 1001, 2:64-67), and “allows for many variations and adaptions according to external circumstances.” (Ex. 1001, 10:4-6.)

24.   Figures 3A and 3B of the ’685 patent, which are reproduced below, provide examples of the relevant components. In Figure 3A, a workstation 10 (shown in red) is connected to a security server 13 (blue) through a communication link 15. (Ex. 1001, 5:18-22.) The security server 13 stores policy data and also controls access to secured data on data server 19 (green). Workstation 10 is also connected to a user data input device 14 (orange) (e.g., a smart card reader or a biometric sampling device), and to keyboard 12 (orange). (Ex. 1001, 5:22-28.)

25.   The description of Figure 3B is similar to Figure 3A but concerns a mobile workstation 10a (red) connected to the security server 13 (blue) using an unsecured communication link 15a (e.g., using a wireless connection, telephone line connection, or other form of publicly used connection). (Ex. 1001, 5:33-41.) Mobile workstation 10a is also connected to a portable user data input device 14a (orange), and to a keyboard 12a (orange). (Ex. 1001, 5:41-45.)

’685 Patent, Fig. 3a (annotated)

’685 Patent, Fig. 3b (annotated)
26.   A user requesting access to secured data stored in data server 19 provides user information (e.g., a password or fingerprint scan) to the user input device 14 or 14a, which is then provided along with “workstation data” to the security server 13. (Ex. 1001, 5:46-54; see also 6:63-65; 7:35-46.) The ’685 patent discusses different types of “workstation data” (which are referred to in the claims as “security data relating to computing conditions”) such as “the geographical location of the workstation, the time the request for access is being performed, the type of the request, and so forth.” (Ex. 1001, 7:43-46; see also 6:3-4.)
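For illustration only, the following short sketch shows one way a person of ordinary skill might represent this “security data relating to computing conditions” as a simple structure passed from the workstation to the security server. The ’685 patent discloses no data format or code, so the field names, types, and example values below are my own hypothetical choices; the sketch merely restates the three claimed computing conditions in code form.

from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class SecurityData:
    """Hypothetical 'security data relating to computing conditions'.

    Mirrors the three computing conditions recited in the claims: type of
    communication link, geographic location, and time of access.  The field
    names are illustrative only; the '685 patent specifies no format.
    """
    link_type: str                  # e.g., "secured LAN" or "unsecured wireless"
    location: Optional[str] = None  # e.g., "corporate headquarters"
    time_of_access: Optional[datetime] = None
    # Deliberately contains no identification information for a particular
    # user, consistent with the claim language discussed below.

# Example: conditions a mobile workstation might report with an access request.
conditions = SecurityData(
    link_type="unsecured wireless",
    location="outside corporate headquarters",
    time_of_access=datetime(2004, 5, 19, 2, 30),
)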
27.   An applicable security policy is then determined from a plurality of predetermined security policies based on previously stored policy data and the workstation data relating to computing conditions. (Ex. 1001, 5:64-6:2.) A security policy is a set of rules specifying conditions for accessing a secure resource. (See Section VI.A, infra.) The security server determines the applicable security policy based on previously stored policy data and “computing conditions” such as the type of user data input device, the geographic location of the workstation, the type of communication link between the workstation and the security server, user ID, the data being accessed, the type of data being accessed, and the country. (Ex. 1001, 6:29-33; see also 7:17-30.) For example, the patent discusses a security policy where a user requesting access to information is automatically denied between the hours of midnight and 6 a.m. (Ex. 1001, 7:55-58.) The patent also describes different security policies for military personnel, including policies that vary based on location. (Ex. 1001, 9:55-62.)
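Again purely as an illustration of the kind of policy selection the patent describes only in prose, the following self-contained sketch shows one possible way “previously stored policy data” and the selection of a security policy based on computing conditions might be written. The representation, rule names, and selection logic are my own hypothetical choices and are not disclosed in the ’685 patent; only the example rules (such as denying access between midnight and 6 a.m.) track the patent’s prose examples.

from datetime import datetime

# Hypothetical "previously stored policy data": a plurality of predetermined
# security policies, each a set of rules keyed to computing conditions
# (not to the identity of any particular user).
STORED_POLICIES = {
    "after_hours_lockout": {
        "applies_when": lambda c: c["time_of_access"].hour < 6,  # midnight to 6 a.m.
        "access_allowed": False,
    },
    "high_security": {
        "applies_when": lambda c: c["link_type"] == "unsecured wireless"
                                  or c["location"] != "corporate headquarters",
        "access_allowed": True,
    },
    "normal_security": {
        "applies_when": lambda c: True,  # default policy
        "access_allowed": True,
    },
}

def determine_security_policy(conditions: dict) -> str:
    """Select a policy from the predetermined set using only the stored
    policy data and the received computing conditions."""
    for name, policy in STORED_POLICIES.items():
        if policy["applies_when"](conditions):
            return name
    return "normal_security"

# Example: a request arriving over an unsecured wireless link at 2:30 a.m.
conditions = {
    "link_type": "unsecured wireless",
    "location": "outside corporate headquarters",
    "time_of_access": datetime(2004, 5, 19, 2, 30),
}
print(determine_security_policy(conditions))  # -> "after_hours_lockout"

Nothing in this selection step depends on which user is requesting access; it turns only on the stored policy data and the received computing conditions.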
28.   The security server 13 then determines an authorization method (Ex. 1001, 5:55-58; see also 6:40-42) from the determined security policy along with the workstation data relating to computing conditions. (Ex. 1001, 5:64-6:2.) As discussed below, the ’685 patent discloses several examples of different authorization methods, including methods that use a “smart card reader” (Ex. 1001, 5:24-27), a “biometric sampling device such as a fingerprint imager, a voice recognition system, a retinal imager or the like” (id.), “password[s]” (id. at 4:63-65), and “card based user authentication” (id.; see also 6:49-65 (“Granting the user access 23 to the secured data is in accordance with the determined at least an authorization method…. In dependence upon the type of access being sought by the user, the previously stored policy data determines the type of user data that is required from the security device…. Examples of user data are biometric data and password data, but are not limited thereto.”).)

29.   For example, the ’685 patent explains that a mobile workstation 10a located in a less than secured location preferably uses a “high” security authentication process, whereas the same workstation at corporate headquarters uses a more “normal” level of security authentication. (Ex. 1001, 7:30-35.) Therefore, a general who requests access to a protected resource from an allied country might be subjected to one authorization method, whereas the same general may be subjected to another, more rigorous authorization method when requesting access from a non-allied country. (Ex. 1001, 8:26-45; 9:8-15.)
30.   After the authorization method is determined, the security server then uses the determined authorization method to authorize the user’s request to access the protected resource. This involves receiving user identification data (e.g., a password or fingerprint) (Ex. 1001, 6:63-65), and comparing the user identification data with previously stored user data (e.g., a previously stored password or fingerprint corresponding to an authorized user). (Ex. 1001, 5:57-61.) The specific type of user identification data that the security server asks for and compares will depend on the determined authorization method. (Ex. 1001, 6:40-54.) If the received user identification data matches the previously stored user data, the security server identifies the user and can authorize the user to access secured data. (Ex. 1001, 5:61-63.)1
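The following self-contained sketch, again offered only as an illustration and not as anything disclosed in the ’685 patent, shows one possible way the remaining steps described above might be implemented: determining an authorization method from the determined security policy and the computing conditions, and then registering received user identification data against stored user data in accordance with that method. All names, rules, and stored values are hypothetical.

from datetime import datetime

# Hypothetical previously stored user data for authorized users.
STORED_USER_DATA = {
    "jdoe": {"password": "s3cret", "fingerprint": "fp-template-1234"},
}

def determine_authorization_method(policy: str, conditions: dict) -> str:
    """Pick an authorization method from the determined security policy and
    the received computing conditions (never from the user's identity)."""
    if policy == "after_hours_lockout":
        return "deny"          # no access between midnight and 6 a.m.
    if policy == "high_security" or conditions["link_type"] == "unsecured wireless":
        return "fingerprint"   # more rigorous method for less secure conditions
    return "password"          # "normal" security, e.g., at headquarters

def register_user_identification(user_id: str, provided: str, method: str) -> bool:
    """Compare received user identification data against stored user data in
    accordance with the determined authorization method."""
    if method == "deny":
        return False
    stored = STORED_USER_DATA.get(user_id, {})
    return provided == stored.get(method)

# Example: the same user faces different authorization methods under
# different computing conditions.
office = {"link_type": "secured LAN", "location": "corporate headquarters",
          "time_of_access": datetime(2004, 5, 19, 10, 0)}
remote = {"link_type": "unsecured wireless", "location": "hotel",
          "time_of_access": datetime(2004, 5, 19, 10, 0)}

method_office = determine_authorization_method("normal_security", office)  # "password"
method_remote = determine_authorization_method("high_security", remote)    # "fingerprint"
print(register_user_identification("jdoe", "s3cret", method_office))            # True
print(register_user_identification("jdoe", "fp-template-1234", method_remote))  # True

Consistent with the claim language discussed above, only the final comparison step consults user identification data; the choice of security policy and of authorization method is made from the computing conditions alone.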
31.   Surprisingly, the ’685 patent explains little about the security policies that are determined, other than to explain that the choice is based on computing conditions and not on the identification information for a particular user. As such, it describes what the claimed application of policies does, but it does not tell us how the policies operate or how they are implemented. Similarly, the ’685 patent provides no examples regarding how security policies are evaluated to select an authorization method. We are only told that the result of this evaluation is based on certain inputs and not others.

1 I note that this process of receiving and comparing user identification information against stored user information is commonly known in the art as “authentication,” but it also falls within the meaning of an “authorization method” in the ’685 patent, as I discuss below. As used within the ’685 patent (see Section VI.B, infra), this process may also constitute part of an “authorization method.”

V.    SUMMARY OF THE PROSECUTION HISTORY

32.   I have been informed that the ’685 patent (Ex. 1001) issued from U.S. Patent Appl. No. 10/847,884, filed on May 19, 2004, and is a continuation-in-part of U.S. Patent Appl. No. 09/625,548 (now U.S. Patent No. 7,137,008 (the ’008 patent), Ex. 1031).2

33.   I have also been informed that the applicants initially sought claims directed to “receiving data relating to a workstation of the user” and “determining, in accordance with the received data and in accordance with stored policy data, at least one authorization method for authorizing the user.” (Ex. 1025 (Preliminary Amendment dated 8/4/2006), claim 25.) The Patent Office properly recognized that the subject matter was old, and rejected those claims (as well as other similar claims) multiple times. (Exs. 1021-1025 (representative rejections dated 5/21/2007, 2/19/2008, 12/18/2008, and 1/20/2010).)

2 The ’685 patent (Ex. 1001) claims new subject matter that was not present in its parent, the ’008 patent (Ex. 1031). For example, the ’008 patent lacks written description of determining a security policy or an authorization method based on a type of communication link between a workstation and a security server, a geographic location of the workstation, or a time of access. As a result, the ’685 patent is entitled to a priority date no earlier than May 19, 2004.
34.   Further, I understand that the applicants eventually appealed to the PTAB on Mar. 31, 2011. (Ex. 1026 (Notice of Appeal dated 3/31/2011), Ex. 1013 (Appeal Brief dated 5/20/2011).) A representative claim under appeal was claim 4:

      4. (Previously Presented) A method of authorizing a user to access a workstation using a security server, the method comprising:

      receiving security data relating to at least one of a type of communication link between the workstation and the security server, a geographic location of the workstation, or a time of access of the workstation by the user;

      determining a security policy from a plurality of predetermined security policies based on previously stored policy data and the received security data;

      determining an authorization method for authorizing the user, wherein the authorization method is determined from the determined security policy in accordance with the received security data;

      receiving user identification data; and

      registering the user identification data against stored user data in accordance with the determined authorization method, wherein different authorization methods for authorizing the user are determined upon receipt of different security data.

35.   I understand that the PTAB likewise determined that these claims recited old subject matter, and affirmed the rejection. In particular, the PTAB determined that the prior art taught “determining a security policy from a plurality of security policies based on previously stored policy data and the received security data,” as well as determining an “authorization method . . . from the determined security policy in accordance with the received security data.” (PTAB Decision at 4 (Ex. 1014).)

36.   I further understand that after the PTAB’s decision, the applicants amended the claims to require that the security policy and authorization method be determined based on a “received indication of the type of communication link between the workstation and the security server, the geographic location of the workstation, or the time of access of the workstation.” They also clarified that “the security data does not include identification information for a particular user.” (Ex. 1016 at 3, 16 (amended claim reproduced below) (emphasis added).) Thus, this meant that the choice of which security policy and authorization method to use for a specific access attempt is made independent of identification information for a particular user.
      4. (Currently Amended) A method of authorizing a user to access a workstation using a security server, the method comprising:

      receiving security data relating to computing conditions in which an authorization will be performed, wherein the security data comprises at least one indication of a type of communication link between the workstation and the security server, a geographic location of the workstation, or a time of access of the workstation by the user;

      determining a security policy from a plurality of predetermined security policies based on previously stored policy data and the received indication of the type of communication link between the workstation and the security server, the geographic location of the workstation, or the time of access of the workstation security data;

      determining an authorization method for authorizing the user, wherein the authorization method is determined from the determined security policy in accordance with the received indication of the type of communication link between the workstation and the security server, the geographic location of the workstation, or the time of access of the workstation security data;

      receiving user identification data; and

      registering the user identification data against stored user data in accordance with the determined authorization method, wherein different authorization methods for authorizing the user are determined upon receipt of different security data, and

      wherein the security data does not include identification information for a particular user.
37.   The claims were then allowed in response. (Ex. 1027 (Notice of Allowance).) Given that the claims were allowed only after this amendment, the record is clear that the added language described above was the purported distinction over the prior art cited in the rejections. As I discuss below, however, even these features, and the system as a whole, were well known in the art. Furthermore, since this added language is not described in the parent ’008 patent’s specification (Ex. 1031), the challenged claims of the ’685 patent are not entitled to the priority date of the parent ’008 patent.

38.   In sum, the ’685 patent claims as its novel concept the use of different security policies and authorization methods that vary based on computing conditions, specifically, one or more of: (1) the type of communication link being used, (2) the geographic location of the workstation, and/or (3) the time of access. (Ex. 1001, claim 1.)

39.   In the ’685 patent, the choice of which security policy and authorization method to use for a specific access attempt is made independent of identification information for a particular user. (Ex. 1001, claim 1 (“wherein different authorization methods for authorizing the user are determined upon receipt of different security data, and wherein the security data does not include identification information for a particular user.”).) Instead, as discussed above, the choice of security policy and authorization method is based on computing conditions, such as type of communication link, geographic location of the workstation, and/or the time of access.
VI.   CLAIM CONSTRUCTION

40.   I have been informed and understand that for purposes of this proceeding, the claim terms should be given their “broadest reasonable construction in light of the specification.” I have reviewed the constructions proposed by Petitioner in the Petition for Inter Partes Review. It is my opinion that Petitioner’s proposals are correct. I have used Petitioner’s proposals in the analysis that follows, and it is my opinion that the claims are invalid under each construction. In the paragraphs that follow I refer to these as the potentially “applicable” constructions. Every other term should be considered under its ordinary and customary meaning in light of the specification, as commonly understood by those of ordinary skill in the art.

A.    “security policy” (claims 1, 9, and 19)

41.   The broadest reasonable interpretation of a “security policy,” in the context of the ’685 specification and claims, is “rules specifying conditions for accessing a secure resource.”

42.   The ’685 specification describes a “security policy,” consistent with the ordinary meaning of the term, as something that “determin[es]…at least an authorization method for the user.” (Ex. 1001, 5:64-6:2; see also 7:50-54 (“In dependence upon the security policy…an authorization method…is selected.”); 6:3-7 (“[T]he authorization method is varied because a security policy…is different.”).) As discussed above, the patent provides a few examples of security policies. For example, a security policy may indicate that no access is to be provided between the hours of midnight and 6:00 a.m. (Ex. 1001, 7:55-58.) A security policy may also require the use of different user authentication devices. (Ex. 1001, 8:23-45; 9:8-15; 9:28-37.)3

3 This construction is also consistent with the well-understood meaning in the art. (See, e.g., Neuman 1999 IETF Draft (Ex. 1005), 3 (defining “SECURITY POLICY” as “the set of rules that govern access to objects.”).)

B.    “authorization method” (claims 1, 9, and 19)

43.   The broadest reasonable interpretation of an “authorization method,” in the context of the ’685 patent specification and claims, is a “method of identifying and/or authorizing a user to access a resource.”

44.   The ’685 patent specification describes an “authorization method” as a method of identifying and/or authorizing the user. (Ex. 1001, Abstract (“In the authorization method, the user is first identified with the security server and then optionally authorized thereby.”); see also 6:42-45 (“an authorization method to perform at least one of identifying and authorizing the user.”).) The ’685 patent describes various methods for identifying and authenticating users, including “a smart card reader” (id. at 5:24-27), a “biometric sampling device such as a fingerprint imager, a voice recognition system, a retinal imager or the like” (id.), “p
