`Filed on behalf of: VirnetX Inc.
`By:
`
`Joseph E. Palys
`Paul Hastings LLP
`875 15th Street NW
`Washington, DC 20005
`Telephone: (202) 551-1996
`Facsimile: (202) 551-0496
`E-mail: josephpalys@paulhastings.com
`
`
`
`Naveen Modi
`Paul Hastings LLP
`875 15th Street NW
`Washington, DC 20005
`Telephone: (202) 551-1990
`Facsimile: (202) 551-0490
`E-mail: naveenmodi@paulhastings.com
`
`
`
`
`
`
`UNITED STATES PATENT AND TRADEMARK OFFICE
`
`
`
`
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`
`
`
`
`
`APPLE INC.
`Petitioner
`v.
`VIRNETX INC.
`Patent Owner
`
`
`
`Case IPR2014-00237
`Patent 8,504,697
`
`
`
`
`
`Declaration of Fabian Monrose, Ph.D.
`
`
`
`
`
`
`
`VIRNETX EXHIBIT 2001
`Apple v. VirnetX
`Trial IPR2015-00811
`
`
`
`
`
`
`Table of Contents
`I. Introduction .......................................................................................................... 4
`II. Resources Consulted ........................................................................................... 4
`III. Background and Qualifications ......................................................................... 5
`IV. Level of Ordinary Skill ...................................................................................10
`V. Claim Terms .....................................................................................................11
`A. “Secure Communication Link” (Claims 1-3, 11-13, 16-17, and
`24-27) ..................................................................................................11
`B. “Virtual Private Network (VPN) Communication Link” (Claims 3
`and 17) .................................................................................................14
`C. “Intercept[ing] . . . a request to look up an internet protocol (IP)
`address” (Claims 1 and 16) .................................................................15
`D. “Determining, in response to the request, whether the second
`network device is available for a secure communications
`service” (Claims 1 and 16) ..................................................................18
`VI. Beser ................................................................................................................20
`A. Beser’s Disclosure ..................................................................................20
`B. Claims 1 and 16 ......................................................................................24
`1. Intercepting a Request to Look Up an IP Address of the
`Second Network Device ...........................................................24
`2. Determining, in Response to the Request, Whether the
`Second Network Device Is Available for a Secure
`Communications Service ..........................................................27
`3. Secure Communication Link ........................................................33
`C. Dependent Claims...................................................................................35
`1. Dependent Claims 2 and 24—Encryption of Video or
`Audio Data ................................................................................35
`
`2
`
`Page 2 of 55
`
`
`
`
`
`
`2. Dependent Claims 3 and 17—Virtual Private Network
`Communication Link ................................................................36
`VII. Beser in View of RFC 2401 ...........................................................................37
`VIII. Conclusion .....................................................................................................38
`
`
`
`
`
`3
`
`Page 3 of 55
`
`
`
`
`
`
I, FABIAN MONROSE, declare as follows:

I. Introduction

1. I have been retained by VirnetX Inc. (“VirnetX”) for this inter partes
`
`review proceeding. I understand that this proceeding involves U.S. Patent No.
`
`8,504,697 (“the ’697 patent”). I understand the ’697 patent is assigned to VirnetX
`
`and that it is part of a family of patents that stems from U.S. provisional
`
`application nos. 60/106,261 (“the ’261 application”), filed on October 30, 1998,
`
`and 60/137,704 (“the ’704 application”), filed on June 7, 1999. I understand that
`
`the ’697 patent has a continuation relationship through several applications to U.S.
`
`application no. 09/558,210 filed April 26, 2000 (“the ’210 application,”
`
`abandoned). And I understand the ’210 application is a continuation-in-part of
`
`U.S. application no. 09/504,783 filed February 15, 2000 (now U.S. Patent
`
`6,502,135, “the ’135 patent”), and that the ’135 patent is a continuation-in-part of
`
`U.S. application no. 09/429,643 (now U.S. Patent No. 7,010,604) filed October 29,
`
`1999, which claims priority to the ’261 and ’704 applications.
`
`II. Resources Consulted
2. I have reviewed the ’697 patent, including claims 1-30. I have also
`
`reviewed the Petition for Inter Partes Review (Paper No. 1, the “Petition”) filed
`
`with the U.S. Patent and Trademark Office (“Office”) by Apple Inc. on December
`
`6, 2013. I have also reviewed the Patent Trial and Appeal Board’s (“Board”)
`
`4
`
`Page 4 of 55
`
`
`
`
`decision to institute inter partes review (Paper No. 15, the “Decision”) of May 14,
`
`
`2014. I understand that in this proceeding the Board instituted review of the ’697
`
`patent on two grounds: (1) anticipation of claims 1-11, 14-25, and 28-30 by Beser;
`
`and (2) obviousness of claims 1-11, 14-25, and 28-30 over Beser in view of RFC
`
`2401. I have reviewed the exhibits and other documentation supporting the
`
`Petition that are relevant to the Decision and the instituted grounds.
`
`III. Background and Qualifications
3. I have a great deal of experience and familiarity with computer and
`
`network security, and have been working in this field since 1993 when I entered
`
`the Ph.D. program at New York University.
`
`4.
`
`I am currently a Professor of Computer Science at the University of
`
`North Carolina at Chapel Hill. I also hold an appointment as the Director of
`
`Computer and Information Security at the Renaissance Computing Institute
`
`(RENCI). RENCI develops and deploys advanced technologies to facilitate
`
`research discoveries and practical innovations. To that end, RENCI partners with
`
`researchers, policy makers, and technology leaders to solve the challenging
`
`problems that affect North Carolina and our nation as a whole. In my capacity as
`
Director of Computer and Information Security, I lead the design and implementation of new platforms for enabling access to, and analysis of, large and
`
`sensitive biomedical data sets while ensuring security, privacy, and compliance
`
`5
`
`Page 5 of 55
`
`
`
`
`with regulatory requirements. At RENCI, we are designing new architectures for
`
`
`securing access to data (e.g., using virtual private networks and data leakage
`
`prevention technologies) hosted among many different institutions. Additionally, I
`
`serve on RENCI’s Security, Privacy, Ethics, and Regulatory Oversight Committee
`
(SPOC), which oversees the security and regulatory compliance of technologies
`
`designed under the newly-formed Data Science Research Program and the Secure
`
`Medical Research Workspace.
`
`5.
`
`I received my B.Sc. in Computer Science from Barry University in
`
`May 1993. I received my MSc. and Ph.D. in Computer Science from the Courant
`
`Institute of Mathematical Sciences at New York University in 1996 and 1999,
`
`respectively. Upon graduating from the Ph.D. program, I joined the Systems
`
`Security Group at Bell Labs, Lucent Technologies. There, my work focused on the
`
analysis of Internet security technologies (e.g., IPsec and client-side authentication) and applying these technologies to Lucent’s portfolio of
`
`commercial products. In 2002, I joined the Johns Hopkins University as Assistant
`
`Professor in the Computer Science department. I also served as a founding
`
`member of the Johns Hopkins University Information Security Institute (JHUISI).
`
`At JHUISI, I served a key role in building a center of excellence in Cyber Security,
`
`leading efforts in research, education, and outreach.
`
`6
`
`Page 6 of 55
`
`
`
`
`
`Case No. IPR2014-00237
`
`6.
`
`In July of 2008, I joined the Computer Science department at the
`
`University of North Carolina (UNC) Chapel Hill as Associate Professor, and was
`
`promoted to Full Professor four years later. In my current position at UNC Chapel
`
`Hill, I work with a large group of students and research scientists on topics related
`
`to cyber security. My former students now work as engineers at several large
`
`companies, as researchers in labs, or as university professors themselves. Today,
`
`my research focuses on applied areas of computer and communications security,
`
`with a focus on traffic analysis of encrypted communications (e.g., Voice over IP);
`
`Domain Name System (DNS) monitoring for performance and network abuse;
`
network security architectures for traffic engineering; biometrics and
`
`client-to-client authentication techniques; computer forensics and data provenance;
`
`runtime attacks and defenses for hardening operating system security; and large-
`
`scale empirical analyses of computer security incidents. I also regularly teach
`
`courses in computer and information security.
`
`7.
`
`I have published over 75 papers in prominent computer and
`
`communications security publications. My research has received numerous
`
`awards, including the Best Student Paper Award (IEEE Symposium on Security &
`
`Privacy, July, 2013), the Outstanding Research in Privacy Enhancing Technologies
`
`Award (July, 2012), the AT&T Best Applied Security Paper Award (NYU-Poly
`
`CSAW, Nov., 2011), and the Best Paper Award (IEEE Symposium on Security &
`
`7
`
`Page 7 of 55
`
`
`
`
`Privacy, May, 2011), among others. My research has also received corporate
`
`
`sponsorship, including two Google Faculty Research Awards (2009, 2011) for my
`
`work on network security and computer forensics, as well as an award from
`
`Verisign Inc. (2012) for my work on DNS.
`
`8.
`
`I am the sole inventor or a co-inventor on five issued US patents and
`
`two pending patent applications, nearly all of which relate to network and systems
`
`security. Over the past 12 years, I have been the lead investigator or a
`
`co-investigator on grants totaling nearly nine million US dollars from the National
`
`Science Foundation (NSF), the Department of Homeland Security (DHS), the
`
`Department of Defense (DoD), and industry. In 2014, I was invited to serve on the
`
`Information Science and Technology (ISAT) study group for the Defense
`
Advanced Research Projects Agency (DARPA). During my three-year appointment, I will assist DARPA by providing continuing and independent
`
`assessment of the state of advanced information science and technology as it
`
`relates to the U.S. Department of Defense.
`
`9.
`
`I have chaired several international conferences and workshops,
`
`including for example, the USENIX Security Symposium, which is the premier
`
`systems-security conference for academics and practitioners alike. Additionally, I
`
have served as Program Co-Chair for the USENIX Workshop on Hot Topics
`
`in Security, the Program Chair for the USENIX Workshop on Large-scale Exploits
`
`8
`
`Page 8 of 55
`
`
`
`
`& Emergent Threats, the local arrangements Chair for the Financial Cryptography
`
`
`and Data Security Conference, and the General Chair of the Symposium on
`
`Research in Attacks and Defenses. As a leader in the field, I have also served on
`
`numerous technical program committees including the Research in Attacks,
`
`Intrusions, and Defenses Symposium (2012, 2013), USENIX Security Symposium
`
`(2013, 2005-2009), Financial Cryptography and Data Security (2011, 2012),
`
`Digital Forensics Research Conference (2011, 2012), ACM Conference on
`
`Computer and Communications Security (2009-2011, 2013), IEEE Symposium on
`
`Security and Privacy (2007, 2008), ISOC Network & Distributed System Security
`
`(2006—2009), International Conference on Distributed Computing Systems (2005,
`
`2009, 2010), and USENIX Workshop on Large-scale Exploits and Emergent
`
`Threats (2010-2012).
`
10. From 2006 to 2009, I served as an Associate Editor for ACM

Transactions on Information and System Security (the leading technical journal
`
`on cyber security), and currently serve on the Steering Committee for the USENIX
`
`Security Symposium.
`
`11. My curriculum vitae, which is appended, details my background and
`
`technical qualifications. Although I am being compensated at my standard rate of
`
`$450/hour for my work in this matter, the compensation in no way affects the
`
`statements in this declaration.
`
`9
`
`Page 9 of 55
`
`
`
`
`IV. Level of Ordinary Skill
12. I am familiar with the level of ordinary skill in the art with respect to
`
`the inventions of the ’697 patent as of what I understand is the patent’s early-2000
`
`priority date. Specifically, based on my review of the technology, the educational
`
`level of active workers in the field, and drawing on my own experience, I
`
believe a person of ordinary skill in the art at that time would have had a master’s
`
`degree in computer science or computer engineering, as well as two years of
`
`experience in computer networking with some accompanying exposure to network
`
`security. My view is consistent with VirnetX’s view that a person of ordinary skill
`
`in the art requires a master’s degree in computer science or computer engineering
`
`and approximately two years of experience in computer networking and computer
`
`security. I have been asked to respond to certain opinions offered by Apple’s
`
`expert, Michael Fratto, consider how one of ordinary skill would have understood
`
`certain claim terms, and consider how one of ordinary skill in the art would have
`
`understood the references mentioned above in relation to the claims of the ’697
`
`patent.1 My findings are set forth below.
`
`
`1 As noted, I was asked to opine only with respect to certain issues that are
`
`
`discussed in this declaration. By doing so, however, I do not necessarily agree
`
`with other positions taken by Apple or Mr. Fratto that I do not address here.
`
`10
`
`Page 10 of 55
`
`
`
`
`
`V. Claim Terms
A. “Secure Communication Link” (Claims 1-3, 11-13, 16-17, and 24-27)

13. I understand that the Decision preliminarily construed “secure
`
`communication link” to mean “a transmission path that restricts access to data,
`
`addresses, or other information on the path, generally using obfuscation methods to
`
`hide information on the path, including, but not limited to, one or more of
`
`authentication, encryption, or address hopping.” (Decision at 10.) I disagree with
`
`this interpretation.
`
`14. The ordinary meaning of “secure communication link” and the
`
`broadest reasonable interpretation in light of the teachings of the ’697 patent,
`
`which I understand is the standard used by the Board in this proceeding, require
`
`encryption. (See Paper No. 12 at 17-23.) The patent specification teaches that
`
`“data security is usually tackled using some form of data encryption,” and it
`
`repeatedly discusses using encryption in various embodiments. (Ex. 1001 at 1:57-
`
`58; see also id. at 10:26-27, 11:42-49, 34:38-39.)
`
`15. Additionally, the Decision’s construction is technically incorrect. Of
`
`the “obfuscation” methods in the construction—authentication, encryption, and
`
`address hopping—only encryption restricts access to “data, addresses, or other
`
`information on the path.” The other techniques alone do not provide the claimed
`
`security.
`
`11
`
`Page 11 of 55
`
`
`
`
`
`
`16.
`
`In my opinion, authentication merely ensures the recipient that a
`
`message originated from the expected sender, which is consistent with the
`
`definition of authentication in a dictionary the ’697 patent incorporates by
`
`reference. (See Ex. 2004 at 3, Glossary for the Linux FreeS/WAN Project.)
`
`Authentication does not prevent an eavesdropper from accessing data transmitted
`
`over an unsecure communication link. The specification is also consistent with my
`
`understanding, as it describes at least one scenario where an authenticated
`
`transmission occurs “in the clear”—i.e., over an unsecured communication link:
`
`SDNS [secure domain name service] 3313 can be accessed
`through secure portal 3310 “in the clear”, that is, without using
`an administrative VPN communication link. In this situation,
`secure portal 3310 preferably authenticates the query using any
`well-known technique, such as a cryptographic technique,
`before allowing the query to proceed to SDNS [3313].
`(Ex. 1001 at 52:7-12.)
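
To illustrate this point with a short, hypothetical sketch (offered for illustration only; the key, message, and packet layout are invented and are not drawn from the record), an authentication tag lets a recipient verify who sent a message, but the message itself still travels in readable form unless it is separately encrypted:

    # Hypothetical illustration; not taken from the '697 patent or the exhibits.
    import hashlib
    import hmac

    shared_key = b"hypothetical-pre-shared-key"
    payload = b"lookup request for target.example.com"

    # The sender appends a 32-byte authentication tag computed over the payload.
    tag = hmac.new(shared_key, payload, hashlib.sha256).digest()
    packet = payload + tag

    # The recipient can verify that the packet came from the key holder ...
    body, received_tag = packet[:-32], packet[-32:]
    expected = hmac.new(shared_key, body, hashlib.sha256).digest()
    assert hmac.compare_digest(received_tag, expected)

    # ... but an eavesdropper on the path can still read the payload, because
    # authentication alone does not conceal the data.
    print(body.decode())  # the request remains visible "in the clear"
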
`17. Address hopping alone also does not provide security, as there is
`
`nothing inherent in moving from address to address that precludes an eavesdropper
`
`from reading the details of a communication. This is why the ’697 patent discloses
`
`embodiments that use encryption in conjunction with address hopping to protect,
`
`for example, the next address in a routing scheme from being viewed by
`
`eavesdroppers. (See, e.g., Ex. 1001 at 3:36-50, stating in part that “[e]ach TARP
`
`packet’s true destination is concealed behind a layer of encryption generated using
`
`12
`
`Page 12 of 55
`
`
`
`
`a link key.”) The Decision states that address hopping alone is sufficient because
`
`
`the ’697 patent states that “[a]ddress hopping provides security and privacy.”
`
`(Decision at 9, citing Ex. 1001 at 25:54-56, 40:66-68.) But the address hopping
`
`embodiments in the ’697 patent also use encryption, and it is the encryption that
`
`provides security while moving from address to address. (See, e.g., id. at 3:16-
`
`4:40.)
`
`18. The Decision cites dictionary definitions, but these definitions are for
`
`peripheral terms and do not support the Decision’s definition of “secure
`
`communication link.” (Decision at 9-10; Ex. 3002, 3003.) For example, the
`
`Decision refers to an IEEE dictionary definition of “security service.” As defined,
`
`however, “security service” broadly pertains to securing “system resources” in
`
`addition to data. (Ex. 3003.) System resources can be secured through physical
`
`means that include access controls or authentication, such as where the computing
`
`system is in a locked room and the person attempting to gain access to it must
`
`present a guard with credentials to physically access the machine. Securing data
`
`that travels over a public network of switches and routers controlled by third
`
`parties, where there is no opportunity for physically guarding access, cannot rely
`
`solely on techniques like authentication.
`
19. Additionally, the Decision’s dictionary definitions support my
`
`understanding that a “secure communication link” requires encryption. For
`
`13
`
`Page 13 of 55
`
`
`
`
`example, a few lines above the general definition of “security” that the Decision
`
`
`relies on, the dictionary states that “secure visual communications” is defined as
`
`“[t]he transmission of an encrypted digital signal consisting of animated visual and
`
`audio information; the distance may vary from a few hundred feet to thousands of
`
`miles.” (Ex. 3002 at 3, emphasis added.) For signals traveling the distances
`
`contemplated by the ’697 patent, and particularly where those distances include
`
`third-party controlled network hardware, encryption is the only viable mechanism
`
`for security over those physically unsecured portions of the networks.
`
`20. Thus, while authentication and address hopping may be used in
`
`conjunction with encryption to achieve data security, neither is sufficient by itself
`
`to make a link a secure communication link.
`
B. “Virtual Private Network (VPN) Communication Link” (Claims 3 and 17)
`21. Neither party nor the Decision construed “virtual private network
`
`communication link” (“VPN communication link”), appearing in dependent claims
`
`3 and 17. To the extent the Board finds it helpful, in my opinion, a VPN
`
`communication link refers to a communication path between computers in a VPN.
`
`The ’697 patent supports my view. For example, the specification describes the
`
`ultimate VPN communication link as a path between the claimed first network
`
`device and the second network device in a VPN. (See, e.g., Ex. 1001 at 51:60-66.)
`
`The specification also describes an optional “administrative” VPN communication
`
`14
`
`Page 14 of 55
`
`
`
`
`link between the first network device and a secure name service (and/or between
`
`
`the secure name service and a gatekeeper computer that provisions the ultimate
`
VPN) that facilitates establishing the ultimate VPN. (See, e.g., id. at 41:14-20,
`
`51:17.) The administrative VPN communication link is similarly described as a
`
path between these devices in an administrative VPN. (See, e.g., id. at 40:44-48,
`
`41:14-20, 43-56, 47:11-14.)
`
C. “Intercept[ing] . . . a request to look up an internet protocol (IP) address” (Claims 1 and 16)

22. I understand that the Decision preliminarily construed the
`
`“intercepting” phrase in independent claims 1 and 16 to mean “receiving a request
`
`pertaining to a first entity at another entity.” (Decision at 12-13.) In my opinion,
`
`the meaning of this phrase would be apparent to one of ordinary skill in the art
`
`without further interpretation. The Decision’s construction is inconsistent with that
`
`understanding, in part, because it rewrites the language to focus on an aspect that
`
`the claimed embodiments may have in common with conventional systems
`
`identified in the ’697 patent, rather than on aspects that make the claimed
`
`embodiments different from those conventional systems.
`
`23. As explained in the ’697 patent, in conventional DNS, a DNS request
`
`requesting an address of a target web site 2503 (a first entity) is received at a DNS
`
`server 2502 (a second entity), which returns the address. (Ex. 1001, FIG. 25,
`
`reproduced below; id. at 9:38-39, 39:39-48.)
`
`15
`
`Page 15 of 55
`
`
`
`
`
[FIG. 25 of the ’697 patent is reproduced here.]
`
`The same is true of the ’697 patent’s embodiment in which a DNS request to look
`
`up a network address of a secure target site 2604 or unsecure target site 2611 (a
`
`first entity) is received at a DNS server 2602 (a second entity). (Id., FIG. 26,
`
`reproduced below; see also id. at 40:31-40.)
`
`
`
`16
`
`Page 16 of 55
`
`
`
`
`
`
`24. However, the ’697 patent goes on to explain that the claimed
`
`embodiments differ from conventional DNS, in part, because they apply an
`
`additional layer of functionality to a request to look up a network address beyond
`
`merely resolving it and returning the network address. For example, the DNS
`
`proxy 2610 may intercept the request and “determine[ ] whether access to a secure
`
`site has been requested,” “determine[ ] whether the user has sufficient security
`
`privileges to access the site,” and/or “transmit[ ] a message to gatekeeper 2603
`
requesting that a virtual private network be created between user computer 2601
`
`and secure target site 2604.” (Id. at 40:31-40.) Additionally, the DNS resolves an
`
address and returns it to the first network device. (Id. at 40:44-48.)
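
For illustration only (a hypothetical sketch; the names, tables, and helper functions are invented and are not Apple’s, VirnetX’s, or the ’697 patent’s implementation), the additional layer of functionality described above can be pictured as a look-up handler that does more than resolve and return an address:

    # Hypothetical illustration of evaluation beyond conventional resolution.
    SECURE_SITES = {"secure-target.example": "10.1.2.3"}    # invented registry
    PUBLIC_SITES = {"news.example": "93.184.216.34"}        # invented registry

    def request_secure_link(name):
        print(f"requesting that a secure link be provisioned to {name}")

    def handle_lookup(requested_name, requester_privileges):
        """Receive a look-up request and evaluate it, rather than only resolving it."""
        if requested_name in SECURE_SITES:
            # Additional evaluation: has access to a secure site been requested,
            # and does the requester have sufficient privileges?
            if "secure" not in requester_privileges:
                return {"error": "host unknown"}             # deny the request
            request_secure_link(requested_name)              # e.g., signal a gatekeeper
            return {"address": SECURE_SITES[requested_name], "secure_link": True}
        # Conventional behavior: simply resolve the name and return the address.
        return {"address": PUBLIC_SITES.get(requested_name), "secure_link": False}

    print(handle_lookup("secure-target.example", {"secure"}))
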
`
`25. The Decision’s construction addresses a common aspect of a
`
`conventional DNS and the disclosed embodiments, namely that a request to look
`
`up an address of one entity may be received at another entity. However, the
`
`construction overlooks the aspects distinguishing the “intercepting” phrase from
`
`conventional DNS.
`
`26. Thus, in my opinion, the “intercepting” phrase refers to the notion of
`
`performing an additional evaluation on a request to look up an IP address related to
`
`establishing a secure communication link, beyond conventionally resolving it and
`
`returning the address. The independent claims also support this view, for example,
`
`by reciting that a determination is made whether the second network device is
`
`17
`
`Page 17 of 55
`
`
`
`
`available for a secure communications service “in response to the request [to look
`
`
`up an IP address].” (Ex. 1001, claims 1 and 16.) Additionally, dependent claims
`
`10 and 29 expressly specify the evaluation, reciting that “intercepting” involves
`
`“receiving the request to determine whether the second network device is available
`
`for the secure communications service.”
`
D. “Determining, in response to the request, whether the second network device is available for a secure communications service” (Claims 1 and 16)

27. Independent claims 1 and 16 recite “determining, in response to the request, whether the second network device is available for a secure
`
`communications service.” I understand that the Decision preliminarily interprets
`
`this phrase to mean “determining, one or more of 1) whether the device is listed
`
`with a public internet address, and if so, allocating a private address for the second
`
`network device, or 2) some indication of the relative permission level or security
`
`privileges of the requester.” (Decision at 14-15.) I disagree with this
`
`interpretation.
`
`28. Determining a public internet address or allocating a private address
`
`has no relationship to whether a device is available for a secure communications
`
`service. For example, a device listed with a public internet address may or may not
`
`be available for a secure communications service. Similarly, a private address may
`
`be allocated for a device even if no determination is made that the device is
`
`18
`
`Page 18 of 55
`
`
`
`
`available for a secure communications service. These features have nothing to do
`
`
`with the claimed determination.
`
`29. The ’697 patent does not limit the determination of whether a network
`
`device is available for a secure communications service to require a determination
`
`of a public address and an allocation of a private address. Passages of the ’697
`
`patent cited in the Decision disclose an example in which a determination is made
`
`that a target computer is available for a secure communications service without
`
`“determining . . . whether the device is listed with a public internet address, and if
`
`so, allocating a private address for the second network device.” (See Ex. 1001 at
`
`41:6-64; see also Decision at 15.)
`
`30. The Decision also states that the “determining” phrase can mean
`
`“determining . . . some indication of the relative permission level or security
`
`privileges of the requester.” I disagree that the claimed “determination” requires
`
`“some indication of the relative permission level or security privileges of the
`
`requester.” The specification provides examples of determinations focusing on the
`
`second network device to which access is requested, as well as examples of
`
`separate determinations focusing on the first network device desiring the access.
`
`(Compare, for example, id. at 40:32-33, “determin[ing] whether access to a secure
`
`site has been requested” with id. at 40:36-37, “determines whether the user has
`
`sufficient security privileges to access the site.”) The claimed determination,
`
`19
`
`Page 19 of 55
`
`
`
`
`however, expressly focuses on the second network device (Ex. 1001, claims 1 and
`
`
`16, “whether the second network device is available for a secure communications
`
`service,” emphasis added), so I disagree that the “determining” phrase must be
`
`limited to the Decision’s determining “permission level or security privileges of
`
`the requester.”
`
`VI. Beser
`A. Beser’s Disclosure
`31. Beser discloses a method “for initiating a tunnelling association
`
`between an originating end and a terminating end of the tunnelling association.”
`
`(Ex. 1009 at 7:62-64.) A first network device may receive a “request to initiate the
`
`tunnelling connection” from an originating end. (Id. at 7:65-67.) “The request
`
`includes a unique identifier for the terminating end of the tunnelling association.”
`
`(Id. at 8:1-3.) The request may also include a “distinctive sequence of bits [that]
`
`indicates to [a] tunnelling application that it should examine the request message
`
`for its content and not ignore the datagram.” (Id. at 8:35-43.)
`
`32. A “trusted-third-party network device is informed of the request.” (Id.
`
`at 8:48-49.) The “informing message includes an indicator that . . . may be a
`
`distinctive sequence of bits” that “indicates to the tunnelling application that it
`
`should examine the informing message for its content and not ignore the
`
`datagram.” (Id. at 8:60 – 9:1.) A “public network address for a second network
`
`20
`
`Page 20 of 55
`
`
`
`
`device is associated with the unique identifier.” (Id. at 9:6-8.) “This association of
`
`
`the public IP 58 address for the second network device [ ] with the unique
`
`identifier is made on the trusted-third-party network device.” (Id. at 11:30-32.)
`
`“The second network device is accessible on the public network 12 by transmitting
`
`an IP 58 packet with the public IP 58 address for the second network device in the
`
`destination address field 90.” (Id. at 9:16-19.)
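
As a purely illustrative sketch of the association just described (the directory contents and function are invented and are not drawn from Beser), the trusted-third-party device can be pictured as mapping a unique identifier to the public network address at which the second network device is reachable:

    # Hypothetical illustration only.
    DIRECTORY = {
        "555-0100": "203.0.113.7",                 # dial-up number
        "callee@example.com": "203.0.113.8",       # e-mail address
        "terminating.example.org": "203.0.113.9",  # domain name
    }

    def associate_public_address(unique_identifier):
        """On the trusted-third-party device, associate the identifier received in
        the informing message with the second network device's public address."""
        return DIRECTORY.get(unique_identifier)

    print(associate_public_address("callee@example.com"))  # -> 203.0.113.8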
`
`33. “[A] first private network address and a second private network
`
`address are negotiated on the first network device and the second network device.”
`
`(Id. at 9:26-28.) “In one exemplary preferred embodiment, the negotiation is
`
`carried out through the trusted-third party network device.” (Id. at 9:29-34.)
`
`However, as depicted, for example, in Figure 9 of Beser, it is the first network
`
`device and second network device that select the private network addresses:
`
`21
`
`Page 21 of 55
`
`
`
`
`
[FIG. 9 of Beser is reproduced here.]
`
`
`
`34. Beser discloses that “the first private network address [is selected]
`
`from a first pool of private addresses on the first network device.” (Id. at 12:40-
`
`45.) “[T]he first private network address is communicated from the first network
`
`device to the second network device through the public network.” (Id. at 12:45-
`
`48.) “The second private network address is selected from a second pool of private
`
`addresses on the second network device” and “communicated from the second
`
`network device to the first network device through the public network.” (Id. at
`
`22
`
`Page 22 of 55
`
`
`
`
`12:48-54.) “A tunnelling application in the application layer recognizes the private
`
`
`network addresses as being associated with the ends of the tunnelling association.
`
`The private network addresses may be included as the payload in data packets and
`
`are passed up to the application layer from the transport layer.” (Id. at 9:46-51.)
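
The negotiation described above can likewise be pictured with a short, hypothetical sketch (the pools, message format, and stand-in "send" callable are invented and are not Beser's implementation): each end selects a private address from its own pool and communicates it to the other end over the public network.

    # Hypothetical illustration only.
    import random

    FIRST_DEVICE_POOL = ["10.0.1.%d" % i for i in range(2, 254)]
    SECOND_DEVICE_POOL = ["10.0.2.%d" % i for i in range(2, 254)]

    def negotiate_private_addresses(send):
        """Each end picks a private address from its own pool and sends it to the
        other end; the tunnelling application then treats the pair as the ends of
        the tunnelling association."""
        first_private = random.choice(FIRST_DEVICE_POOL)     # chosen on the first device
        second_private = random.choice(SECOND_DEVICE_POOL)   # chosen on the second device
        send("first->second", {"private_address": first_private})
        send("second->first", {"private_address": second_private})
        return first_private, second_private

    ends = negotiate_private_addresses(lambda direction, msg: print(direction, msg))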
`
`35.
`
`In one embodiment, Beser discloses a method “for initiating a VoIP
`
`association between an originating telephony device 24 and a terminating
`
`telephony device 26.” (Id. at 9:64-66.) “First network device 14” may receive “a
`
`request to initiate the VoIP association from” the originating telephony device 24.
`
`(Id. at 10:2-7.) “[T]he request includes a unique identifier for the terminating
`
`telephony device 26,” which may be “any of a dial-up number, an electronic mail
`
`address, or a domain name.” (Id. at 10:5-6, 10:39-41.) “[T]he unique identifier
`
`may be included in the payload of an IP 58 or MAC 54 packet.” (Id. at 11:3-5.)
`
`36.
`
`In my opinion, portions of the Decision’s preliminary analysis are
`
`based on an inaccurate description of Beser’s teachings. For example, the
`
`Decision states that “[t]o begin the process for a secure transaction, at step 102,
`
`requesting device 24 sends to network device 16, as part of its request, an
`
`indicator.” (Decision at 17-18.) I disagree. Beser explains that “network device
`
`14,” not “network device 16,” receives a request from “originating telephony
`
`device 24.” (Ex. 1009 at 10:2-7.) This can also be seen in Beser’s Figure 6, which
`
`23
`
`Page 23 of 55
`
`
`
`
`depicts a request 112 being sent from originating telephony device 24 to network
`
`
`device 14 (not to network device 16).
`
`37. The Decision also states that Beser supports Apple’s position that
`
`“[t]he first network device [14] receives the request, evaluates it, and then sends it
`
`to [ ] trusted-third-party network device [30], if appropriate.” (Decision at 19-20.)
`
`This suggests that Beser discloses what to do if it is not “appropriate” to send a
`
`request to the trusted-third-party network device, which Beser does not do.
`
`B. Claims 1 and 16
`1. Intercepting a Request to Look Up an IP Address of the
`Second Network Device
`In my opinion, Beser does no