`Date: May 14, 2014
`
`Trials@uspto.gov
`571-272-7822
`
`
`
`
`
`
`UNITED STATES PATENT AND TRADEMARK OFFICE
`_____________
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`____________
`
`APPLE INC.
`Petitioner
`
`v.
`
`VIRNETX INC.
`Patent Owner
`____________
`
`Case IPR2014-00237
`Patent 8,504,697 B2
`____________
`
`Before MICHAEL P. TIERNEY, KARL D. EASTHOM, and STEPHEN C. SIU,
`Administrative Patent Judges.
`
`EASTHOM, Administrative Patent Judge.
`
`
`
`
`DECISION
`Institution of Inter Partes Review
`37 C.F.R. § 42.108
`
I. BACKGROUND

A. Introduction
`Apple Inc. (“Petitioner”) filed a Petition requesting inter partes review of
`claims 1–11, 14–25, and 28–30 of U.S. Patent No. 8,504,697 B2 (“the ’697
`Patent,” Ex. 1001) pursuant to 35 U.S.C. §§ 311-319.1 Paper 1 (“Pet.”). In
`response, VirnetX, Inc. (“Patent Owner”) filed a Preliminary Response. Paper 12
`(“Prelim. Resp.”).
`We have jurisdiction under 35 U.S.C. § 314. The standard for instituting
inter partes review is set forth in 35 U.S.C. § 314(a), which follows:
`THRESHOLD -- The Director may not authorize an inter partes review
`to be instituted unless the Director determines that the information
`presented in the petition filed under section 311 and any response
`filed under section 313 shows that there is a reasonable likelihood that
`the petitioner would prevail with respect to at least 1 of the claims
`challenged in the petition.
`We determine, based on the record, that Petitioner has demonstrated, under
`35 U.S.C. § 314(a), that there is a reasonable likelihood of unpatentability with
`respect to at least one of the challenged claims.
`Petitioner relies on the following prior art:
U.S. Patent No. 6,496,867 B1 (issued Dec. 17, 2002, filed Aug. 27, 1999)
(“Beser”). Ex. 1009.
`
S. Kent and R. Atkinson, Security Architecture for the Internet Protocol,
Request for Comments: 2401 (Nov. 1998) (“RFC 2401”). Ex. 1010.
`
`1 The ’697 Patent lists continuation-in-part status back to October 29, 1999. The
`’697 Patent also lists related provisional applications, but does not claim continuity
`to those applications. Petitioner sets forth reasons why the ’697 Patent only
`supports the independent claims back to February 15, 2000. See Pet. 3–5. Based
`on Petitioner’s preliminary showing, for purposes of this decision, we assume that
`the earliest effective filing date is February 15, 2000. Patent Owner has not
`challenged this date on this record.
`
`
M. Handley et al., SIP: Session Initiation Protocol, Request for Comments:
2543 (Mar. 1999) (“RFC 2543”). Ex. 1012.
`
`H. Schulzrinne et al., RTP: A Transport Protocol for Real-Time
`Applications, Request for Comments: 1889 (Jan. 1996) (“RFC 1889”). Ex. 1013.
`
`M. Handley and V. Jacobson, SDP: Session Description Protocol, Request
`for Comments: 2327 (Apr. 1998) (“RFC 2327”). Ex. 1014.
`
`Elin Wedlund and Henning Schulzrinne, Mobility Support Using SIP,
`WoWMoM 99, 76–82 (1999) (“Mobility Support”). Ex. 1015.
`
`P. Mockapetris, Domain Names – Concepts and Facilities, Request for
`Comments: 1034 (Nov. 1987), http://www.ietf.org/rfc/rfc1034.txt (last visited on
`July 8, 2011) (“RFC 1034”). Ex. 1016.
`
`P. Mockapetris, Domain Names – Implementation and Specification,
`Request for Comments: 1035 (Nov. 1987), http://tools.ietf.org/html/rfc1035 (last
`visited on July 8, 2011) (“RFC 1035”). Ex. 1017.
`
`Petitioner contends that the challenged claims are unpatentable under
`35 U.S.C. § 102 and § 103 based on the following specific grounds. Pet. 3.
`
Reference(s)                          Basis       Claims challenged
Beser                                 § 102(e)    1–11, 14–25, and 28–30
RFC 2543                              § 102(a)    1–11, 14–25, and 28–30
Beser and RFC 2401                    § 103(a)    1–11, 14–25, and 28–30
RFC 2543, RFC 1889, and RFC 2327      § 103(a)    1–11, 14–25, and 28–30
RFC 2543 and Mobility Support         § 103(a)    8, 9, 22, and 23
`
`
B. The ’697 Patent
`To provide a secure network, the ’697 Patent system modifies conventional
`Domain Name Servers, which the ’697 Patent describes as follows:
`Conventional Domain Name Servers (DNSs) provide a look-up
`function that returns the IP [Internet Protocol] address of a
`requested computer or host. For example, when a computer
`user types in the web name “Yahoo.com,” the user’s web
`browser transmits a request to a DNS, which converts the name
`into a four-part IP address that is returned to the user’s browser
`and then used by the browser to contact the destination web
`site.
`Ex. 1001, 39:32–38.
`The ’697 Patent system establishes a secure communication link between a
`first computer and a second computer using a specialized DNS server that traps
`DNS requests. Prior to setting up the secure network or Virtual Private Network
`(“VPN”), a DNS proxy server determines, using a domain name extension, a table,
`or a rule, or by requesting further information from the user, whether the user has
`sufficient security privileges to access a desired target site. See Ex. 1001, 41:6–64.
If so, the DNS proxy requests that a gatekeeper set up a secure communication link
`between the user and target by passing a “resolved” address or “hopblocks” for the
`addresses. See Ex. 1001, 40:37–65; Fig. 27. Any of various fields can be
`“hopped,” for example, “IP source/destination addresses” or “a field in the
`header.” Ex. 1001, 41:38–39. If the user lacks sufficient security privileges, the
`system returns a “HOST UNKNOWN” error message. Ex. 1001, Fig. 27.
`In other words, to provide security, the proxy server does not send back the
`true IP address of the target computer. See Ex. 1001, 40:1–20. For example, the
proxy server may receive the client’s DNS request and forward it to a
gatekeeper, which returns a “resolved” destination address to the proxy based on a
`
`
“resolved” name; the proxy then forwards the “resolved address” back to the client “in
`a secure administrative VPN.” See Ex. 1001, 41:49–56.
Claims 2–11, 14–25, and 28–30 depend from independent claim 1 or 16,
`which follow:
1. A method of connecting a first network device and a second
network device, the method comprising:
    intercepting, from the first network device, a request to look up
an internet protocol (IP) address of the second network device based
on a domain name associated with the second network device;
    determining, in response to the request, whether the second
network device is available for a secure communications service; and
    initiating a secure communication link between the first
network device and the second network device based on a
determination that the second network device is available for the
secure communications service;
    wherein the secure communications service uses the secure
communication link to communicate at least one of video data and
audio data between the first network device and the second network
device.

16. A system for connecting a first network device and a second
network device, the system including one or more servers configured
to:
    intercept, from the first network device, a request to look up an
internet protocol (IP) address of the second network device based on a
domain name associated with the second network device;
    determine, in response to the request, whether the second
network device is available for a secure communications service; and
    initiate a secure communication link between the first network
device and the second network device based on a determination that
the second network device is available for the secure communications
service,
    wherein the secure communications service uses the secure
communication link to communicate at least one of video data and
audio data between the first network device and the second network
device.
`
`
C. Related District Court Litigation
`According to Petitioner, Patent Owner amended its complaint to assert the
`’697 Patent in VirnetX Inc. and Science Applications International Corporation v.
`Apple Inc., No. 6:12-cv-00855-LED (E.D. Tex., Aug. 27, 2013).2 See Pet. 1.
D. Claim Interpretation
`Consistent with the statute and the legislative history of the Leahy-Smith
`America Invents Act, Pub. L. No. 112-29, 125 Stat. 284, 329 (Sept. 16, 2011)
`(“AIA”), the Patent Trial and Appeal Board (“Board”) interprets claim terms by
`applying the broadest reasonable construction in the context of the Specification in
`which the claims reside. 37 C.F.R. § 42.100(b); see Office Patent Trial Practice
Guide, 77 Fed. Reg. 48,756, 48,766 (Aug. 14, 2012).
`Under the broadest reasonable interpretation standard, claim terms are given
`their ordinary and customary meaning, as would be understood by one of ordinary
`skill in the art in the context of the entire disclosure. In re Translogic Tech., Inc.,
`504 F.3d 1249, 1257 (Fed. Cir. 2007). Any special definition for a claim term
`must be set forth in the specification with reasonable clarity, deliberateness, and
`precision. In re Paulsen, 30 F.3d 1475, 1480 (Fed. Cir. 1994). Claim terms
`typically do not include limitations from embodiments described in a patent
`specification if the claim language is broader than the embodiment. See In re Van
`Geuns, 988 F.2d 1181, 1184 (Fed. Cir. 1993).
`
`
`
`
`2 According to the complaint, Science Applications International Corporation
`“maintains an equity interest and review rights related to the [’]697 patent.” See
`https://www.docketnavigator.com/document/order/472377e1-70d4-cb08-d909-
`dac6593604f4 (complaint at 4, n.6). Ex. 3001.
`
`We construe the following claim terms in light of the Specification of the
`’697 Patent:
1. “secure communication link”
`Claim 1, for example, recites initiating “a secure communication link”
`between devices. Petitioner argues that the term “secure communication link”
`should be construed to include “a communication link in which computers
`privately and directly communicate with each other on insecure paths between the
`computers where the communication is both secure and anonymous, and where the
`data transferred may or may not be encrypted.” Pet. 9–10 (citation omitted).
`Patent Owner argues that the term should be construed to mean “a direct
`communication link that provides data security through encryption.” Prelim. Resp.
`23.
`
`As described above, Petitioner argues that “secure communication link,”
`should include the features of computers “privately and directly” communicating
`with each other “on insecure paths” and that the “communication is both secure
`and anonymous.” Petitioner has not demonstrated sufficiently that the
`Specification supports the contention that a “secure communication link” must
`include each of the proposed limitations. Therefore, we are not persuaded by
`Petitioner’s arguments that a reasonable construction of the term must include
`providing “private” and “direct” communication “on insecure paths . . . where the
`communication is both secure and anonymous.”
As described above, Patent Owner argues that the broadest reasonable
construction of the term “secure communication link” must include the feature that
the link “provides data security through encryption.” We are not persuaded that
the broadest reasonable construction of the term “secure communication link”
requires the link to be limited to providing data security through encryption.
`
`
`Patent Owner reasons that this claim term may be limited to certain features
`“[b]ased on the [P]atent specification and its prosecution history.” Prelim. Resp.
`22. However, as Petitioner explains, and as explained further below, the
`Specification does not appear to require encryption as the only method of
`providing a “secure communication link.” See Pet. 7–9. Nor does Patent Owner
`specify particular citations from the Specification that would indicate a
`requirement that a “secure communication link” must require encryption.
`Patent Owner’s prosecution history arguments also do not redound to an
`implicit encryption limitation. See Prelim. Resp. 18–19 (citing Ex. 1056, 25
`(Rexam. Control No. 95/001,788)). The relied-upon citation refers to an ongoing
`reexamination of a patent from which the ’697 Patent claims continuity, U.S.
`Patent No. 7,418,504. Patent Owner fails to explain persuasively how this ongoing
proceeding limits the claim term. See Tempo Lighting, Inc. v. Tivoli, LLC, 742
`F.3d 973, 978 (Fed. Cir. 2014) (“This court also observes that the PTO is under no
`obligation to accept a claim construction proffered as a prosecution history
`disclaimer, which generally only binds the patent owner.”).3 Moreover, the
`examiner in that proceeding determined that a “secure communication link” does
`not require encryption. See Reexam. Control No. 95/001,788, Action Closing
`Prosecution 33 (Sept. 26, 2012).
`Patent Owner also argues that the ’697 Patent “stems from a continuation-in-
`part application and explains that . . . later-discussed . . . embodiments can
`
`3 If the proceeding is ongoing, the “prosecution history” is incomplete. In any
`event, “while the prosecution history can inform whether the inventor limited the
`claim scope in the course of prosecution, it often produces ambiguities created by
`ongoing negotiations between the inventor and the PTO. Therefore, the doctrine of
`prosecution disclaimer only applies to unambiguous disavowals.” Grober v. Mako
`Prods., Inc., 686 F.3d 1335, 1341 (Fed. Cir. 2012) (citing Abbott Labs. v. Sandoz,
`Inc., 566 F.3d 1282, 1289 (Fed. Cir. 2009)).
`
`incorporate the earlier-described principles of encryption” and “can be employed
`using . . . aforementioned principles.” Prelim. Resp. 19–20 (citation
`omitted). Even if the Specification states that embodiments described therein “can
`incorporate” previously described embodiments or “can be employed” using
`previously described features, Patent Owner does not demonstrate persuasively
`that the Specification discloses or implies that a “secure communication link” must
`be implemented with encryption only.
`Therefore, even if the disclosed “DNS-based VPN scheme” “can” include
`encryption, it need not. See Pet. 8–9, n. 1; Prelim. Resp. 19; Ex. 1001,
`39:28–42:16. The ’697 Patent states that “[a]ddress hopping provides security and
`privacy.” Ex. 1001, 25:54–56. It also states that “[s]ecure hosts such as site 2604
`are assumed to be equipped with a secure communication function such as an IP
`hopping function 2608.” Id. at 40:66–68.
`According to Patent Owner, “[a] link that prevents others from
`understanding the communications sent over it may still be considered ‘secure’
`even if the communicating parties do not enjoy any anonymity.” Prelim. Resp. 22.
`According to one technical dictionary, the term “secure,” in the context of
`communications, carries a broad meaning, and supports Patent Owner’s statement.
`For example, “security” is “[t]he existence and enforcement of techniques which
`restrict access to data, and the conditions under which data may be obtained.”
`MCGRAW-HILL DICTIONARY OF SCIENTIFIC AND TECHNICAL TERMS 1780 (5th ed.
`1994). Ex. 3002.
`Another technical source provides the following, similar broad meaning:
`security service (1) A service, provided by a layer of communicating
`open systems, that ensures adequate security of the systems or of data
`transfers. . . .
`(2) The capability of the system to ensure the security of system
`
`
`resources or data transfers. Access controls, authentication,
`data confidentiality, data integrity, and nonrepudiation
`are traditional data communications security services.
`IEEE 100, THE AUTHORITATIVE DICTIONARY OF IEEE STANDARDS TERMS 1016
`(7th ed. 2000), available at
`http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4116807. Ex. 3003.
`Based on the foregoing, using a plain and ordinary construction in light of
`the ’697 Patent, the broadest reasonable construction of the term “secure
`communication link” is a transmission path that restricts access to data, addresses,
`or other information on the path, generally using obfuscation methods to hide
`information on the path, including, but not limited to, one or more of
`authentication, encryption, or address hopping.
2. “secure communication service”
`Petitioner argues that the term “secure communication service” should be
`construed to include the “functional configuration of a computer that enables it to
`participate in a secure communication link with another computer.” Pet. 11.
`Patent Owner argues that the term should be construed to mean the “functional
`configuration of a network device that enables it to participate in a secure
`communication link with another network device.” Prelim. Resp. 24. Claim 1,
`which is similar to claim 16, recites determining whether a “second network device
`is available for a secure communications service,” and the “secure communications
`service uses the secure communication link to communicate at least one of video
`data and audio data between” the first and second network devices.
`Petitioner and Patent Owner appear to agree generally on the construction of
`the term “secure communication service” with the exception of whether the
`“secure communication service” should include a computer or, more broadly, a
`“network device.” In view of Patent Owner’s explanation that the Specification
`
`
`discloses different embodiments in which devices other than computers are used
`(see Prelim. Resp. 23–24), and in view of the fact that independent claims 1 and 16
`do not limit the “secure communication service” to a computer, we adopt Patent
`Owner’s construction of the term “secure communication service.”
`3. “virtual private network (VPN)”
`Dependent claims 3 and 17 recite “wherein the secure communication link is
`a virtual private network communication link.” Similar to the disagreement over
`the “secure communication service,” Petitioner and Patent Owner disagree over
`whether a VPN requires encryption. See Pet. 8–9, n. 1; Prelim. Resp. 17–23. For
`similar reasons, Petitioner’s contention that it does not is more persuasive. Patent
`Owner and Petitioner do not argue a clear distinction exists between a VPN and a
`“secure communication service.” The ’697 Patent explains a “secure
`communication link” is “a virtual private communication link over the computer
`network.” Ex. 1001, 6:63–65.
Petitioner provides evidence that in 1998, the term VPN had “myriad
definitions,” but did not require encryption. See Ex. 1073, 1; Pet. 8–9, n.1
`(citing Ex. 1073, 2). One definition follows, although other definitions, with more
stringent requirements, exist: “[a] VPN is a private network constructed within a
`public network.” Ex. 1073, 5. In some contexts, “private,” as compared to
`“public,” implies restricted access by a defined set of entities, and hence, security.
`See id. at 2. Petitioner’s declarant, Mr. Fratto, cites another source that states that
`“[a] VPN can be built using tunnels or encryption . . . or both,” and also shows that
`a VPN requires “segregation of communications to a closed community of
`interest.” Ex. 1003 (“Fratto Declaration”) ¶ 209 (quoting Ex. 1024, 16–17)
`(emphases omitted). Mr. Fratto refers to the ’697 Patent Specification as
`employing hiding or obfuscation techniques, and not necessarily encryption, to
`
`create private, secure communications. Id. ¶ 210 (discussing Ex. 1001, 1:57–58;
`2:44–54).
`On this record, a VPN is interpreted to mean a “secure communication link”
`with the additional requirement that the link includes a portion of a public network.
4. “intercepting a request”
`Petitioner argues that the term “intercepting,” as recited in independent
`claim 1, which is similar to independent claim 16, should be construed to include
`“a proxy computer or device receiving and acting on a request sent by a first
`computer that was intended for another computer.” Pet. 12. Patent Owner
`disagrees with Petitioner’s construction and argues the term should be construed as
`“receiving a request to look up an internet protocol address and, apart from
`resolving it into an address, performing an evaluation on it related to establishing a
`secure communication link.” Prelim. Resp. 24. Claim 1, for example, recites
`“intercepting . . . a request to look up an internet protocol (IP) address of the
`second network device.”
`Neither Petitioner nor Patent Owner points to an explicit definition of the
`term “intercepting” in the Specification. In the absence of a specialized definition
`provided in the Specification, we generally agree with Petitioner that one of
`ordinary skill in the art would have understood that “intercepting” a request would
`require “receiving and acting on” a request, the request being “intended for”
`receipt at a destination other than the destination at which the request is
`intercepted. On the other hand, Patent Owner’s addition of various unclaimed
`limitations and other features into the construction of the term “intercepting” does
`not redound to the broadest reasonable interpretation.
`For example, to the extent that “resolving [the request] into an address”
`would include “look[ing] up an internet protocol (IP) address of the second
`
`network device,” as recited in claim 1, Patent Owner does not explain adequately
`why incorporating an explicitly recited claim requirement into the construction of
`the term “intercepting” is warranted. That claim requirement defines the “request,”
`not the “intercepting”––i.e., “a request to look up an internet protocol (IP)
`address.”
`Patent Owner also argues that the Specification explicitly discloses an
`example in which a device (i.e., “DNS proxy 2610”) “intercepts” a request that is
`“intended for” another device (i.e., “DNS server 2609”), but that the “DNS proxy
2610” and the “DNS server 2609” “might be on the same computer 2602.” Prelim.
`Resp. 25 (citation omitted). Based on this embodiment, Patent Owner argues that
`“intercepting,” as recited in claim 1, for example, need not be accomplished by
`“another computer” (or device).
The Specification discloses that the “DNS Server 2609” and the “DNS proxy
2610” are located on the same “computer 2602,” but are disclosed as
distinguishable entities. See Prelim. Resp. 25–26 (discussing Ex. 1001, Fig. 26).
Hence, on the record at this preliminary stage of
`the proceedings, we modify Petitioner’s proposed construction of “intercepting a
`request” to accommodate the specific embodiment as described by Patent Owner,
`so that “intercepting a request” includes “receiving a request pertaining to a first
`entity at another entity.”
`Patent Owner also argues that “intercepting a request” does not require
`“illicitly” receiving a request or “perform[ance] by a specific hardware
`apparatus.” Prelim. Resp. 26, 28. Petitioner’s definition does not include these
`requirements. Based on the foregoing discussion, the term “intercepting” means
`“receiving a request pertaining to a first entity at another entity.”
`
`
`
5. “modulation”
`Claims 6 and 20 recite “wherein the telephony service uses modulation.”
`Petitioner argues that the term “modulation” should be construed to include “the
`process of encoding data for transmission over a physical or electromagnetic
`medium by varying a carrier signal.” Pet. 16. Patent Owner argues that the term
`need not be construed, or that the term should be construed to include “the process
`of encoding data for transmission over a medium by varying a carrier
`signal.” Prelim. Resp. 29.
`Petitioner and Patent Owner appear generally to agree that “modulation”
`includes “encoding data for transmission.” At this preliminary stage of the
`proceedings, we adopt the parties’ broad but reasonable construction of the term
`“modulation” to include “the process of encoding data for transmission.”
6. “determining, in response to the request, whether the second network
device is available for a secure communications service”
`Claim 1 recites this phrase. Claim 16 recites a similar phrase. The parties
`do not define the phrase. Petitioner implies that this step may include evaluating a
`packet request to initiate a secure transaction, including evaluating a unique
`identifier, determining if it has a public internet address, and designating private IP
`addresses for the requesting device, the target device, or both. See Pet. 19–21. In
`other words, finding and obtaining the private and secure IP addresses for the
`requesting and target devices, in response to a request that includes a domain
`name, ensures that the requesting device has access to the “available” target
`device––the second network device.
`Patent Owner generally argues that Petitioner fails to show how the prior art
`teaches the step and cites to the ’697 Patent to show support for it. See Prelim.
`Resp. 5–6, 14 (citing Ex. 1001, 40:37–49; 41:30–39, 47–56; Fig. 27 (process flow
`
`
`box 2704)). The citations generally discuss determining availability from the
`perspective of the requester. For example, box 2704 of Figure 27 represents
`determining if “USER [IS] AUTHORIZED TO CONNECT?” Ex. 1001. The
`citations describe how the process determines whether the requester has “sufficient
security privileges,” id. at 40:37–48, 41:30, or “permission,” id. at 41:47, to access
`the target, and if so, it allocates a secure address hopping scheme between the
`target and requester. See id. at 40:37–49; 41:30–39, 47–56; Fig. 27.
`These citations that refer to determining “security privileges” of the first
`network device do not explicitly refer to determining “availability” of the target––
`the second network device. See Prelim. Resp. 14 (citing Ex. 1001 as noted
above). Claims 1 and 16 do not require determining accessibility from the
`perspective of the requester in terms of its privilege or level of security.
`Nevertheless, such a determination of privilege, though not required, may be
`included in determining access by a requester to a specific target. In addition,
according to the citations discussed above, determining availability reasonably
means determining if a public and private address for the second network device exists.
Based on the record, “determining, in response to the request, whether the second
network device is available for a secure communications service” includes
determining one or more of: 1) whether the device is listed with a public internet
address and, if so, allocating a private address for the second network device; or
2) some indication of the relative permission level or security privileges of the
requester.
`
`
II. ANALYSIS

A. Preliminary Arguments
`Patent Owner maintains that the Petition is defective because 1) it presents
`redundant grounds, 2) the examiner, in the underlying examination of the ’697
`Patent, considered at least one of the references in an Information Disclosure
Statement, 3) the Petition “proposes unreasonable claim constructions,” 4) the
`Petition cites almost exclusively to a declaration instead of the prior art references,
`and 5) the Petition effectively circumvents the page requirements by using that
`citation style. See Prelim. Resp. 1–5.
`Notwithstanding Petitioner’s arguments, the Board has discretion not to go
`forward on redundant grounds in the interests of expediency. See 37 C.F.R. §§
`42.5(a), 42.108(a). Patent Owner does not specify why the Board may or may not
`consider prior art cited during prosecution. Cf. 35 U.S.C. § 315(d) (discretion to
`consider current proceedings before the office). The Board may not institute
`unless there are “[s]ufficient grounds.” See 37 C.F.R. § 42.108(c). Citing to
`declaration evidence, which, in turn, cites to prior art evidence, raises the risk that
`the Board will not consider the evidence. See 37 C.F.R. § 42.104(b)(5) (petition
`must include “[t]he exhibit number of the supporting evidence relied upon”).
`Notwithstanding Patent Owner’s assertions, we determine, based on the record,
`that Petitioner establishes “sufficient grounds” within the prescribed page limits.
B. Anticipation by Beser
`Petitioner asserts that Beser anticipates claims 1–11, 14–25, and 28–30
`under 35 U.S.C. § 102(e).4 Pet. 16–33. In support of this ground of
`
`
`4 Under an alternative ground, Petitioner asserts that the combination of Beser and
`RFC 2401 renders these claims obvious. See Pet. 16, 33–38.
`
`unpatentability, Petitioner relies on its declarant, Mr. Fratto. See Ex. 1003 (“Fratto
`Declaration”).
`In general, Beser describes systems that establish an IP (internet protocol)
`tunneling association between two end devices 24 and 26 on private networks,
`using first and second network devices 14 and 16, and trusted-third-party network
`device 30, over public network 12. See Ex. 1009, Abstract, Fig. 1; Pet. 16.
`Figure 1 of Beser follows:
`
`
`Figure 1 above represents Beser’s basic system, which includes the Internet
`or a campus network, as public network 12, with any number of private networks
`20 connected thereto. See Ex. 1009, Abstract, 3:60–4:18.
`
`Beser’s system “increases the security of communication on the data
`network” by providing and hiding, in packets, “private addresses” for originating
`(requesting) and terminating (receiving) devices on the network, for example
`devices 24 and 26. See Ex. 1009, Abstract. The IP packets on the public internet
`portion of the network “may require encryption or authentication.” Id. at 11:22–
`25; see also id. at 1:54–2:17 (discussing encryption as background art).
`
`To begin the process for a secure transaction, at step 102, requesting device
`24 sends to network device 16, as part of its request, an indicator that “may be a
`distinctive sequence of bits [that] indicates to the tunneling application that it
`should examine its request and not ignore the datagram.” Ex. 1009, 8:34–44,
`Figs. 1, 4. The request also includes a unique identifier, such as a domain name,
`employee number, telephone number, social security number, or other similar
`identifier, associated with terminating device 26. Ex. 1009, 10:37–11:8. At step
`104, network device 16 informs trusted-third-party network device 30 of the
`request. This informing request may include a unique set of bits to “indicate[] to
`the tunneling application that it should examine the informing message for its
`content and not ignore the datagram.” Id. at 8:66–9:1. In general, trusted-third-
`party device 30 contains a directory of users, and may be, for example, a server,
`including a domain name server (DNS), that retains a list of public IP addresses
`associated at least with second network devices 16 and terminating devices 26.
`See id. at 11:32–58.
`
`Step 106 (and parallel step 116) involves associating terminating network
`device 26, based on its unique name (e.g., domain name), with router or modem
`16. See Ex. 1009, 11:26–28, Figs. 4, 5.5 As noted, Beser’s system includes stored
public IP addresses for router or modem 16 and terminating device 26, which are
`involved in the association with the domain name. Id. at 11:48–52. As one
`example, the system “retains a list of [telephone] numbers of its subscribers,”
`which each have a public IP address linked thereto in a database. Id. at 11:47–52.
`In addition, the system may use other “data structures . . . known to those skilled in
`the art . . . for the association of the unique identifiers and IP 58 ad