571-272-7822
Paper 11
Date: February 28, 2022

UNITED STATES PATENT AND TRADEMARK OFFICE

BEFORE THE PATENT TRIAL AND APPEAL BOARD

SAMSUNG ELECTRONICS AMERICA, INC.,
Petitioner,
v.
PROXENSE, LLC,
Patent Owner.

IPR2021-01444
Patent 8,352,730 B2

Before KEVIN F. TURNER, JUSTIN T. ARBES, and DAVID C. McKONE, Administrative Patent Judges.
ARBES, Administrative Patent Judge.

DECISION
Denying Institution of Inter Partes Review
35 U.S.C. § 314

I. INTRODUCTION

A. Background and Summary
Petitioner Samsung Electronics America, Inc. filed a Petition (Paper 2, “Pet.”) requesting inter partes review of claims 1–11 of U.S. Patent No. 8,352,730 B2 (Ex. 1001, “the ’730 patent”) pursuant to 35 U.S.C. § 311(a). Patent Owner Proxense, LLC filed a Preliminary Response (Paper 9, “Prelim. Resp.”) pursuant to 35 U.S.C. § 313.
MICROSOFT 1007
Pursuant to 35 U.S.C. § 314(a), the Director may not authorize an inter partes review unless the information in the petition and preliminary response “shows that there is a reasonable likelihood that the petitioner would prevail with respect to at least 1 of the claims challenged in the petition.” See 37 C.F.R. § 42.4(a) (“The Board institutes the trial on behalf of the Director.”). For the reasons that follow, we do not institute an inter partes review.

B. Related Matters
The parties indicate that the ’730 patent is the subject of Proxense, LLC v. Samsung Electronics Co., Ltd., Case No. 6:21-cv-00210 (W.D. Tex.) (“the district court case”). See Pet. 3; Paper 5, 1. Petitioner also filed petitions challenging claims of other patents asserted in the district court case in Cases IPR2021-01438, IPR2021-01439, IPR2021-01447, and IPR2021-01448.

C. The ’730 Patent
The ’730 patent discloses systems for “authentication responsive to biometric verification of a user being authenticated,” using “a biometric key [that] persistently (or permanently) stores a code such as a device identifier (ID) and biometric data for a user in a tamper-resistant format.” Ex. 1001, col. 1, ll. 57–62. The ’730 patent states that “[c]onventional user authentication techniques,” such as requiring input of a password, were deficient because they “require[d] the user to memorize or otherwise keep track of the credentials” and “it can be quite difficult to keep track of them all.” Id. at col. 1, ll. 23–32. Other techniques, such as “provid[ing] the user with an access object . . . that the user can present to obtain access,” were inadequate because “authentication merely proves that the access object itself is valid; it does not verify that the legitimate user is using the access object.” Id. at col. 1, ll. 33–43. According to the ’730 patent, there was a need in the art for a system for “verifying a user that is being authenticated that does not suffer from [such] limitations” and “ease[s] authentications by wirelessly providing an identification of the user.” Id. at col. 1, ll. 49–53.

Figure 2 of the ’730 patent is reproduced below.

Figure 2 is a block diagram of the functional modules of a biometric key. Id. at col. 2, ll. 41–43. Enrollment module 222 registers a user with biometric key 100 by persistently storing biometric data associated with the user (e.g., a digital image of the retina, fingerprint, or voice sample) in persistent storage 226. Id. at col. 4, ll. 4–28. Enrollment module 222 registers biometric key 100 with a trusted authority by providing a code, such as a device ID, to the trusted authority or, alternatively, the trusted authority can provide a code to biometric key 100. Id. at col. 4, ll. 8–12. The code is stored in persistent storage 226. Id. at col. 4, ll. 43–45. “Persistent storage 226 is itself, and stores data in, a tamper-proof format to prevent any changes to the stored data.” Id. at col. 4, ll. 36–38. “Tamper-proofing increases reliability of authentication because it does not allow any changes to biometric data (i.e., allows reads of stored data, but not writes to store new data or modify existing data).” Id. at col. 4, ll. 38–41. In a fingerprint embodiment, validation module 224 uses scan pad 120 (shown in Figure 1) to capture scan data from the user’s fingerprint and compares the scanned data to the stored fingerprint to determine whether the scanned data matches the stored data. Id. at col. 4, ll. 14–23.
The interaction of biometric key 100 with other system components is illustrated in Figure 3, reproduced below.

Figure 3 is “a block diagram illustrating a system for providing authentication information for a biometrically verified user.” Id. at col. 2, ll. 44–46. Authentication module 310 is coupled to biometric key 100 via line 311 (a wireless medium) and to trusted key authority 320 via line 312 (a secure data network such as the Internet). Id. at col. 5, ll. 8–12. Authentication module 310 requires the device ID code (indicating successful biometric verification) from biometric key 100 before allowing the user to access application 330. Id. at col. 5, ll. 12–19. Authentication module 310 provides the device ID code from biometric key 100 to trusted key authority 320 to verify that it belongs to a legitimate key. Id. at col. 5, ll. 19–23; see also id. at col. 5, ll. 42–48 (“In one embodiment, trusted key authority 320 verifies that a code from a biometric key is legitimate. To do so, the trusted key authority 320 stores a list of codes for legitimate biometric keys. . . . In one embodiment, trusted key authority 320 can also store a profile associated with a biometric key.”). Authentication module 310 then sends a message to application 330 to allow the user access to the application responsive to a successful authentication by trusted key authority 320. Id. at col. 5, ll. 23–26.

“Application 330 can be, for example, a casino machine, a keyless lock, a garage door opener, an ATM machine, a hard drive, computer software, a web site, a file, and the like.” Id. at col. 5, ll. 28–31. Trusted key authority 320 can be operated by an agent, such as “a government official, a notary, and/or an employee of a third party which operates the trusted key authority, or another form of witness.” Id. at col. 6, ll. 35–38. “The agent can follow standardized procedures such as requiring identification based on a state issued driver license, or a federally issued passport in order to establish a true identity of the user.” Id. at col. 6, ll. 38–41.
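For readers tracing the description of Figure 3, the flow can be sketched as follows. This is an illustrative Python sketch only; every class and function name is hypothetical and not drawn from the ’730 patent, and the string comparison stands in for whatever biometric matching the patent contemplates.

```python
# Illustrative sketch of the authentication flow described for Figure 3
# of the '730 patent (Ex. 1001, col. 5). All names are hypothetical.

class TrustedKeyAuthority:
    """Third-party authority storing a list of codes for legitimate keys."""
    def __init__(self, legitimate_codes):
        self.legitimate_codes = set(legitimate_codes)

    def verify(self, device_id):
        # Verifies that a code received from a biometric key is legitimate.
        return device_id in self.legitimate_codes

class BiometricKey:
    """Persistently stores a device ID code and the user's biometric data."""
    def __init__(self, device_id, enrolled_biometric):
        self.device_id = device_id
        self.enrolled_biometric = enrolled_biometric  # read-only per the patent

    def verify_and_release_code(self, scan_data):
        # Release the device ID code only on a successful biometric match.
        if scan_data == self.enrolled_biometric:
            return self.device_id
        return None

def authentication_module(key, scan_data, authority):
    """Requires the device ID code (indicating successful biometric
    verification), has the trusted authority verify it, and then messages
    the application to allow or deny access."""
    code = key.verify_and_release_code(scan_data)
    if code is not None and authority.verify(code):
        return "access granted"  # message sent to the application
    return "access denied"
```

Note that both checks must succeed: a valid key with the wrong biometric, or a matching biometric on a key the authority does not recognize, both end in denial, which mirrors the two-stage verification the Specification describes.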

D. Illustrative Claim
Challenged claims 1 and 8 of the ’730 patent are independent. Claims 2–7 depend directly from claim 1 and claims 9–11 depend directly or indirectly from claim 8.

Claim 1 recites (with letter designations used in the Petition to refer to the various limitations):

    1. A method for verifying a user during authentication of an integrated device, comprising the steps of:
    [A] persistently storing biometric data of the user and a plurality of codes and other data values comprising a device ID code uniquely identifying the integrated device and a secret decryption value in a tamper proof format written to a storage element on the integrated device that is unable to be subsequently altered; [B] wherein the biometric data is selected from a group consisting of a palm print, a retinal scan, an iris scan, a hand geometry, a facial recognition, a signature recognition and a voice recognition;
    [C] responsive to receiving a request for a biometric verification of the user, receiving scan data from a biometric scan;
    [D] comparing the scan data to the biometric data to determine whether the scan data matches the biometric data;
    [E] responsive to a determination that the scan data matches the biometric data, wirelessly sending one or more codes from the plurality of codes and the other data values for authentication by an agent that is a third-party trusted authority possessing a list of device ID codes uniquely identifying legitimate integrated devices, [F] wherein the one or more codes and other data values includes the device ID code; and
    [G] responsive to authentication of the one or more codes and the other data values by the agent, receiving an access message from the agent allowing the user access to an application, [H] wherein the application is selected from a group consisting of a casino machine, a keyless lock, a garage door opener, an ATM machine, a hard drive, computer software, a web site and a file.
E. Evidence
Petitioner relies on the following prior art:

U.S. Patent No. 7,239,226 B2, filed July 9, 2002, issued July 3, 2007 (Ex. 1010, “Berardi”);
U.S. Patent No. 6,175,921 B1, issued Jan. 16, 2001 (Ex. 1011, “Rosen”);
U.S. Patent Application Publication No. 2004/0153649 A1, published Aug. 5, 2004 (Ex. 1009, “Rhoads”);
U.S. Patent Application Publication No. 2004/0044627 A1, published Mar. 4, 2004 (Ex. 1006, “Russell”);
U.S. Patent Application Publication No. 2003/0177102 A1, published Sept. 18, 2003 (Ex. 1008, “Robinson”);
U.S. Patent Application Publication No. 2003/0055792 A1, published Mar. 20, 2003 (Ex. 1013, “Kinoshita”);
U.S. Patent Application Publication No. 2002/0109580 A1, published Aug. 15, 2002 (Ex. 1012, “Shreve”);
U.S. Patent Application Publication No. 2001/0000535 A1, published Apr. 26, 2001 (Ex. 1007, “Lapsley”); and
International Patent Application Publication No. WO 99/56429, published Nov. 4, 1999 (Ex. 1005, “Scott”).1

1 When referencing portions of Scott, we refer to the original publication’s page numbers in the top-center of each page (not the page numbers added by Petitioner in the lower-right corner of each page), consistent with the parties’ usage in their papers.

F. Prior Art and Asserted Grounds
Petitioner asserts that claims 1–11 of the ’730 patent are unpatentable on the following grounds:

Claim(s) Challenged  | 35 U.S.C. § | References/Basis
1, 2, 4–9, 11        | 103(a)2     | Scott, Russell, Lapsley
3, 10                | 103(a)      | Scott, Russell, Lapsley, Robinson
6                    | 103(a)      | Scott, Russell, Lapsley, Rhoads
1, 2, 4–6, 8, 9, 11  | 103(a)      | Berardi, Rosen, Shreve, Kinoshita

II. ANALYSIS

A. Discretionary Denial Under 35 U.S.C. § 314(a)
Institution of inter partes review is discretionary. See Harmonic Inc. v. Avid Tech., Inc., 815 F.3d 1356, 1367 (Fed. Cir. 2016) (“[T]he PTO is permitted, but never compelled, to institute an [inter partes review (IPR)] proceeding.”); 35 U.S.C. § 314(a) (“The Director may not authorize an inter partes review to be instituted unless the Director determines that the information presented in the petition filed under section 311 and any response filed under section 313 shows that there is a reasonable likelihood that the petitioner would prevail with respect to at least 1 of the claims challenged in the petition.” (emphasis added)). In the Preliminary Response, Patent Owner argues that we should exercise our discretion to deny the Petition in view of the district court case. Prelim. Resp. 4–6 (citing Apple Inc. v. Fintiv, Inc., IPR2020-00019, Paper 11 (PTAB Mar. 20, 2020) (precedential)). We need not decide this issue, however, as we determine that Petitioner has not shown a reasonable likelihood that it would prevail with respect to at least one of the challenged claims.

2 The Leahy-Smith America Invents Act, Pub. L. No. 112-29, 125 Stat. 284 (2011) (“AIA”), amended 35 U.S.C. § 103. Because the challenged claims of the ’730 patent have an effective filing date before the effective date of the applicable AIA amendment, we refer to the pre-AIA version of 35 U.S.C. § 103. See Ex. 1001, codes (22), (60).

B. Legal Standards
A claim is unpatentable for obviousness if, to one of ordinary skill in the pertinent art, “the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made.” KSR Int’l Co. v. Teleflex Inc., 550 U.S. 398, 406 (2007) (quoting 35 U.S.C. § 103(a) (2006)). The question of obviousness is resolved on the basis of underlying factual determinations, including “the scope and content of the prior art”; “differences between the prior art and the claims at issue”; and “the level of ordinary skill in the pertinent art.” Graham v. John Deere Co., 383 U.S. 1, 17–18 (1966). Additionally, secondary considerations, such as “commercial success, long felt but unsolved needs, failure of others, etc., might be utilized to give light to the circumstances surrounding the origin of the subject matter sought to be patented. As indicia of obviousness or nonobviousness, these inquiries may have relevancy.” Id. When conducting an obviousness analysis, we consider a prior art reference “not only for what it expressly teaches, but also for what it fairly suggests.” Bradium Techs. LLC v. Iancu, 923 F.3d 1032, 1049 (Fed. Cir. 2019) (citation omitted).

A patent claim “is not proved obvious merely by demonstrating that each of its elements was, independently, known in the prior art.” KSR, 550 U.S. at 418. An obviousness determination requires finding “both ‘that a skilled artisan would have been motivated to combine the teachings of the prior art references to achieve the claimed invention, and that the skilled artisan would have had a reasonable expectation of success in doing so.’” Intelligent Bio-Sys., Inc. v. Illumina Cambridge Ltd., 821 F.3d 1359, 1367–68 (Fed. Cir. 2016) (citation omitted); see KSR, 550 U.S. at 418 (for an obviousness analysis, “it can be important to identify a reason that would have prompted a person of ordinary skill in the relevant field to combine the elements in the way the claimed new invention does”).

“Although the KSR test is flexible, the Board ‘must still be careful not to allow hindsight reconstruction of references . . . without any explanation as to how or why the references would be combined to produce the claimed invention.’” TriVascular, Inc. v. Samuels, 812 F.3d 1056, 1066 (Fed. Cir. 2016) (citation omitted). Further, an assertion of obviousness “cannot be sustained by mere conclusory statements; instead, there must be some articulated reasoning with some rational underpinning to support the legal conclusion of obviousness.” KSR, 550 U.S. at 418 (quoting In re Kahn, 441 F.3d 977, 988 (Fed. Cir. 2006)); accord In re NuVasive, Inc., 842 F.3d 1376, 1383 (Fed. Cir. 2016) (stating that “conclusory statements” amount to an “insufficient articulation[] of motivation to combine”; “instead, the finding must be supported by a ‘reasoned explanation’” (citation omitted)); In re Magnum Oil Tools Int’l, Ltd., 829 F.3d 1364, 1380 (Fed. Cir. 2016) (“To satisfy its burden of proving obviousness, a petitioner cannot employ mere conclusory statements. The petitioner must instead articulate specific reasoning, based on evidence of record, to support the legal conclusion of obviousness.”).

C. Level of Ordinary Skill in the Art
Petitioner argues that at the time of the ’730 patent, a person of ordinary skill in the art would have had “a bachelor’s degree in computer or electrical engineering (or an equivalent degree) with at least three years of experience in the field of encryption and security (or an equivalent).” Pet. 4. Patent Owner does not address the level of ordinary skill in the art in its Preliminary Response. Based on the record presented, including our review of the ’730 patent and the types of problems and solutions described in the ’730 patent and cited prior art, we adopt Petitioner’s definition of the level of ordinary skill in the art and apply it for purposes of this Decision.

D. Claim Interpretation
We interpret the challenged claims

    using the same claim construction standard that would be used to construe the claim in a civil action under 35 U.S.C. 282(b), including construing the claim in accordance with the ordinary and customary meaning of such claim as understood by one of ordinary skill in the art and the prosecution history pertaining to the patent.

37 C.F.R. § 42.100(b) (2021). “In determining the meaning of [a] disputed claim limitation, we look principally to the intrinsic evidence of record, examining the claim language itself, the written description, and the prosecution history, if in evidence.” DePuy Spine, Inc. v. Medtronic Sofamor Danek, Inc., 469 F.3d 1005, 1014 (Fed. Cir. 2006). Claim terms are generally given their ordinary and customary meaning as would be understood by a person of ordinary skill in the art at the time of the invention and in the context of the entire patent disclosure. Phillips v. AWH Corp., 415 F.3d 1303, 1313 (Fed. Cir. 2005) (en banc). “There are only two exceptions to this general rule: 1) when a patentee sets out a definition and acts as his own lexicographer, or 2) when the patentee disavows the full scope of a claim term either in the specification or during prosecution.” Thorner v. Sony Comput. Entm’t Am. LLC, 669 F.3d 1362, 1365 (Fed. Cir. 2012).
Petitioner submits that no express interpretations of any claim terms are necessary to resolve the parties’ dispute and “the claim terms should be given their plain and ordinary meaning.” Pet. 8. Petitioner, however, makes various arguments regarding the phrase “tamper proof format” in claims 1 and 8. Id. at 8–9. We conclude that interpretation of “tamper proof format” is not necessary to determine whether to institute an inter partes review in this proceeding. See Nidec Motor Corp. v. Zhongshan Broad Ocean Motor Co., 868 F.3d 1013, 1017 (Fed. Cir. 2017) (“Because we need only construe terms ‘that are in controversy, and only to the extent necessary to resolve the controversy,’ we need not construe [a particular claim limitation] where the construction is not ‘material to the . . . dispute.’” (citation omitted)).

We interpret another phrase in the claims: “third-party trusted authority.” Claims 1 and 8 of the ’730 patent recite wirelessly sending “one or more codes from the plurality of codes and the other data values for authentication by an agent that is a third-party trusted authority possessing a list of device ID codes uniquely identifying legitimate integrated devices.” Patent Owner points out that Petitioner argued in the district court case that a “third-party” is “an entity with a responsibility separate from executing the transaction itself.” Prelim. Resp. 2 (quoting Ex. 2001 (Petitioner’s Opening Claim Construction Brief in the district court case), 7). In the district court case, Petitioner argued that no construction of “third-party trusted authority” was needed and “[t]he intrinsic evidence does not suggest a party or component numbered after a second party.” Ex. 2001, 6–7. This was in response to its articulation of Patent Owner’s position, namely, that a “third-party trusted authority” is “[a] third component that provides a second level of authentication.” Id. at 6.3
Petitioner further argued that “[d]uring prosecution, the applicant explained a ‘user []prov[ing] to the same institution that authenticates the fingerprint information that the user is who he purports to be’ does not satisfy the ‘third party’ limitation,” and “[t]he applicant emphasized the prior art ‘disclose[d] two parties: the user and the institution.’” Id. at 7 (quoting remarks made by the applicant for the ’730 patent) (alterations by Petitioner). Petitioner argued that “the intrinsic evidence suggests ‘third party’ relates to a specific class of entity occupying the aforementioned particular relationship.” Id. Petitioner then pointed out that “the [S]pecification explains the agent for the trusted authority ‘can be, for example, a government official, a notary, and/or an employee of a third party which operates the trusted key authority, or another form of witness.’” Id. (quoting Ex. 1001, col. 6, ll. 32–41). According to Petitioner, “[t]his ‘witness’ role further aligns with prosecution history where the applicant explained ‘sending a code to a receiver of a door that the user is trying to access’ does not satisfy the ‘third party’ limitation.” Id. (quoting an amendment in the ’730 patent file history).

3 In its responsive brief, Patent Owner changed its position with respect to the terms “third-party trusted authority” and “agent” to “[n]o construction needed” and stated that it “no longer seeks to submit th[e] term[s] for construction.” Ex. 2002, 12. The district court did not construe the terms. Ex. 3001.

Petitioner further offered expert testimony in the district court case to support an argument that “[c]ommon industry use of ‘trusted third party’ identifies ‘third party’ as an entity with a responsibility separate from executing the transaction itself.” Id. at 7–8 (citing a declaration of Seth James Nielson, Ph.D. (Ex. 2003 ¶¶ 70–74) and a book, CRYPTOGRAPHIC LIBRARIES FOR DEVELOPERS). Dr. Nielson appears to best articulate the distinction Petitioner was drawing between a third component and a third entity: “the [S]pecification, the prosecution history, and common industry usage all suggest a ‘third party’ refers to an entity outside of the transaction or a witness thereto rather than a ‘third component.’” Ex. 2003 ¶ 74.

In light of Petitioner’s arguments in the district court case, Patent Owner contends that the parties “agree that a ‘third-party trusted authority’ is an institution or entity that is outside of the multi-party system (user and application or vendor) that is being authenticated.” Prelim. Resp. 3.

We agree with Patent Owner. The plain meaning of “third-party trusted authority” suggests an entity or party separate from the principal parties to a transaction. See, e.g., THE AMERICAN HERITAGE COLLEGE DICTIONARY 1433 (4th ed. 2004) (“third party n. . . . 2. One other than the principals involved in a transaction.”) (Ex. 3002).

This is consistent with the description in the Specification. For example, Figure 3, reproduced above, depicts trusted key authority 320 as an entity separate from biometric key 100, authentication module 310, and application 330. As the Specification states, “[t]rusted key authority 320 is a third-party authority that is present in some embodiments in order to provide enhanced security.” Ex. 1001, col. 5, ll. 40–42. Examples of trusted key authorities include “a government official, a notary, and/or an employee of a third party which operates the trusted key authority, or another form of witness.” Id. at col. 6, ll. 35–38. Petitioner’s citations to the applicant’s statement during prosecution of the ’730 patent also are consistent with a third-party trusted authority being an entity separate from the principal parties to a transaction, as is the declaration of Dr. Nielson. See Ex. 2001, 6–8; Ex. 2003 ¶¶ 70–74. Thus, we interpret “third-party trusted authority” to mean a trusted authority that is an entity separate from the parties to a transaction.
Claim 1 also recites “responsive to authentication of the one or more codes and the other data values by the agent, receiving an access message from the agent allowing the user access to an application,” and claim 8 similarly recites “responsive to the agent authenticating the one or more codes and the other data values, a radio frequency communicator, receives an access message from the agent allowing the user access to an application.” Patent Owner argues that “[a]n ‘access message’ must be sent by the ‘third party trusted authority’” and the parties “agree that the ‘access message’ originates from the ‘third-party trusted authority’ and communicates information regarding authentication.” Prelim. Resp. 3–4 (citing Ex. 2001, 9). Both claims recite that the access message is received “from” the agent (which is a third-party trusted authority). Given that express requirement, we conclude that no further interpretation of the “access message” limitations is necessary.4

4 Petitioner argued in the district court case that the limitations mean “[r]eceiving a signal from the agent permitting a user to access an application,” Patent Owner argued that no construction was necessary and the phrases should be given their plain and ordinary meaning, and the district court determined that no construction was necessary. Ex. 3001, 3.

E. Obviousness Ground Based on Scott, Russell, and Lapsley
Petitioner contends that claims 1, 2, 4–9, and 11 are unpatentable over Scott, Russell, and Lapsley under 35 U.S.C. § 103(a), citing the testimony of Andrew Wolfe, Ph.D., as support. Pet. 22–43 (citing Ex. 1003). We are not persuaded that Petitioner has established a reasonable likelihood of prevailing on this asserted ground as to any of the challenged claims.

1. Scott
Scott describes “a portable personal identification device for providing secure access to a host facility” in which the device “includes a biometric sensor system capable of sensing a biometric trait of a user that is unique to the user and provides a biometric signal indicative thereof.” Ex. 1005, p. 2, ll. 5–8. Figure 1 of Scott, reproduced below, illustrates an example.

Figure 1 is a block diagram of security system 2 that provides access to host facility 4 (e.g., a bank, store, military base, computer system, automobile, home security system, or gate). Id. at p. 10, l. 1, p. 10, ll. 24–28.

A registered person carries battery powered personal identification device (PID) 6 (e.g., similar in size to a hand-held pager), which includes biometric sensor 11. Id. at p. 10, l. 28–p. 11, l. 5. Memory 20 stores an ID code that is set in PID 6 by the manufacturer. Id. at p. 11, ll. 11–13. The owner of PID 6 enrolls into the unit by scanning a finger using biometric sensor 11 to create an image that is stored as the fingerprint template in memory 20. Id. at p. 11, ll. 14–20, p. 15, l. 30–p. 16, l. 6. PID 6 communicates wirelessly via transmission signal 41 with host facility 4. Id. at p. 12, ll. 14–16.

Host facility 4 is part of host system 30 (e.g., a bank ATM system or point of sale system), which also includes host processing unit 32. Id. at p. 11, l. 31–p. 12, l. 2. “Host processing unit 32 may be located with host facility 4, or may be located at a remote location, where it may also serve other host facilities 4 in a distributed network 42.” Id. at p. 12, ll. 3–5. “Memory 36 stores ID codes of enrolled individuals who have registered with host system 30.” Id. at p. 12, ll. 6–7.

Figure 7 of Scott is reproduced below.

Figure 7 is a flow diagram of a method of accessing a host facility with a personal identification device. Id. at p. 10, ll. 13–14.

A user with PID 6 approaches host facility 4 (e.g., an ATM) and transmits the ID code to host receiver module 38, which passes it to host processing unit 32. Id. at p. 17, ll. 20–27 (steps 100–104). Host processing unit 32 verifies that the received ID code represents a registered ID code and, if so, the account or user information is located. Id. at p. 17, ll. 27–30 (steps 106, 110). Host processing unit 32, via transmitter module 40, sends a random number to PID 6, in response to which PID 6 performs a user verification. Id. at p. 18, ll. 1–4 (steps 112–118). PID 6 verifies the user’s fingerprint when the user places their finger on platen 15 of biometric sensor 11 by comparing the fingerprint signal to the stored fingerprint template. Id. at p. 16, ll. 19–29. If PID 6 successfully verifies the user’s fingerprint, PID 6 encrypts the random number and sends it back to host processing unit 32, which decrypts the random number and verifies that it matches the random number it sent to PID 6. Id. at p. 18, ll. 5–14 (steps 120–126). If the random number is a match, host processing unit 32 grants the user access to host facility 4. Id. at p. 18, ll. 14–15 (step 128).
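The challenge-response sequence of Scott's Figure 7 can be sketched as follows. This is an illustrative Python sketch, not Scott's disclosure: Scott does not specify the cipher, so a toy XOR with a shared key stands in for the encryption, and all class and function names are hypothetical.

```python
import secrets

# Illustrative sketch of the Figure 7 sequence in Scott (Ex. 1005).
# The XOR "cipher" is a placeholder for Scott's unspecified encryption;
# all names here are hypothetical.

SHARED_KEY = 0x5A5A

def encrypt(n): return n ^ SHARED_KEY
def decrypt(n): return n ^ SHARED_KEY

class PID:
    """Personal identification device with an ID code and a stored
    fingerprint template (memory 20)."""
    def __init__(self, id_code, template):
        self.id_code = id_code
        self.template = template

    def respond(self, challenge, fingerprint):
        # Steps 112-120: verify the user's fingerprint against the stored
        # template, then encrypt the host's random number and send it back.
        if fingerprint != self.template:
            return None
        return encrypt(challenge)

class HostProcessingUnit:
    """Host side: registered ID codes (memory 36) and the random-number
    challenge."""
    def __init__(self, registered_ids):
        self.registered_ids = set(registered_ids)

    def authenticate(self, pid, fingerprint):
        if pid.id_code not in self.registered_ids:  # step 106
            return False
        challenge = secrets.randbelow(2 ** 16)      # step 112
        response = pid.respond(challenge, fingerprint)
        # Steps 124-128: decrypt and compare to the number that was sent.
        return response is not None and decrypt(response) == challenge
```

The point the sketch makes concrete is that in Scott the biometric comparison happens entirely on the device; the host only ever sees the ID code and the encrypted random number.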

2. Russell
Russell describes systems “for providing secure interactions.” Ex. 1006 ¶ 3. A PID with a biometric input component and transceiver is used “to both receive and transmit signals to external devices” to conduct secure interactions. Id. ¶¶ 182, 189. Russell discloses that signals received by and transmitted from the PID are “encrypted” and “[p]referably, an asymmetric encryption scheme, such as a public key/private key scheme is used.” Id. ¶ 171.

3. Lapsley
Lapsley describes “a system and method of using biometrics for processing electronic financial transactions such as on-line debit, off-line debit and credit transactions without requiring the user to directly use or possess any man-made tokens such as debit or credit cards or checks.” Ex. 1007 ¶ 2.

Figure 2 of Lapsley, reproduced below, illustrates an example.

Figure 2 is a block diagram showing the connections among Party Identification Devices (PIAs) 101, router 202, and Network Operations Center (NOC) 203. Id. ¶ 47. Figure 2 is in the context of a supermarket chain or other multi-lane retail chain with multiple PIAs 101 connected via an in-store local area network to local router 202, which is connected to NOC 203 via frame relay lines. Id. ¶ 98. NOC 203 includes Data Processing Center (DPC) 204. Id.

Each PIA 101 has a hardware identification code that is assigned to it and registered with DPC 204 at the time of manufacture, making the PIA uniquely identifiable to DPC 204 in transmissions from the PIA. Id. ¶¶ 85, 161. An entity uses the PIA hardware identification code to identify itself to the DPC. Id. ¶ 158.

Figure 7 of Lapsley is reproduced below.

Figure 7 is a flow diagram showing a transaction flow among the participants in a retail point-of-sale transaction. Id. ¶¶ 52, 166.

The customer/payor originates an electronic payment at a point-of-sale by submitting a bid biometric sample obtained by a biometric sensor of the PIA controlled by a payee/seller. Id. ¶¶ 166–167 (step 702). The PIA determines that the sample is not fraudulent and sends the sample to the DPC. Id. ¶ 167. The payor enters a PIN code into the PIA, and the PIA transmits the biometric data, PIN, and hardware identification code of the PIA to the DPC. Id. ¶ 168 (step 720). The DPC identifies the payor using the biometric sample, retrieves a list of financial accounts that the payor has registered with the system, and transmits the list to the PIA. Id. (steps 706, 708). The DPC identifies the payee using the PIA hardware identification code. Id. ¶¶ 166, 168. The payor selects a financial account at the PIA and the PIA transmits the financial information to the payee’s in-store payment system (e.g., point-of-sale terminal or electronic cash register). Id. ¶¶ 169–170. The in-store payment system authorizes the transaction. Id. ¶ 170.
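The transaction flow of Lapsley's Figure 7 can be sketched as follows. This is an illustrative Python sketch only: plain dictionaries stand in for the DPC's databases, the biometric "sample" is a bare string, and every name is hypothetical rather than taken from Lapsley.

```python
# Illustrative sketch of the Figure 7 transaction flow in Lapsley
# (Ex. 1007). Dictionaries stand in for the DPC's records; all names
# are hypothetical.

class DPC:
    """Data Processing Center: identifies the payor by biometric sample
    and the payee by the PIA's hardware identification code."""
    def __init__(self, payor_accounts, pia_owners):
        self.payor_accounts = payor_accounts  # biometric sample -> accounts
        self.pia_owners = pia_owners          # PIA hardware ID -> payee

    def process(self, biometric, pin, hardware_id):
        # Steps 706, 708: identify the payor and retrieve the list of
        # registered financial accounts; identify the payee from the PIA
        # hardware identification code.
        accounts = self.payor_accounts.get(biometric)
        payee = self.pia_owners.get(hardware_id)
        if accounts is None or payee is None:
            return None
        return {"payee": payee, "accounts": accounts}

def point_of_sale(dpc, biometric, pin, hardware_id, choose):
    """The PIA transmits the biometric data, PIN, and its hardware
    identification code to the DPC (step 720); the payor then selects one
    account from the returned list for the in-store payment system."""
    result = dpc.process(biometric, pin, hardware_id)
    if result is None:
        return None
    return {"payee": result["payee"], "account": choose(result["accounts"])}
```

What the sketch highlights, for the comparison the Decision goes on to draw, is that in Lapsley the PIA's hardware identification code identifies the payee's device to the DPC; it is not a code released upon the payor's biometric verification.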

4. Claim 1
Petitioner argues that Scott, Russell, and Lapsley collectively teach all of the limitations of claim 1. Pet. 22–35. With respect to the preamble of claim 1, Petitioner contends that Scott teaches a method of verifying a user during authentication of an “integrated device” (i.e., PID 6). Id. at 22.

With respect to limitation 1A, Petitioner argues that Scott teaches persistently storing “biometric data of the user” (i.e., a biometric template) and “a plurality of codes and other data values comprising a device ID code uniquely identifying the integrated device” (i.e., an ID code